2 Extending Gauss Functionality

2.1 The need for extended functionality

GAP has a lot of functionality for computing row echelon forms of matrices, available through SemiEchelonForm and similar commands. All of these work for the GAP matrix type over fields. However, none of these algorithms can compute a reduced row echelon form (RREF) of a matrix; there is no way to "Gauss upwards". While this is not necessary for things like Rank or Kernel computations, it was one of a number of missing features important for the development of the GAP package homalg by M. Barakat [Bar20].

In parallel with this development I worked on SCO [Gör08b], a package for creating simplicial sets and computing the cohomology of orbifolds, based on the paper "Simplicial Cohomology of Orbifolds" by I. Moerdijk and D. A. Pronk [MP99]. Very early on it became clear that the cohomology matrices (with entries in ℤ or finite quotients of ℤ) would grow exponentially in size with the cohomology degree. At one point, for example, a 50651 × 1133693 matrix had to be handled.

It should be quite clear that there was a need for a sparse matrix data type and corresponding Gaussian algorithms. After an unfruitful search for a computer algebra system capable of this task, the Gauss package was born: it provides not only the missing RREF algorithms, but also a new data type that enables GAP to handle sparse matrices of almost arbitrary size.

I am proud to tell you that, thanks to optimizing the algorithms for matrices over GF(2), it was possible to compute the GF(2)-Rank of the matrix mentioned above in less than 20 minutes with a memory usage of about 3 GB.

2.2 The applications of the Gauss package algorithms

Please refer to [ht22] to find out more about the homalg project and its related packages. Most of the motivation for the algorithms in the Gauss package can be found there. If you are interested in this project, you might also want to check out my GaussForHomalg [Gör08a] package, which, just as RingsForHomalg [BGKL08] does for external rings, serves as the connection between homalg and Gauss. By allowing homalg to delegate computational tasks to Gauss, this small package extends homalg's capabilities to dense and sparse matrices over fields and rings of the form \(ℤ / \langle p^n \rangle\).

For those unfamiliar with the homalg project let me explain a couple of points. As outlined in [BR08] by D. Robertz and M. Barakat, homological computations can be reduced to three basic tasks:

- computing a basis of a module (BasisOfRowModule),
- reducing a module with a basis (DecideZeroRows),
- computing the relations between the elements of a module (SyzygiesGeneratorsOfRows).

In addition to these tasks, only relatively simple tools for matrix manipulation are needed, ranging from addition and multiplication to finding the zero rows of a matrix. However, to reduce the need for communication, it can be helpful to supply homalg with some more advanced procedures.

While the above tasks can be quite difficult when, for example, working in noncommutative polynomial rings, in the Gauss case they can all be performed as long as one can compute a reduced row echelon form. This is clear for BasisOfRowModule, as the rows of the RREF of a matrix already form a basis of the module they generate. EchelonMat (4.2-1) is used to compute RREFs, based on the GAP internal method SemiEchelonMat for row echelon forms.
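
To make the "Gaussing upwards" point concrete, here is a minimal dense RREF over a prime field GF(p) in Python. This is an illustrative sketch with hypothetical function names, not the package's implementation (Gauss works on its own sparse data type and more general rings):

```python
# Minimal dense reduced row echelon form over GF(p), p prime.
# Illustrative sketch only -- not the Gauss package's actual code.

def rref_mod_p(rows, p):
    """Return the RREF of `rows` over GF(p)."""
    m = [[x % p for x in row] for row in rows]
    ncols = len(m[0]) if m else 0
    pivot_row = 0
    for col in range(ncols):
        # find a row with a nonzero entry in this column
        piv = next((r for r in range(pivot_row, len(m)) if m[r][col]), None)
        if piv is None:
            continue
        m[pivot_row], m[piv] = m[piv], m[pivot_row]
        inv = pow(m[pivot_row][col], p - 2, p)       # inverse via Fermat
        m[pivot_row] = [x * inv % p for x in m[pivot_row]]
        # eliminate in ALL other rows -- clearing above the pivot as well
        # is exactly the "Gaussing upwards" that SemiEchelonForm omits
        for r in range(len(m)):
            if r != pivot_row and m[r][col]:
                f = m[r][col]
                m[r] = [(a - f * b) % p for a, b in zip(m[r], m[pivot_row])]
        pivot_row += 1
    return m

rref = rref_mod_p([[1, 2, 1], [2, 4, 0], [0, 0, 1]], 5)
# → [[1, 2, 0], [0, 0, 1], [0, 0, 0]]
```

The nonzero rows of the result form a basis of the row module, which is all BasisOfRowModule needs.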

Let's look at the second point, the basic function DecideZeroRows: when you face the task of reducing a module \(A\) with a given basis \(B\), you can compute the RREF of the following block matrix:

\[ \begin{pmatrix} Id & A \\ 0 & B \end{pmatrix} \]

Computing the RREF (notice how important "Gaussing upwards" is here) reduces \(A\) with \(B\). However, the left side of the matrix serves only the purpose of tricking the Gaussian algorithms into doing what we want. Therefore, it was a logical step to implement ReduceMat (4.2-3), which does the same thing but without the unnecessary columns.
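
The block-matrix trick can be sketched in a few lines of Python, again over a prime field for simplicity. The identity columns pin the first rows in place, so after the RREF their right-hand halves are exactly the rows of \(A\) reduced modulo the row space of \(B\). Function names are hypothetical:

```python
# Reducing A with a basis B via the RREF of the block matrix [Id A; 0 B],
# over GF(p). Illustrative only; ReduceMat avoids the Id columns entirely.

def _rref(m, p):
    m = [row[:] for row in m]
    r = 0
    for c in range(len(m[0]) if m else 0):
        piv = next((i for i in range(r, len(m)) if m[i][c] % p), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        inv = pow(m[r][c], p - 2, p)
        m[r] = [x * inv % p for x in m[r]]
        for i in range(len(m)):
            if i != r and m[i][c] % p:
                f = m[i][c] % p
                m[i] = [(a - f * b) % p for a, b in zip(m[i], m[r])]
        r += 1
    return m

def reduce_rows(A, B, p):
    """Reduce each row of A modulo the row space of B."""
    k = len(A)
    ident = [[int(i == j) for j in range(k)] for i in range(k)]
    block = [ident[i] + list(A[i]) for i in range(k)] + \
            [[0] * k + list(row) for row in B]
    R = _rref(block, p)
    # the Id pivots keep the first k rows on top; strip the Id columns
    return [row[k:] for row in R[:k]]

print(reduce_rows([[1, 1]], [[1, 0]], 3))   # → [[0, 1]]
```

A row already in the row space of \(B\), such as \((2,2)\) against the basis \((1,1)\) over GF(3), reduces to the zero row, which is how DecideZeroRows gets its answer.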

Note: When, much later, it became clear that it was important to compute the transformation matrices of the reduction, ReduceMatTransformation (4.2-4) was born, similar to EchelonMatTransformation (4.2-2). This corresponds to the homalg procedure DecideZeroRowsEffectively.

The third procedure, SyzygiesGeneratorsOfRows, is concerned with the relations between the rows of a matrix, each row representing a module element. Over a field these relations are exactly the kernel of the matrix. One can easily see that this can be achieved by taking a matrix

\[ \begin{pmatrix} A & Id \end{pmatrix} \]

and computing its row echelon form. Then the row relations are generated by the right-hand (Id) parts of those rows whose left-hand (\(A\)) part is zero. There are two problems with this approach: the computation diagonalizes the kernel, which might not be wanted, and, much worse, it does not work at all for rings with zero divisors. For example, the \(1 \times 1\) matrix \([2 + 8ℤ]\) has a row relation \([4 + 8ℤ]\) (since \(4 \cdot 2 = 8 \equiv 0\)) which would not have been found by this method.

Approaching this problem led to the method EchelonMatTransformation (4.2-2), which additionally computes the transformation matrix \(T\), such that RREF \(= T \cdot M\). Similar to SemiEchelonMatTransformation, \(T\) is split up into the rows needed to create the basis vectors of the RREF, and the relations that led to zero rows. Focusing on computations over fields, it was an easy step to write KernelMat (4.2-5), which terminates after the REF and returns the kernel generators.
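
The idea behind such a transformation matrix can be sketched by running the elimination on \([M \mid Id]\) while choosing pivots only among the columns of \(M\); the right-hand block then records the row operations. A hedged Python sketch over GF(p), with hypothetical names, splitting \(T\) as described into coefficient rows and kernel rows:

```python
# Compute RREF and a transformation matrix T with RREF = T * M over GF(p),
# then split T into the rows producing basis vectors and the rows producing
# zero rows (kernel generators). A sketch of the idea, not the package code.

def rref_transform(M, p):
    m_rows, n = len(M), len(M[0])
    # augment M with an identity block to record the row operations
    aug = [list(row) + [int(i == j) for j in range(m_rows)]
           for i, row in enumerate(M)]
    r = 0
    for c in range(n):                       # pivots only in M's columns
        piv = next((i for i in range(r, m_rows) if aug[i][c] % p), None)
        if piv is None:
            continue
        aug[r], aug[piv] = aug[piv], aug[r]
        inv = pow(aug[r][c], p - 2, p)
        aug[r] = [x * inv % p for x in aug[r]]
        for i in range(m_rows):
            if i != r and aug[i][c] % p:
                f = aug[i][c] % p
                aug[i] = [(a - f * b) % p for a, b in zip(aug[i], aug[r])]
        r += 1
    rref = [row[:n] for row in aug]
    T = [row[n:] for row in aug]
    # rows of T below the last pivot send M to zero rows: the kernel
    return rref, T[:r], T[r:]

rref, coeffs, kernel = rref_transform([[1, 2], [2, 4]], 5)
# kernel == [[3, 1]]: indeed 3*(1,2) + 1*(2,4) = (5,10) ≡ (0,0) mod 5
```

This mirrors the split described above: `coeffs` applied to \(M\) yields the nonzero RREF rows, while `kernel` collects the relations.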

The syzygy computation over \(ℤ / \langle p^n \rangle\) was solved by carefully keeping track of basis vectors whose head is a zero divisor. If, for \( v = (0,\ldots,0,h,*,\ldots,*), h \neq 0,\) there exists \(g \neq 0\) such that \(g \cdot h = 0\), the vector \(g \cdot v\) is regarded as an additional row vector which has to be reduced and can be reduced with. After some more work this allowed for the implementation of KernelMat (4.2-5) for matrices over \(ℤ / \langle p^n \rangle\).
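
The zero-divisor-head trick can be illustrated on the \([2 + 8ℤ]\) example from above. Over \(ℤ/\langle p^n \rangle\) the smallest annihilator of a head \(h\) is \(g = p^n / \gcd(h, p^n)\); multiplying the Id-augmented row by \(g\) zeroes its matrix part and exposes the syzygy. A tiny sketch with hypothetical names, not the package's implementation:

```python
# The zero-divisor-head trick over Z/q, q = p^n: if the head h of a basis
# vector is a zero divisor, g = q // gcd(h, q) annihilates it, and g times
# the augmented vector is an extra relation to process.
from math import gcd

def head_annihilator(h, q):
    """Smallest g > 0 with g*h ≡ 0 (mod q), or None if h is a unit."""
    g = q // gcd(h % q, q)
    return g if g != q else None             # g == q means h is invertible

# the 1x1 example from the text: M = [2] over Z/8, augmented with Id
q, v = 8, [2, 1]                             # (matrix part | coefficient part)
g = head_annihilator(v[0], q)                # g = 4, since 4*2 = 8 ≡ 0
extra = [(g * x) % q for x in v]             # [0, 4]: matrix part is zero,
syzygy = extra[1:]                           # so [4] generates the syzygies
```

For a unit head (e.g. \(h = 3\) over \(ℤ/8ℤ\)) no annihilator exists and no extra row is generated, matching the field case.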

This concludes the explanation of the so-called basic tasks Gauss has to handle when called by homalg to do matrix calculations. Here is a tabular overview of the current capabilities of Gauss (\(p\) is a prime, \(n \in ℕ\); "++" marks the specially optimized GF(2) algorithms, "n.a." means not applicable):

Matrix type:        Dense   Dense       Sparse   Sparse   Sparse
Base ring:          Field   ℤ/⟨p^n⟩     Field    GF(2)    ℤ/⟨p^n⟩
----------------------------------------------------------------
RankMat             GAP     n.a.        +        ++       n.a.
EchelonMat          +       -           +        ++       +
EchelonMatTransf.   +       -           +        ++       +
ReduceMat           +       -           +        ++       +
ReduceMatTransf.    +       -           +        ++       +
KernelMat           +       -           +        ++       +

As you can see, the development of Hermite algorithms was not continued for dense matrices. There are two reasons for this: GAP already has very good algorithms for ℤ, and for small matrices the disadvantage of computing over ℤ, potentially leading to coefficient explosion, is marginal.
