In short, matrix decomposition is the factorization of a matrix into a product of matrices. It is a fundamental tool of linear algebra and applied statistics, valued for the computational convenience and analytic simplicity it provides. Data matrices such as proximity or correlation matrices are often tedious to analyze directly, so to reveal the characteristics and structure of the original matrix we decompose it into lower-order or lower-rank canonical forms.
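The idea can be illustrated with any standard decomposition; here is a minimal sketch using SciPy's QR routine (an illustration of the concept, not MatDeck code): the matrix is split into an orthogonal factor and an upper-triangular factor, and multiplying the factors back together recovers the original matrix.

```python
import numpy as np
from scipy.linalg import qr

# Factor a matrix into an orthogonal part Q and an upper-triangular part R
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
Q, R = qr(A)

# The product of the factors reconstructs the original matrix
assert np.allclose(Q @ R, A)
```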

Matrix decompositions help us solve large systems of linear equations or diagonalize large matrices on a computer. These problems are difficult because the gap between theoretical results and actual computations is large, and that gap is caused by the rounding errors that accumulate in floating-point arithmetic.
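As a concrete sketch (using SciPy rather than MatDeck), a linear system is typically solved by factoring the coefficient matrix once and then reusing the factors, and the residual of the computed solution shows the rounding error mentioned above: it is tiny, but rarely exactly zero.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

A = np.array([[3.0, 1.0, 2.0],
              [1.0, 4.0, 1.0],
              [2.0, 1.0, 5.0]])
b = np.array([1.0, 2.0, 3.0])

# Factor once (LU with partial pivoting), then solve by substitution;
# this is cheaper and better behaved numerically than forming A^-1
lu_piv = lu_factor(A)
x = lu_solve(lu_piv, b)

# The residual reflects floating-point rounding: small, not exactly zero
residual = np.linalg.norm(A @ x - b)
```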

MatDeck implements a group of functions specialized for matrix decomposition that can help you solve problems like the ones mentioned above.

### Decompositions covered in MatDeck

- Cholesky decomposition
- Cholesky LDP decomposition
- Hessenberg decomposition
- LU decomposition
- LU decomposition with permutation matrices
- QR decomposition
- QR decomposition with permutation matrix
- Singular value decomposition (SVD)
- Diagonalization
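Most of the decompositions in the list have direct counterparts in standard numerical libraries. The sketch below, written with SciPy and NumPy as a stand-in for MatDeck's own functions, applies several of them to one symmetric positive-definite matrix and verifies that each set of factors multiplies back to the original.

```python
import numpy as np
from scipy.linalg import cholesky, hessenberg, lu, qr, svd

# A symmetric positive-definite matrix, so every decomposition below applies
A = np.array([[4.0, 2.0, 0.0],
              [2.0, 5.0, 1.0],
              [0.0, 1.0, 3.0]])

C = cholesky(A, lower=True)          # A = C @ C.T
H, Q = hessenberg(A, calc_q=True)    # A = Q @ H @ Q.T
P, L, U = lu(A)                      # A = P @ L @ U
Qr, R = qr(A)                        # A = Qr @ R
Us, s, Vt = svd(A)                   # A = Us @ diag(s) @ Vt
w, V = np.linalg.eigh(A)             # diagonalization: A = V @ diag(w) @ V.T

for product in (C @ C.T,
                Q @ H @ Q.T,
                P @ L @ U,
                Qr @ R,
                Us @ np.diag(s) @ Vt,
                V @ np.diag(w) @ V.T):
    assert np.allclose(product, A)
```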

The functions implemented in MatDeck for these decompositions follow a common design: one main function returns a vector containing all the matrices of a given decomposition, while separate functions return the individual matrices of the same decomposition one at a time.
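That design can be sketched as follows. The function names here are hypothetical, chosen only to mirror the pattern described above; they are not MatDeck's actual identifiers, and SciPy's `lu` stands in for the underlying computation.

```python
import numpy as np
from scipy.linalg import lu

# Hypothetical names illustrating the pattern: one main function returns
# every factor at once, and separate helpers return one factor each.
def lu_all(A):
    """Return all three LU factors (P, L, U) together."""
    return lu(A)

def lu_p(A):
    """Return only the permutation matrix P."""
    return lu(A)[0]

def lu_l(A):
    """Return only the lower-triangular factor L."""
    return lu(A)[1]

def lu_u(A):
    """Return only the upper-triangular factor U."""
    return lu(A)[2]

A = np.array([[2.0, 1.0],
              [4.0, 3.0]])
P, L, U = lu_all(A)
assert np.allclose(P @ L @ U, A)
```

The main function suits callers who need the full factorization, while the single-factor helpers keep simple call sites uncluttered.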