Algebra is one of the major branches of mathematics; its study involves the manipulation of equations, mathematical operations, and polynomials. The study of matrices is part of this branch, and although matrices have existed for a long time, their use only began to intensify in the 19th century. Currently, one of the main applications of matrices is finding the solutions of a system of linear equations with `n` variables and `n` equations. When `n` is small (2 or 3), the system is easy to solve either by the substitution method or by the addition (elimination) method. For values of `n` greater than 3, matrices allow solutions to be found faster.
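As a concrete illustration of solving such a system with matrices, here is a minimal sketch using NumPy (a library not mentioned in the text, chosen here only for illustration): the system is written as `Ax = b`, where `A` holds the coefficients and `b` the constants.

```python
import numpy as np

# Solve the 3x3 linear system:
#    2x +  y -  z =   8
#   -3x -  y + 2z = -11
#   -2x +  y + 2z =  -3
A = np.array([[ 2.0,  1.0, -1.0],
              [-3.0, -1.0,  2.0],
              [-2.0,  1.0,  2.0]])
b = np.array([8.0, -11.0, -3.0])

x = np.linalg.solve(A, b)  # solution vector [x, y, z]
print(x)  # → [ 2.  3. -1.]
```

The same approach scales to any `n`, which is what makes the matrix formulation attractive for large systems.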

Designation | Example | Description
---|---|---
Square matrix | `[1 2; 3 4]` | A matrix is said to be square if the number of rows is equal to the number of columns.
Triangular matrix | `[1 0; 3 4]` | A square matrix is said to be triangular if all of its elements below the main diagonal are zero, or all of its elements above the main diagonal are zero.
Diagonal matrix | `[1 0; 0 4]` | Unlike the previous one, in this matrix all the elements both above and below the main diagonal must be equal to zero.
Identity matrix | `[1 0; 0 1]` | A diagonal matrix in which all of the diagonal elements are equal to one.
Null matrix | `[0 0; 0 0]` | A matrix in which every element is equal to zero.
Row matrix | `[1 2 3]` | A matrix is said to be a row matrix if it has only one row.
Column matrix | `[1; 2; 3]` | Similar to the previous one, but in this case the matrix has only one column.

(In the examples, rows are separated by `;`.)
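The definitions above translate directly into simple checks on a matrix's entries. A minimal sketch with NumPy (the function names and the sample matrix are this example's own, not from the text):

```python
import numpy as np

A = np.array([[5.0, 0.0, 0.0],
              [1.0, 3.0, 0.0],
              [4.0, 2.0, 8.0]])

# Square: as many rows as columns.
is_square = A.shape[0] == A.shape[1]

# Lower triangular: every element above the main diagonal is zero,
# i.e. the matrix equals its own lower triangle.
is_lower_triangular = is_square and np.allclose(A, np.tril(A))

# Diagonal: zero everywhere except (possibly) the main diagonal.
is_diagonal = is_square and np.allclose(A, np.diag(np.diag(A)))

print(is_square, is_lower_triangular, is_diagonal)  # → True True False
```

`A` here is square and lower triangular, but not diagonal, because it has nonzero elements below the main diagonal.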

A **transposed matrix** is one obtained from another matrix by an ordered exchange of its rows for its columns. From this definition, it follows that the transpose of a transposed matrix is the original matrix, that is, `(A^T)^T = A`.
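The double transposition can be verified in a few lines of NumPy (a sketch with a sample matrix of this example's own choosing):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])  # a 2x3 matrix

At = A.T    # transpose: rows become columns, giving a 3x2 matrix
Att = At.T  # transposing again restores the original matrix

print(At.shape)                # → (3, 2)
print(np.array_equal(Att, A))  # → True
```

Note that transposing swaps the dimensions (`2x3` becomes `3x2`), and a second transposition swaps them back.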