Andrew Thangaraj
Aug-Nov 2020
\(A:n\times n\) matrix written as \(A=[v_1;\ldots;v_n]\), where \(v_j\) is the \(j\)-th row of \(A\).
\(v_j\): row vector of length \(n\)
det: \(F^{n,n}\to F\) is a function from square matrices to the field \(F\) satisfying the following conditions or defining properties:
Identity: det\((I)=1\)
Row scaling: det\(([v_1;\ldots;cv_j;\ldots;v_n])=c\) det\(([v_1;\ldots;v_j;\ldots;v_n])\)
Row linearity: det\(([v_1;\ldots;v_j+v'_j;\ldots;v_n])=\) det\(([v_1;\ldots;v_j;\ldots;v_n])+\) det\(([v_1;\ldots;v'_j;\ldots;v_n])\)
Equal rows: det\(([v_1;\ldots;v_n])=0\), if any two rows are equal
Why such an intricate definition?
Is there such a function?
\(A=\begin{bmatrix} a&b\\ c&d \end{bmatrix}\), det\((A)=ad-bc\)
\(A=\begin{bmatrix} a&b&c\\ d&e&f\\ g&h&i \end{bmatrix}\), det\((A)=aei+bfg+cdh-gec-hfa-idb\)
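The explicit \(2\times 2\) and \(3\times 3\) formulas are easy to check numerically. A minimal sketch, assuming NumPy is available, using np.linalg.det as the reference (values agree up to floating-point error):

```python
import numpy as np

def det2(A):
    # ad - bc for a 2x2 matrix
    (a, b), (c, d) = A
    return a * d - b * c

def det3(A):
    # aei + bfg + cdh - gec - hfa - idb for a 3x3 matrix
    (a, b, c), (d, e, f), (g, h, i) = A
    return a*e*i + b*f*g + c*d*h - g*e*c - h*f*a - i*d*b

A2 = [[1.0, 2.0], [3.0, 4.0]]
A3 = [[2.0, 0.0, 1.0], [1.0, 3.0, 2.0], [0.0, 1.0, 1.0]]
print(det2(A2), np.linalg.det(np.array(A2)))   # -2 and -2 (approximately)
print(det3(A3), np.linalg.det(np.array(A3)))   # 3 and 3 (approximately)
```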
\(A:n\times n\) matrix, det\((A)=?\)
Zero row: det\(([v_1;\ldots;v_n])=0\), if any of the rows is equal to all zeros
Proof
Row operation: det\(([v_1+cv_j;\ldots;v_j;\ldots;v_n])=\text{det}([v_1;\ldots;v_j;\ldots;v_n])\)
Proof
Dependent rows: det\(([v_1;\ldots;v_n])=0\), if the rows are linearly dependent (the same holds for dependent columns, using det\((A^T)=\) det\((A)\) shown below)
Proof
Row swap: If two rows are interchanged, determinant gets multiplied by \(-1\)
Proof
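Each of these consequences can be illustrated numerically. A small sketch, assuming NumPy, using np.linalg.det as the reference determinant and a random \(4\times 4\) matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# Zero row: setting any row to zero forces det = 0
Z = A.copy(); Z[2] = 0.0
print(np.isclose(np.linalg.det(Z), 0.0))                        # True

# Row operation: adding c*(row j) to another row leaves det unchanged
R = A.copy(); R[0] = R[0] + 2.5 * R[3]
print(np.isclose(np.linalg.det(R), np.linalg.det(A)))           # True

# Dependent rows: a row that is a combination of other rows gives det = 0
D = A.copy(); D[1] = 3.0 * D[0] - D[2]
print(np.isclose(np.linalg.det(D), 0.0))                        # True

# Row swap: interchanging two rows flips the sign of det
S = A.copy(); S[[0, 1]] = S[[1, 0]]
print(np.isclose(np.linalg.det(S), -np.linalg.det(A)))          # True
```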
Elementary row operators: Let \(E\) be an elementary row operator \[\text{det}(E)=\begin{cases} c&\text{ if row scaling by }c\\ -1&\text{ if row swap}\\ 1&\text{ if row }i = \text{row }i + c(\text{row }j) \end{cases}\]
Product of elementary row operators and a matrix \[\text{det}(EA)=\text{det}(E)\text{det}(A)\]
\[\text{det }\left(\left(\prod_i E_i\right) A\right) = \left(\prod_i \text{det}(E_i)\right)\text{det}(A)\]
Diagonal matrix: det \(\begin{bmatrix}d_1&0&\cdots&0\\ 0&d_2&\cdots&0\\ \vdots&\vdots&\ddots&\vdots\\ 0&0&\cdots&d_n\end{bmatrix}=d_1d_2\cdots d_n\)
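An illustrative check of these values, assuming NumPy: build one elementary matrix of each type from the identity, plus a diagonal matrix, and compare with np.linalg.det.

```python
import numpy as np

n = 4
I = np.eye(n)

# Row scaling by c: det = c
E_scale = I.copy(); E_scale[1, 1] = 5.0
print(np.isclose(np.linalg.det(E_scale), 5.0))           # True

# Row swap: det = -1
E_swap = I.copy(); E_swap[[0, 2]] = E_swap[[2, 0]]
print(np.isclose(np.linalg.det(E_swap), -1.0))           # True

# Row i <- row i + c*(row j): det = 1
E_add = I.copy(); E_add[3, 0] = 2.0
print(np.isclose(np.linalg.det(E_add), 1.0))             # True

# Diagonal matrix: det = product of diagonal entries
d = np.array([2.0, -1.0, 3.0, 0.5])
print(np.isclose(np.linalg.det(np.diag(d)), d.prod()))   # True
```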
Non-invertible matrix: det \(=0\)
Invertible matrix: Row reduce to identity
There exist elementary row operators \(E_i\) such that \[\left(\prod_i E_i\right) A = I\]
det\((A)=(-1)^{n_s}\dfrac{1}{\left(\prod_j c_j\right)}\), where \(n_s\) is the number of row swaps and the \(c_j\) are the scaling constants used in the reduction
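This is also a practical algorithm: row reduce \(A\) to \(I\), keeping track of the swaps and scalings. A minimal Python sketch of the idea (the function name det_by_row_reduction is ours; exact zero test on pivots, no numerical safeguards, for illustration only):

```python
import numpy as np

def det_by_row_reduction(A):
    """Reduce A to I by elementary row operations, recording the number of
    swaps n_s and the scaling constants c_j; det(A) = (-1)^n_s / (prod of c_j)."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    n_swaps = 0
    scalings = []                       # the constants c_j used to make pivots 1
    for k in range(n):
        # find a nonzero pivot in column k, swapping rows if necessary
        pivot_row = next((r for r in range(k, n) if A[r, k] != 0), None)
        if pivot_row is None:
            return 0.0                  # no pivot: rows dependent, det = 0
        if pivot_row != k:
            A[[k, pivot_row]] = A[[pivot_row, k]]
            n_swaps += 1
        c = 1.0 / A[k, k]               # scale the pivot row so the pivot is 1
        scalings.append(c)
        A[k] = c * A[k]
        for r in range(n):              # eliminate column k in every other row
            if r != k:
                A[r] = A[r] - A[r, k] * A[k]
    return (-1.0) ** n_swaps / np.prod(scalings)

M = [[2.0, 1.0, 0.0], [0.0, 0.0, 3.0], [1.0, 4.0, 1.0]]
print(det_by_row_reduction(M), np.linalg.det(np.array(M)))   # both -21 (approximately)
```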
\(A,B\): two square matrices
det\((AB)=\) det\((A)\) det\((B)\)
Proof
\(A\) or \(B\) non-invertible
\(AB\) is also non-invertible
det\((AB)=0=\) det\((A)\) det\((B)\)
\(A\) and \(B\) invertible
\(\left(\prod_i E_i\right) A = I\), \(\left(\prod_j F_j\right) B = I\)
\(\left(\prod_j F_j\right)\left(\prod_i E_i\right) AB = I\)
Take determinants: det\((AB)=\dfrac{1}{\left(\prod_j \text{det}(F_j)\right)\left(\prod_i \text{det}(E_i)\right)}=\) det\((A)\) det\((B)\)
Corollary: If \(A\) is invertible, det\((A^{-1})=\dfrac{1}{\text{det}(A)}\)
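Both the product rule and the corollary are easy to check numerically. A brief sketch, assuming NumPy (random Gaussian matrices are invertible with probability 1):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# det(AB) = det(A) det(B)
print(np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B)))   # True

# det(A^{-1}) = 1 / det(A)
print(np.isclose(np.linalg.det(np.linalg.inv(A)), 1.0 / np.linalg.det(A)))     # True
```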
Elementary column operators: \(E^T\), where \(E\) is an elementary row operator
det\((E^T)=\) det\((E)\)
det\((A^T)=\text{det}(A)\)
Proof
\(A\): non-invertible
\(A^T\) is non-invertible
det\((A^T)=0=\text{det}(A)\)
\(A\): invertible
\(E_1\cdots E_L A = I\) implies \(I=A^T E^T_L\cdots E^T_1\)
Take determinants
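A quick numerical check of det\((A^T)=\) det\((A)\), assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 5))
print(np.isclose(np.linalg.det(A.T), np.linalg.det(A)))   # True
```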
det: \(F^{n,n}\to F\) is a function from square matrices to the field \(F\) satisfying the following conditions or defining properties:
Identity: det\((I)=1\)
Row scaling: det\(([v_1;\ldots;cv_j;\ldots;v_n])=c\) det\(([v_1;\ldots;v_j;\ldots;v_n])\)
Row linearity: det\(([v_1;\ldots;v_j+v'_j;\ldots;v_n])=\) det\(([v_1;\ldots;v_j;\ldots;v_n])+\) det\(([v_1;\ldots;v'_j;\ldots;v_n])\)
Equal rows: det\(([v_1;\ldots;v_n])=0\), if any two rows are equal
Unique function that satisfies all of the above properties
Co-factor expansion along any row or column
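Cofactor expansion translates directly into a recursive procedure. A sketch in plain Python, expanding along the first row (the function name det_cofactor is ours; \(O(n!)\) work, so for small matrices only):

```python
def det_cofactor(A):
    """Determinant by cofactor expansion along the first row (A: list of lists)."""
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # minor: delete row 0 and column j
        minor = [row[:j] + row[j+1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det_cofactor(minor)
    return total

print(det_cofactor([[1, 2], [3, 4]]))                      # -2
print(det_cofactor([[2, 0, 1], [1, 3, 2], [0, 1, 1]]))     # 3
```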
Permutation formula
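The permutation (Leibniz) formula det\((A)=\sum_{\sigma}\text{sign}(\sigma)\prod_{i} A_{i,\sigma(i)}\) can also be coded directly. A sketch summing over all \(n!\) permutations, with the sign computed from the inversion count (for illustration only):

```python
from itertools import permutations

def sign(perm):
    # sign of a permutation from its number of inversions
    inv = sum(1 for i in range(len(perm)) for j in range(i + 1, len(perm))
              if perm[i] > perm[j])
    return -1 if inv % 2 else 1

def det_permutation(A):
    """det(A) = sum over permutations s of sign(s) * A[0][s(0)] * ... * A[n-1][s(n-1)]."""
    n = len(A)
    total = 0
    for perm in permutations(range(n)):
        term = sign(perm)
        for i in range(n):
            term *= A[i][perm[i]]
        total += term
    return total

print(det_permutation([[1, 2], [3, 4]]))                    # -2
print(det_permutation([[2, 0, 1], [1, 3, 2], [0, 1, 1]]))   # 3
```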