1.1.1. Special Sets
1.1.2. Set-Builder Notation
1.1.3. Equivalent Sets
1.1.4. Cardinality of Sets
1.1.5. Subsets
1.1.6. The Complement of a Set
1.1.7. The Difference of Sets
1.1.8. The Cartesian Product

1.2.1. The Vector Equation of a Line
1.2.2. The Parametric Equations of a Line
1.2.3. The Cartesian Equation of a Line
1.2.4. The Vector Equation of a Plane
1.2.5. The Cartesian Equation of a Plane
1.2.6. The Parametric Equations of a Plane
1.2.7. The Intersection of Two Planes
1.2.8. The Shortest Distance Between a Plane and a Point

1.3.1. Statements and Propositions
1.3.2. Universal and Existential Quantifiers
1.3.3. Formal and Informal Language

1.4.1. Sets and Functions
1.4.2. Surjections
1.4.3. Injections
1.4.4. Bijections

2.5.1. Cramer’s Rule for 2x2 Systems of Linear Equations
2.5.2. Cramer’s Rule for 3x3 Systems
2.5.3. The Determinant of an NxN Matrix
2.5.4. Finding Determinants Using Laplace Expansions
2.5.5. Basic Properties of Determinants
2.5.6. Further Properties of Determinants
2.5.7. Row and Column Operations on Determinants
2.5.8. Conditions When a Determinant Equals Zero
2.5.9. Finding Determinants Using Row and Column Operations
2.5.10. Partitioned and Block Matrices

2.6.1. Systems of Equations as Augmented Matrices
2.6.2. Row Echelon Form
2.6.3. Solving Systems of Equations Using Back Substitution
2.6.4. Elementary Row Operations
2.6.5. Creating Rows or Columns Containing Zeros Using Gaussian Elimination
2.6.6. Solving 2x2 Systems of Equations Using Gaussian Elimination
2.6.7. Solving 2x2 Singular Systems of Equations Using Gaussian Elimination
2.6.8. Solving 3x3 Systems of Equations Using Gaussian Elimination
2.6.9. Identifying the Pivot Columns of a Matrix
2.6.10. Solving 3x3 Singular Systems of Equations Using Gaussian Elimination
2.6.11. Reduced Row Echelon Form
2.6.12. Gaussian Elimination for NxM Systems of Equations

2.7.1. Elementary 2x2 Matrices
2.7.2. Row Operations on 2x2 Matrices as Products of Elementary Matrices
2.7.3. Elementary 3x3 Matrices
2.7.4. Row Operations on 3x3 Matrices as Products of Elementary Matrices

2.8.1. The Invertible Matrix Theorem in Terms of 2x2 Systems of Equations
2.8.2. Finding the Inverse of a 2x2 Matrix Using Row Operations
2.8.3. Finding the Inverse of a 3x3 Matrix Using Row Operations
2.8.4. Finding the Inverse of an NxN Square Matrix Using Row Operations
2.8.5. Matrices With Easy-to-Find Inverses

2.9.1. Triangular Matrices
2.9.2. LU Factorization of 2x2 Matrices
2.9.3. LU Factorization of 3x3 Matrices
2.9.4. LU Factorization of NxN Matrices
2.9.5. Solving Systems of Equations Using LU Factorization

3.10.1. Vectors in N-Dimensional Euclidean Space
3.10.2. Linear Combinations of Vectors in N-Dimensional Euclidean Space
3.10.3. Linear Span of Vectors in N-Dimensional Euclidean Space
3.10.4. Linear Dependence and Independence

3.11.1. Subspaces of N-Dimensional Space
3.11.2. Subspaces of N-Dimensional Space: Geometric Interpretation
3.11.3. The Column Space of a Matrix
3.11.4. The Null Space of a Matrix

3.12.1. Finding a Basis of a Span
3.12.2. Finding a Basis of the Column Space of a Matrix
3.12.3. Finding a Basis of the Null Space of a Matrix
3.12.4. Expressing the Coordinates of a Vector in a Given Basis
3.12.5. Writing Vectors in Different Bases
3.12.6. The Change-of-Coordinates Matrix
3.12.7. Changing a Basis Using the Change-of-Coordinates Matrix

3.13.1. The Dimension of a Span
3.13.2. The Rank of a Matrix
3.13.3. The Dimension of the Null Space of a Matrix
3.13.4. The Invertible Matrix Theorem in Terms of Dimension, Rank and Nullity
3.13.5. The Rank-Nullity Theorem

3.14.1. Introduction to Abstract Vector Spaces
3.14.2. Defining Abstract Vector Spaces
3.14.3. Linear Independence in Abstract Vector Spaces
3.14.4. Subspaces of Abstract Vector Spaces
3.14.5. Bases in Abstract Vector Spaces
3.14.6. The Coordinates of a Vector Relative to a Basis in Abstract Vector Spaces
3.14.7. Dimension in Abstract Vector Spaces

4.15.1. The Standard Matrix of a Linear Transformation in Terms of the Standard Basis
4.15.2. The Kernel of a Linear Transformation
4.15.3. The Image and Rank of a Linear Transformation
4.15.4. The Image of a Linear Transformation in the Cartesian Plane
4.15.5. The Invertible Matrix Theorem in Terms of Linear Transformations
4.15.6. The Rank-Nullity Theorem in Terms of Linear Transformations

4.16.1. The Matrix of a Linear Transformation Relative to a Basis
4.16.2. Connections Between Matrix Representations of a Linear Transformation
4.16.3. Linear Maps Between Two Different Vector Spaces
4.16.4. The Matrix of a Linear Map Relative to Two Bases

5.17.1. The Eigenvalues and Eigenvectors of a 2x2 Matrix
5.17.2. Calculating the Eigenvalues of a 2x2 Matrix
5.17.3. Calculating the Eigenvectors of a 2x2 Matrix
5.17.4. The Characteristic Equation of a Matrix
5.17.5. The Cayley-Hamilton Theorem and Its Applications
5.17.6. Calculating the Eigenvectors of a 3x3 Matrix With Distinct Eigenvalues
5.17.7. Calculating the Eigenvectors of a 3x3 Matrix in the General Case
5.17.8. The Invertible Matrix Theorem in Terms of Eigenvalues

5.18.1. Diagonalizing a 2x2 Matrix
5.18.2. Properties of Diagonalization
5.18.3. Diagonalizing a 3x3 Matrix With Distinct Eigenvalues
5.18.4. Diagonalizing a 3x3 Matrix in the General Case

5.19.1. Vectors Over the Complex Numbers
5.19.2. Matrices Over the Complex Numbers
5.19.3. Finding Complex Eigenvalues of Real 2x2 Matrices
5.19.4. Finding Complex Eigenvectors of Real 2x2 Matrices
5.19.5. Rotation-Scaling Matrices
5.19.6. Reducing Real 2x2 Matrices to Rotation-Scaling Form
5.19.7. Block Diagonalization of NxN Matrices

5.20.1. Nilpotent and Idempotent Matrices
5.20.2. Generalized Eigenvectors
5.20.3. Ranks of Generalized Eigenvectors
5.20.4. Finding Generalized Eigenvectors of Specific Ranks

5.21.1. Jordan Blocks and Jordan Matrices
5.21.2. Jordan Canonical Form of a 2x2 Matrix
5.21.3. Jordan Canonical Decomposition of a 2x2 Matrix
5.21.4. Jordan Canonical Form of a 3x3 Matrix
5.21.5. Jordan Canonical Decomposition of a 3x3 Matrix

6.22.1. The Dot Product in N-Dimensional Euclidean Space
6.22.2. The Norm of a Vector in N-Dimensional Euclidean Space
6.22.3. Inner Product Spaces
6.22.4. The Inner Product in Vector Spaces Over the Complex Numbers
6.22.5. The Norm of a Vector in Inner Product Spaces

6.23.1. Orthogonal Vectors in Euclidean Spaces
6.23.2. Orthogonal Vectors in Inner Product Spaces
6.23.3. The Cauchy-Schwarz Inequality and the Angle Between Two Vectors
6.23.4. The Pythagorean Theorem and the Triangle Inequality
6.23.5. Orthogonal Complements
6.23.6. Orthogonal Sets in Euclidean Spaces
6.23.7. Orthogonal Sets in Inner Product Spaces
6.23.8. Orthogonal Matrices and Linear Transformations
6.23.9. The Four Fundamental Subspaces of a Matrix

6.24.1. Projecting Vectors Onto One-Dimensional Subspaces
6.24.2. The Components of a Vector with Respect to an Orthogonal or Orthonormal Basis
6.24.3. Projecting Vectors Onto Subspaces in Euclidean Spaces (Orthogonal Bases)
6.24.4. Projecting Vectors Onto Subspaces in Euclidean Spaces (Arbitrary Bases)
6.24.5. Projecting Vectors Onto Subspaces in Euclidean Spaces (Arbitrary Bases): Applications
6.24.6. Projection Matrices, Linear Transformations and Their Properties
6.24.7. Projecting Vectors Onto Subspaces in Inner Product Spaces

6.25.1. The Gram-Schmidt Process for Two Vectors
6.25.2. The Gram-Schmidt Process in the General Case
6.25.3. QR Factorization

7.26.1. Symmetric Matrices
7.26.2. Diagonalization of 2x2 Symmetric Matrices
7.26.3. Diagonalization of 3x3 Symmetric Matrices
7.26.4. The Spectral Theorem

7.27.1. Bilinear Forms
7.27.2. Quadratic Forms
7.27.3. Change of Variables in Quadratic Forms
7.27.4. Finding the Canonical Form of a Quadratic Form Using Lagrange's Method
7.27.5. Finding the Canonical Form of a Quadratic Form Using Orthogonal Transformations
7.27.6. Positive-Definite and Negative-Definite Quadratic Forms

7.28.1. Reducing a Quadratic Curve to Its Principal Axes
7.28.2. Classifying Quadratic Curves

7.29.1. Constrained Optimization of Quadratic Forms
7.29.2. The Singular Values of a Matrix
7.29.3. Computing the Singular Values of a Matrix
7.29.4. Singular Value Decomposition of 2x2 Matrices
7.29.5. Singular Value Decomposition of 2x2 Matrices With Zero or Repeated Eigenvalues
7.29.6. Singular Value Decomposition of Larger Matrices
7.29.7. Singular Value Decomposition and the Pseudoinverse Matrix

8.30.1. The Least-Squares Solution of a Linear System (Without Collinearity)
8.30.2. The Least-Squares Solution of a Linear System (With Collinearity)
8.30.3. Finding a Least-Squares Solution Using QR Factorization
8.30.4. Weighted Least-Squares

8.31.1. Markov Chains
8.31.2. Steady-State Vectors

8.32.1. Linear Regression
8.32.2. Polynomial Regression
8.32.3. Multiple Linear Regression