    Determinant

    07-09-2022 || 18:09
    Tags: #linear-algebra

    determinant

The determinant of a square matrix $A$, written $\det(A)$, is equal to the product of all of $A$'s eigenvalues.

The absolute value of the determinant measures how much multiplication by the matrix expands or contracts space: it is the factor by which volumes are scaled under the transformation.
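As a quick sanity check (a sketch, not part of the original note), the NumPy snippet below verifies both claims on an arbitrary random $3 \times 3$ matrix: the determinant matches the product of the eigenvalues, and its absolute value is the factor by which the volume of the unit cube is scaled.

```python
import numpy as np

# Illustrative random square matrix (seeded for reproducibility)
rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3))

# Claim 1: det(A) equals the product of A's eigenvalues.
det_A = np.linalg.det(A)
eigenvalues = np.linalg.eigvals(A)          # may be complex for a non-symmetric A
prod_eig = np.prod(eigenvalues).real        # imaginary parts cancel for a real matrix

print(det_A)       # determinant computed directly
print(prod_eig)    # product of eigenvalues -- matches up to floating-point error

# Claim 2: |det(A)| is the volume-scaling factor.
# The unit cube (volume 1) is mapped by A to a parallelepiped
# whose volume is |det(A)|.
unit_cube_volume = 1.0
print(abs(det_A) * unit_cube_volume)        # volume of the image of the unit cube
```

If $\det(A) = 0$, space is collapsed along at least one dimension, and if $|\det(A)| = 1$, the transformation preserves volume.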


    References

    • Deep Learning by Ian Goodfellow