
Diagonal matrix






From Wikipedia, the free encyclopedia
 


In linear algebra, a diagonal matrix is a matrix in which the entries outside the main diagonal are all zero; the term usually refers to square matrices. Elements of the main diagonal can either be zero or nonzero. An example of a 2×2 diagonal matrix is $\begin{bmatrix} 3 & 0 \\ 0 & 2 \end{bmatrix}$, while an example of a 3×3 diagonal matrix is $\begin{bmatrix} 6 & 0 & 0 \\ 0 & 5 & 0 \\ 0 & 0 & 4 \end{bmatrix}$. An identity matrix of any size, or any multiple of it, is a diagonal matrix known as a scalar matrix; for example, $\begin{bmatrix} 0.5 & 0 \\ 0 & 0.5 \end{bmatrix}$. In geometry, a diagonal matrix may be used as a scaling matrix, since matrix multiplication with it results in changing scale (size) and possibly also shape; only a scalar matrix results in uniform change in scale.

Definition

As stated above, a diagonal matrix is a matrix in which all off-diagonal entries are zero. That is, the matrix $D = (d_{i,j})$ with $n$ columns and $n$ rows is diagonal if

$$\forall i, j \in \{1, 2, \ldots, n\},\quad i \neq j \implies d_{i,j} = 0.$$

However, the main diagonal entries are unrestricted.
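This condition is easy to check numerically; the following is a minimal NumPy sketch (the helper name is_diagonal is ours, not a standard API):

```python
import numpy as np

def is_diagonal(D: np.ndarray) -> bool:
    """Return True if every off-diagonal entry of the square matrix D is zero."""
    off_diagonal = ~np.eye(D.shape[0], dtype=bool)   # mask of entries with i != j
    return bool(np.all(D[off_diagonal] == 0))

print(is_diagonal(np.array([[3, 0], [0, 2]])))   # True
print(is_diagonal(np.array([[3, 1], [0, 2]])))   # False
```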

The term diagonal matrix may sometimes refer to a rectangular diagonal matrix, which is an m-by-n matrix with all the entries not of the form $d_{i,i}$ being zero. For example:

$$\begin{bmatrix} 1 & 0 & 0 \\ 0 & 4 & 0 \\ 0 & 0 & -3 \\ 0 & 0 & 0 \end{bmatrix} \quad \text{or} \quad \begin{bmatrix} 1 & 0 & 0 & 0 & 0 \\ 0 & 4 & 0 & 0 & 0 \\ 0 & 0 & -3 & 0 & 0 \end{bmatrix}$$

More often, however, diagonal matrix refers to square matrices, which can be specified explicitly as a square diagonal matrix. A square diagonal matrix is a symmetric matrix, so this can also be called a symmetric diagonal matrix.

The following matrix is a square diagonal matrix:

$$\begin{bmatrix} 1 & 0 & 0 \\ 0 & 4 & 0 \\ 0 & 0 & -2 \end{bmatrix}$$

If the entries are real numbers or complex numbers, then it is a normal matrix as well.

In the remainder of this article we will consider only square diagonal matrices, and refer to them simply as "diagonal matrices".

Vector-to-matrix diag operator

A diagonal matrix $D$ can be constructed from a vector $\mathbf{a} = \begin{bmatrix} a_1 & \cdots & a_n \end{bmatrix}^\mathsf{T}$ using the $\operatorname{diag}$ operator:

$$D = \operatorname{diag}(a_1, \ldots, a_n)$$

This may be written more compactly as $D = \operatorname{diag}(\mathbf{a})$.

The same operator is also used to represent block diagonal matrices as $A = \operatorname{diag}(A_1, \ldots, A_n)$, where each argument $A_i$ is a matrix.

The $\operatorname{diag}$ operator may be written as

$$\operatorname{diag}(\mathbf{a}) = \left(\mathbf{a} \mathbf{1}^\mathsf{T}\right) \circ I,$$

where $\circ$ represents the Hadamard product and $\mathbf{1}$ is a constant vector with elements 1.
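In NumPy, for example, np.diag implements this vector-to-matrix direction, and the Hadamard-product identity above can be verified directly (a minimal sketch; all variable names are ours):

```python
import numpy as np

a = np.array([4.0, 5.0, 6.0])
D = np.diag(a)                             # builds diag(a1, ..., an) from a vector

# diag(a) = (a 1^T) ∘ I, with ∘ the entrywise (Hadamard) product
D2 = np.outer(a, np.ones_like(a)) * np.eye(len(a))
assert np.allclose(D, D2)
```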

Matrix-to-vector diag operator

The inverse matrix-to-vector $\operatorname{diag}$ operator is sometimes denoted by the identically named $\operatorname{diag}(D) = \begin{bmatrix} a_1 & \cdots & a_n \end{bmatrix}^\mathsf{T}$, where the argument is now a matrix and the result is a vector of its diagonal entries.

The following property holds:

$$\operatorname{diag}(AB) = \sum_j \left(A \circ B^\mathsf{T}\right)_{ij} = \left(A \circ B^\mathsf{T}\right) \mathbf{1}$$
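Handed a matrix instead of a vector, NumPy's np.diag also implements this matrix-to-vector direction, which makes the property above easy to verify numerically (a minimal sketch):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

lhs = np.diag(A @ B)            # vector of diagonal entries of the product AB
rhs = (A * B.T) @ np.ones(4)    # (A ∘ B^T) 1, i.e. the row sums of A ∘ B^T
assert np.allclose(lhs, rhs)
```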

Scalar matrix

A diagonal matrix with equal diagonal entries is a scalar matrix; that is, a scalar multiple λ of the identity matrix I. Its effect on a vector is scalar multiplication by λ. For example, a 3×3 scalar matrix has the form:

$$\begin{bmatrix} \lambda & 0 & 0 \\ 0 & \lambda & 0 \\ 0 & 0 & \lambda \end{bmatrix} \equiv \lambda I_3$$

The scalar matrices are the center of the algebra of matrices: that is, they are precisely the matrices that commute with all other square matrices of the same size.[a] By contrast, over a field (like the real numbers), a diagonal matrix with all diagonal elements distinct only commutes with diagonal matrices (its centralizer is the set of diagonal matrices). That is because if a diagonal matrix $D = \operatorname{diag}(a_1, \ldots, a_n)$ has $a_i \neq a_j$, then given a matrix $M$ with $m_{ij} \neq 0$, the $(i,j)$ terms of the products are $(DM)_{ij} = a_i m_{ij}$ and $(MD)_{ij} = m_{ij} a_j$, and $a_j m_{ij} \neq m_{ij} a_i$ (since one can divide by $m_{ij}$), so they do not commute unless the off-diagonal terms are zero.[b] Diagonal matrices where the diagonal entries are not all equal or all distinct have centralizers intermediate between the whole space and only diagonal matrices.[1]
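Both commutation claims can be spot-checked numerically (a small sketch with arbitrarily chosen matrices):

```python
import numpy as np

M = np.array([[1.0, 2.0],
              [3.0, 4.0]])

S = 5.0 * np.eye(2)                    # scalar matrix: commutes with every M
assert np.allclose(S @ M, M @ S)

D = np.diag([1.0, 2.0])                # distinct diagonal entries
assert not np.allclose(D @ M, M @ D)   # fails to commute when M has off-diagonal terms
```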

For an abstract vector space V (rather than the concrete vector space $K^n$), the analog of scalar matrices are scalar transformations. This is true more generally for a module M over a ring R, with the endomorphism algebra End(M) (algebra of linear operators on M) replacing the algebra of matrices. Formally, scalar multiplication is a linear map, inducing a map $R \to \operatorname{End}(M)$ (from a scalar λ to its corresponding scalar transformation, multiplication by λ) exhibiting End(M) as an R-algebra. For vector spaces, the scalar transforms are exactly the center of the endomorphism algebra, and, similarly, scalar invertible transforms are the center of the general linear group GL(V). The former is more generally true of free modules $M \cong R^n$, for which the endomorphism algebra is isomorphic to a matrix algebra.

Vector operations

Multiplying a vector by a diagonal matrix multiplies each of the terms by the corresponding diagonal entry. Given a diagonal matrix $D = \operatorname{diag}(a_1, \ldots, a_n)$ and a vector $\mathbf{v} = \begin{bmatrix} x_1 & \cdots & x_n \end{bmatrix}^\mathsf{T}$, the product is:

$$D\mathbf{v} = \operatorname{diag}(a_1, \ldots, a_n) \begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix} = \begin{bmatrix} a_1 x_1 \\ \vdots \\ a_n x_n \end{bmatrix}$$

This can be expressed more compactly by using a vector instead of a diagonal matrix, $\mathbf{a} = \begin{bmatrix} a_1 & \cdots & a_n \end{bmatrix}^\mathsf{T}$, and taking the Hadamard product of the vectors (entrywise product), denoted $\mathbf{a} \circ \mathbf{v}$:

$$D\mathbf{v} = \mathbf{a} \circ \mathbf{v} = \begin{bmatrix} a_1 x_1 \\ \vdots \\ a_n x_n \end{bmatrix}$$

This is mathematically equivalent, but avoids storing all the zero terms of this sparse matrix. This product is thus used in machine learning, such as computing products of derivatives in backpropagation or multiplying IDF weights in TF-IDF,[2] since some BLAS frameworks, which multiply matrices efficiently, do not include Hadamard product capability directly.[3]
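In NumPy, for instance, the * operator on arrays is exactly this entrywise product, so the diagonal matrix never needs to be materialized (a minimal sketch):

```python
import numpy as np

a = np.array([2.0, 3.0, 4.0])        # diagonal entries of D
v = np.array([1.0, 10.0, 100.0])

# D v equals the Hadamard product a ∘ v; the right-hand side never builds D
assert np.allclose(np.diag(a) @ v, a * v)
```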

Matrix operations

The operations of matrix addition and matrix multiplication are especially simple for diagonal matrices. Write diag(a1, ..., an) for a diagonal matrix whose diagonal entries starting in the upper left corner are a1, ..., an. Then, for addition, we have

$$\operatorname{diag}(a_1, \ldots, a_n) + \operatorname{diag}(b_1, \ldots, b_n) = \operatorname{diag}(a_1 + b_1, \ldots, a_n + b_n)$$

and for matrix multiplication,

$$\operatorname{diag}(a_1, \ldots, a_n) \cdot \operatorname{diag}(b_1, \ldots, b_n) = \operatorname{diag}(a_1 b_1, \ldots, a_n b_n).$$

The diagonal matrix diag(a1, ..., an) is invertible if and only if the entries a1, ..., an are all nonzero. In this case, we have

$$\operatorname{diag}(a_1, \ldots, a_n)^{-1} = \operatorname{diag}(a_1^{-1}, \ldots, a_n^{-1}).$$

In particular, the diagonal matrices form a subring of the ring of all n-by-n matrices.

Multiplying an n-by-n matrix A from the left with diag(a1, ..., an) amounts to multiplying the i-th row of A by ai for all i; multiplying the matrix A from the right with diag(a1, ..., an) amounts to multiplying the i-th column of A by ai for all i.
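With broadcasting, the same row and column scalings can be done without forming the diagonal matrix at all (a sketch):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
a = np.array([2.0, 3.0, 5.0])

assert np.allclose(np.diag(a) @ A, a[:, None] * A)   # left multiply: row i scaled by a_i
assert np.allclose(A @ np.diag(a), A * a[None, :])   # right multiply: column i scaled by a_i
```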

Operator matrix in eigenbasis

As explained in determining coefficients of operator matrix, there is a special basis, e1, ..., en, for which the matrix $A$ takes the diagonal form. Hence, in the defining equation $A \mathbf{e}_j = \sum_i a_{i,j} \mathbf{e}_i$, all coefficients $a_{i,j}$ with $i \neq j$ are zero, leaving only one term per sum. The surviving diagonal elements, $a_{i,i}$, are known as eigenvalues and designated with $\lambda_i$ in the equation, which reduces to $A \mathbf{e}_i = \lambda_i \mathbf{e}_i$. The resulting equation is known as the eigenvalue equation[4] and used to derive the characteristic polynomial and, further, eigenvalues and eigenvectors.

In other words, the eigenvalues of diag(λ1, ..., λn) are λ1, ..., λn with associated eigenvectors e1, ..., en.
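For a diagonal matrix, a numerical eigensolver simply recovers the diagonal entries as eigenvalues (a minimal sketch):

```python
import numpy as np

lam = np.array([3.0, 1.0, 2.0])
D = np.diag(lam)
eigvals, eigvecs = np.linalg.eig(D)

assert np.allclose(np.sort(eigvals), np.sort(lam))   # eigenvalues are the diagonal entries
assert np.allclose(D @ eigvecs, eigvecs * eigvals)   # each column satisfies D e = lambda e
```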

Properties

The determinant of diag(a1, ..., an) is the product $a_1 a_2 \cdots a_n$. The adjugate of a diagonal matrix is again diagonal. A square matrix is diagonal if and only if it is both upper and lower triangular, and every diagonal matrix is symmetric.

Applications

Diagonal matrices occur in many areas of linear algebra. Because of the simple description of the matrix operation and eigenvalues/eigenvectors given above, it is typically desirable to represent a given matrix or linear map by a diagonal matrix.

In fact, a given n-by-n matrix A is similar to a diagonal matrix (meaning that there is a matrix X such that $X^{-1}AX$ is diagonal) if and only if it has n linearly independent eigenvectors. Such matrices are said to be diagonalizable.

Over the field of real or complex numbers, more is true. The spectral theorem says that every normal matrix is unitarily similar to a diagonal matrix (if $AA^* = A^*A$ then there exists a unitary matrix U such that $U^*AU$ is diagonal). Furthermore, the singular value decomposition implies that for any matrix A, there exist unitary matrices U and V such that $U^*AV$ is diagonal with nonnegative entries.
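Both statements are easy to illustrate numerically; here is a sketch for a real matrix, where unitary matrices are orthogonal and the conjugate transpose is just the transpose:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))

# Spectral theorem for a symmetric (hence normal) real matrix: U^T S U is diagonal
S = A + A.T
w, U = np.linalg.eigh(S)
assert np.allclose(U.T @ S @ U, np.diag(w))

# SVD of an arbitrary A: U^T A V is diagonal with nonnegative entries
U2, s, Vh = np.linalg.svd(A)
assert np.allclose(U2.T @ A @ Vh.T, np.diag(s))
```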

Operator theory

In operator theory, particularly the study of PDEs, operators are particularly easy to understand and PDEs easy to solve if the operator is diagonal with respect to the basis with which one is working; this corresponds to a separable partial differential equation. Therefore, a key technique to understanding operators is a change of coordinates (in the language of operators, an integral transform) which changes the basis to an eigenbasis of eigenfunctions, making the equation separable. An important example of this is the Fourier transform, which diagonalizes constant coefficient differentiation operators (or more generally translation invariant operators), such as the Laplacian operator, say, in the heat equation.

Especially easy are multiplication operators, which are defined as multiplication by (the values of) a fixed function; the values of the function at each point correspond to the diagonal entries of a matrix.
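A discrete analogue of the Fourier-transform example: the DFT diagonalizes circulant matrices, such as the periodic second-difference (discrete Laplacian) matrix. A minimal NumPy sketch, with the eigenvalues given by the FFT of the first column:

```python
import numpy as np

n = 8
c = np.zeros(n)
c[0], c[1], c[-1] = -2.0, 1.0, 1.0     # first column of the periodic second-difference matrix
C = np.array([[c[(j - k) % n] for k in range(n)] for j in range(n)])   # circulant

F = np.fft.fft(np.eye(n))              # DFT matrix
D = F @ C @ np.linalg.inv(F)           # change of basis to the Fourier eigenbasis
assert np.allclose(D, np.diag(np.fft.fft(c)))
```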

See also

  • Banded matrix
  • Bidiagonal matrix
  • Diagonally dominant matrix
  • Diagonalizable matrix
  • Jordan normal form
  • Multiplication operator
  • Tridiagonal matrix
  • Toeplitz matrix
  • Toral Lie algebra
  • Circulant matrix
Notes

  a. ^ Proof: given the elementary matrix $e_{ij}$, $Me_{ij}$ is the matrix with only the i-th column of M (moved to column j) and $e_{ij}M$ is the matrix with only the j-th row of M (moved to row i); equating them forces the non-diagonal entries to be zero, and the i-th diagonal entry to equal the j-th diagonal entry.
  b. ^ Over more general rings, this does not hold, because one cannot always divide.
References

  1. ^ "Do Diagonal Matrices Always Commute?". Stack Exchange. March 15, 2016. Retrieved August 4, 2018.
  2. ^ Sahami, Mehran (2009-06-15). Text Mining: Classification, Clustering, and Applications. CRC Press. p. 14. ISBN 9781420059458.
  3. ^ "Element-wise vector-vector multiplication in BLAS?". stackoverflow.com. 2011-10-01. Retrieved 2020-08-30.
  4. ^ Nearing, James (2010). "Chapter 7.9: Eigenvalues and Eigenvectors" (PDF). Mathematical Tools for Physics. ISBN 978-0486482125. Retrieved January 1, 2012.