
Symmetric matrix






 


















[Figure: Symmetry of a 5×5 matrix]

In linear algebra, a symmetric matrix is a square matrix that is equal to its transpose. Formally,

$$A \text{ is symmetric} \iff A = A^{\mathsf{T}}.$$

Because equal matrices have equal dimensions, only square matrices can be symmetric.

The entries of a symmetric matrix are symmetric with respect to the main diagonal. So if $a_{ij}$ denotes the entry in the $i$th row and $j$th column, then

$$A \text{ is symmetric} \iff a_{ji} = a_{ij}$$

for all indices $i$ and $j$.

Every square diagonal matrix is symmetric, since all off-diagonal elements are zero. Similarly, in characteristic different from 2, each diagonal element of a skew-symmetric matrix must be zero, since each is its own negative.

In linear algebra, a real symmetric matrix represents a self-adjoint operator[1] expressed in an orthonormal basis over a real inner product space. The corresponding object for a complex inner product space is a Hermitian matrix, with complex-valued entries, which is equal to its conjugate transpose. Therefore, in linear algebra over the complex numbers, it is often assumed that a symmetric matrix refers to one with real-valued entries. Symmetric matrices appear naturally in a variety of applications, and typical numerical linear algebra software makes special accommodations for them.

Example

The following $3 \times 3$ matrix $A$ is symmetric:

$$A = \begin{bmatrix} 1 & 7 & 3 \\ 7 & 4 & 5 \\ 3 & 5 & 2 \end{bmatrix},$$

since $A = A^{\mathsf{T}}$.
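A quick numerical check of this example (a minimal NumPy sketch):

```python
import numpy as np

# The symmetric matrix from the example above.
A = np.array([[1, 7, 3],
              [7, 4, 5],
              [3, 5, 2]])

# A matrix is symmetric exactly when it equals its transpose.
print(np.array_equal(A, A.T))  # True
```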

Properties

Basic properties

  • The sum and difference of two symmetric matrices is symmetric.
  • This is not always true for the product: given symmetric matrices $A$ and $B$, the product $AB$ is symmetric if and only if $A$ and $B$ commute, i.e., if $AB = BA$ (see the sketch below).
  • For any integer $n$, $A^n$ is symmetric if $A$ is symmetric.
  • If $A^{-1}$ exists, it is symmetric if and only if $A$ is symmetric.
  • The rank of a symmetric matrix $A$ is equal to the number of non-zero eigenvalues of $A$.
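For example, the product rule in the list above can be checked numerically (a minimal NumPy sketch with arbitrarily chosen matrices):

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 3.]])   # symmetric
B = np.array([[0., 5.],
              [5., 1.]])   # symmetric

AB = A @ B
# The product of two symmetric matrices need not be symmetric:
print(np.allclose(AB, AB.T))       # False
# consistent with the fact that A and B do not commute:
print(np.allclose(A @ B, B @ A))   # False
```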

Decomposition into symmetric and skew-symmetric

Any square matrix can uniquely be written as the sum of a symmetric and a skew-symmetric matrix. This decomposition is known as the Toeplitz decomposition. Let $\mathrm{Mat}_n$ denote the space of $n \times n$ matrices. If $\mathrm{Sym}_n$ denotes the space of $n \times n$ symmetric matrices and $\mathrm{Skew}_n$ the space of $n \times n$ skew-symmetric matrices, then $\mathrm{Mat}_n = \mathrm{Sym}_n + \mathrm{Skew}_n$ and $\mathrm{Sym}_n \cap \mathrm{Skew}_n = \{0\}$, i.e.

$$\mathrm{Mat}_n = \mathrm{Sym}_n \oplus \mathrm{Skew}_n,$$

where $\oplus$ denotes the direct sum. Let $X \in \mathrm{Mat}_n$; then

$$X = \tfrac{1}{2}\left(X + X^{\mathsf{T}}\right) + \tfrac{1}{2}\left(X - X^{\mathsf{T}}\right).$$

Notice that $\tfrac{1}{2}\left(X + X^{\mathsf{T}}\right) \in \mathrm{Sym}_n$ and $\tfrac{1}{2}\left(X - X^{\mathsf{T}}\right) \in \mathrm{Skew}_n$. This is true for every square matrix $X$ with entries from any field whose characteristic is different from 2.

A symmetric $n \times n$ matrix is determined by $\tfrac{1}{2}n(n+1)$ scalars (the number of entries on or above the main diagonal). Similarly, a skew-symmetric matrix is determined by $\tfrac{1}{2}n(n-1)$ scalars (the number of entries above the main diagonal).
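In code the decomposition is immediate (a minimal NumPy sketch; `toeplitz_decomposition` is an illustrative helper name, not a library routine):

```python
import numpy as np

def toeplitz_decomposition(X):
    """Split a square matrix into its symmetric and skew-symmetric parts."""
    sym = (X + X.T) / 2
    skew = (X - X.T) / 2
    return sym, skew

X = np.array([[1., 2.],
              [3., 4.]])
S, K = toeplitz_decomposition(X)
print(np.allclose(S, S.T))     # True: symmetric part
print(np.allclose(K, -K.T))    # True: skew-symmetric part
print(np.allclose(S + K, X))   # True: the parts sum back to X
```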

Matrix congruent to a symmetric matrix

Any matrix congruent to a symmetric matrix is again symmetric: if $X$ is a symmetric matrix, then so is $A X A^{\mathsf{T}}$ for any matrix $A$.
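A quick numerical confirmation (a sketch with a random symmetric $X$ and an arbitrary $A$):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((3, 3))
X = X + X.T                      # make X symmetric
A = rng.standard_normal((3, 3))  # arbitrary matrix

C = A @ X @ A.T                  # congruent to X
print(np.allclose(C, C.T))       # True: congruence preserves symmetry
```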

Symmetry implies normality

A (real-valued) symmetric matrix is necessarily a normal matrix, since it commutes with its own transpose: $A A^{\mathsf{T}} = A^2 = A^{\mathsf{T}} A$.

Real symmetric matrices

Denote by $\langle \cdot, \cdot \rangle$ the standard inner product on $\mathbb{R}^n$. The real $n \times n$ matrix $A$ is symmetric if and only if

$$\langle A x, y \rangle = \langle x, A y \rangle \quad \text{for all } x, y \in \mathbb{R}^n.$$

Since this definition is independent of the choice of basis, symmetry is a property that depends only on the linear operator $A$ and a choice of inner product. This characterization of symmetry is useful, for example, in differential geometry, since each tangent space to a manifold may be endowed with an inner product, giving rise to what is called a Riemannian manifold. Another area where this formulation is used is in Hilbert spaces.

The finite-dimensional spectral theorem says that any symmetric matrix whose entries are real can be diagonalized by an orthogonal matrix. More explicitly: for every real symmetric matrix $A$ there exists a real orthogonal matrix $Q$ such that $D = Q^{\mathsf{T}} A Q$ is a diagonal matrix. Every real symmetric matrix is thus, up to choice of an orthonormal basis, a diagonal matrix.
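Numerically, this diagonalization is what `numpy.linalg.eigh` computes for symmetric input (a minimal sketch):

```python
import numpy as np

A = np.array([[2., 1., 0.],
              [1., 3., 1.],
              [0., 1., 2.]])     # real symmetric

# eigh returns real eigenvalues and an orthogonal matrix Q of eigenvectors.
eigvals, Q = np.linalg.eigh(A)

D = Q.T @ A @ Q
print(np.allclose(D, np.diag(eigvals)))   # True: D is diagonal
print(np.allclose(Q.T @ Q, np.eye(3)))    # True: Q is orthogonal
```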

If $A$ and $B$ are $n \times n$ real symmetric matrices that commute, then they can be simultaneously diagonalized by an orthogonal matrix:[2] there exists a basis of $\mathbb{R}^n$ such that every element of the basis is an eigenvector for both $A$ and $B$.

Every real symmetric matrix is Hermitian, and therefore all its eigenvalues are real. (In fact, the eigenvalues are the entries in the diagonal matrix $D$ (above), and therefore $D$ is uniquely determined by $A$ up to the order of its entries.) Essentially, the property of being symmetric for real matrices corresponds to the property of being Hermitian for complex matrices.

Complex symmetric matrices

A complex symmetric matrix can be 'diagonalized' using a unitary matrix: thus if $A$ is a complex symmetric matrix, there is a unitary matrix $U$ such that $U A U^{\mathsf{T}}$ is a real diagonal matrix with non-negative entries. This result is referred to as the Autonne–Takagi factorization. It was originally proved by Léon Autonne (1915) and Teiji Takagi (1925) and rediscovered with different proofs by several other mathematicians.[3][4]

In fact, the matrix $B = A^{\dagger} A$ is Hermitian and positive semi-definite, so there is a unitary matrix $V$ such that $V^{\dagger} B V$ is diagonal with non-negative real entries. Thus $C = V^{\mathsf{T}} A V$ is complex symmetric with $C^{\dagger} C$ real. Writing $C = X + iY$ with $X$ and $Y$ real symmetric matrices, $C^{\dagger} C = X^2 + Y^2 + i(XY - YX)$. Thus $XY = YX$. Since $X$ and $Y$ commute, there is a real orthogonal matrix $W$ such that both $W X W^{\mathsf{T}}$ and $W Y W^{\mathsf{T}}$ are diagonal. Setting $U = W V^{\mathsf{T}}$ (a unitary matrix), the matrix $U A U^{\mathsf{T}}$ is complex diagonal. Pre-multiplying $U$ by a suitable diagonal unitary matrix (which preserves unitarity of $U$), the diagonal entries of $U A U^{\mathsf{T}}$ can be made to be real and non-negative as desired. To construct this matrix, we express the diagonal matrix as $U A U^{\mathsf{T}} = \operatorname{diag}(r_1 e^{i\theta_1}, r_2 e^{i\theta_2}, \dots, r_n e^{i\theta_n})$. The matrix we seek is simply given by $D = \operatorname{diag}(e^{-i\theta_1/2}, e^{-i\theta_2/2}, \dots, e^{-i\theta_n/2})$. Clearly $D U A U^{\mathsf{T}} D = \operatorname{diag}(r_1, r_2, \dots, r_n)$ as desired, so we make the modification $U' = D U$. Since their squares are the eigenvalues of $A^{\dagger} A$, the diagonal entries $r_i$ coincide with the singular values of $A$. (Note that, regarding the eigen-decomposition of a complex symmetric matrix $A$, the Jordan normal form of $A$ may not be diagonal, therefore $A$ may not be diagonalized by any similarity transformation.)
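The factorization can be computed via the SVD. The sketch below uses the phase-matching idea from the argument above and assumes the singular values of $A$ are distinct (which makes the diagonal phase matrix unique); `takagi` is an illustrative helper, not a library routine:

```python
import numpy as np

def takagi(A):
    """Autonne–Takagi factorization A = U_t @ diag(s) @ U_t.T of a complex
    symmetric matrix A, assuming distinct singular values."""
    U, s, Vh = np.linalg.svd(A)
    # For complex symmetric A, conj(V) = U @ P for a diagonal unitary P;
    # absorbing a square root of P into U symmetrizes the SVD.
    p = np.diag(U.conj().T @ Vh.T)               # diagonal phase mismatch
    U_t = U @ np.diag(np.exp(0.5j * np.angle(p)))
    return U_t, s

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = M + M.T                                      # complex symmetric test matrix

U_t, s = takagi(A)
print(np.allclose(U_t @ np.diag(s) @ U_t.T, A))  # True
```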

Decomposition

Using the Jordan normal form, one can prove that every square real matrix can be written as a product of two real symmetric matrices, and every square complex matrix can be written as a product of two complex symmetric matrices.[5]

Every real non-singular matrix can be uniquely factored as the product of an orthogonal matrix and a symmetric positive definite matrix, which is called a polar decomposition. Singular matrices can also be factored, but not uniquely.
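SciPy exposes this factorization directly (a minimal sketch using `scipy.linalg.polar`):

```python
import numpy as np
from scipy.linalg import polar

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))           # almost surely non-singular

W, P = polar(A)                           # A = W @ P
print(np.allclose(W @ P, A))              # True
print(np.allclose(W.T @ W, np.eye(3)))    # True: W is orthogonal
print(np.allclose(P, P.T))                # True: P is symmetric positive definite
```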

Cholesky decomposition states that every real positive-definite symmetric matrix $A$ is the product of a lower-triangular matrix $L$ and its transpose, $A = L L^{\mathsf{T}}$.
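In NumPy (a minimal example; `numpy.linalg.cholesky` returns the lower-triangular factor):

```python
import numpy as np

A = np.array([[4., 2.],
              [2., 3.]])          # symmetric positive definite

L = np.linalg.cholesky(A)
print(np.allclose(L @ L.T, A))    # True
```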

If the matrix is symmetric indefinite, it may still be decomposed as $P A P^{\mathsf{T}} = L D L^{\mathsf{T}}$, where $P$ is a permutation matrix (arising from the need to pivot), $L$ a lower unit triangular matrix, and $D$ is a direct sum of symmetric $1 \times 1$ and $2 \times 2$ blocks; this is called the Bunch–Kaufman decomposition.[6]
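SciPy's `scipy.linalg.ldl` computes a factorization of this kind, returning the (permuted) unit lower-triangular factor and the block-diagonal matrix (a minimal sketch):

```python
import numpy as np
from scipy.linalg import ldl

A = np.array([[ 1.,  2.,  0.],
              [ 2., -1.,  3.],
              [ 0.,  3., -2.]])   # symmetric but indefinite

# lu carries the permuted unit lower-triangular factor;
# d is block diagonal with symmetric 1x1 and 2x2 blocks.
lu, d, perm = ldl(A)
print(np.allclose(lu @ d @ lu.T, A))   # True
```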

A general (complex) symmetric matrix may be defective and thus not be diagonalizable. If $A$ is diagonalizable it may be decomposed as $A = Q \Lambda Q^{\mathsf{T}}$, where $Q$ is an orthogonal matrix ($Q Q^{\mathsf{T}} = I$) and $\Lambda$ is a diagonal matrix of the eigenvalues of $A$. In the special case that $A$ is real symmetric, then $Q$ and $\Lambda$ are also real. To see orthogonality, suppose $\mathbf{x}$ and $\mathbf{y}$ are eigenvectors corresponding to distinct eigenvalues $\lambda_1$, $\lambda_2$. Then

$$\lambda_1 \langle \mathbf{x}, \mathbf{y} \rangle = \langle A \mathbf{x}, \mathbf{y} \rangle = \langle \mathbf{x}, A \mathbf{y} \rangle = \lambda_2 \langle \mathbf{x}, \mathbf{y} \rangle.$$

Since $\lambda_1$ and $\lambda_2$ are distinct, we have $\langle \mathbf{x}, \mathbf{y} \rangle = 0$.

Hessian

Symmetric $n \times n$ matrices of real functions appear as the Hessians of twice differentiable functions of $n$ real variables (the continuity of the second derivative is not needed, despite common belief to the contrary[7]).

Every quadratic form $q$ on $\mathbb{R}^n$ can be uniquely written in the form $q(\mathbf{x}) = \mathbf{x}^{\mathsf{T}} A \mathbf{x}$ with a symmetric $n \times n$ matrix $A$. Because of the above spectral theorem, one can then say that every quadratic form, up to the choice of an orthonormal basis of $\mathbb{R}^n$, "looks like"

$$q(x_1, \dots, x_n) = \sum_{i=1}^{n} \lambda_i x_i^2$$

with real numbers $\lambda_i$. This considerably simplifies the study of quadratic forms, as well as the study of the level sets $\{\mathbf{x} : q(\mathbf{x}) = c\}$, which are generalizations of conic sections.
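A small NumPy check of this change of coordinates (a sketch with an arbitrarily chosen symmetric $A$):

```python
import numpy as np

# Quadratic form q(x) = x^T A x with a symmetric A.
A = np.array([[2., 1.],
              [1., 2.]])

eigvals, Q = np.linalg.eigh(A)   # orthonormal eigenbasis of A

x = np.array([1., 2.])
y = Q.T @ x                      # coordinates of x in the eigenbasis

# In the new coordinates the form is a weighted sum of squares.
print(np.isclose(x @ A @ x, np.sum(eigvals * y**2)))   # True
```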

This is important partly because the second-order behavior of every smooth multi-variable function is described by the quadratic form belonging to the function's Hessian; this is a consequence of Taylor's theorem.

Symmetrizable matrix

An $n \times n$ matrix $A$ is said to be symmetrizable if there exists an invertible diagonal matrix $D$ and a symmetric matrix $S$ such that $A = DS$.

The transpose of a symmetrizable matrix is symmetrizable, since $A^{\mathsf{T}} = (DS)^{\mathsf{T}} = SD = D^{-1}(DSD)$ and $DSD$ is symmetric. A matrix $A = (a_{ij})$ is symmetrizable if and only if the following conditions are met (a numerical spot-check follows the list):

  1. $a_{ij} = 0$ implies $a_{ji} = 0$ for all $1 \le i \le j \le n$,
  2. $a_{i_1 i_2} a_{i_2 i_3} \cdots a_{i_k i_1} = a_{i_2 i_1} a_{i_3 i_2} \cdots a_{i_1 i_k}$ for any finite sequence $(i_1, i_2, \dots, i_k)$.
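A quick numerical illustration (a sketch: $A$ is built as $DS$, then the two conditions are spot-checked):

```python
import numpy as np

rng = np.random.default_rng(3)
S = rng.standard_normal((3, 3))
S = S + S.T                          # symmetric S
D = np.diag([2., -1., 0.5])          # invertible diagonal D

A = D @ S                            # symmetrizable by construction

# Condition 1: zero entries occur in symmetric positions.
print(np.array_equal(A == 0, A.T == 0))   # True
# Condition 2 for the 3-cycle (i1, i2, i3) = (0, 1, 2):
lhs = A[0, 1] * A[1, 2] * A[2, 0]
rhs = A[1, 0] * A[2, 1] * A[0, 2]
print(np.isclose(lhs, rhs))               # True
```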

See also

Other types of symmetry or pattern in square matrices have special names; see for example:

  • Centrosymmetric matrix
  • Circulant matrix
  • Covariance matrix
  • Coxeter matrix
  • GCD matrix
  • Hankel matrix
  • Hilbert matrix
  • Persymmetric matrix
  • Sylvester's law of inertia
  • Toeplitz matrix
  • Transpositions matrix
  • See also symmetry in mathematics.

Notes

  1. ^ Rojo García, Jesús (1986). Álgebra lineal [Linear algebra] (in Spanish) (2nd ed.). Editorial AC. ISBN 84-7288-120-2.
  2. ^ Bellman, Richard (1997). Introduction to Matrix Analysis (2nd ed.). SIAM. ISBN 08-9871-399-4.
  3. ^ Horn, R. A.; Johnson, C. R. (2013). Matrix Analysis (2nd ed.). Cambridge University Press. pp. 263, 278. MR 2978290.
  4. ^ See:
  5. ^ Bosch, A. J. (1986). "The factorization of a square matrix into two symmetric matrices". American Mathematical Monthly. 93 (6): 462–464. doi:10.2307/2323471. JSTOR 2323471.
  6. ^ Golub, G. H.; Van Loan, C. F. (1996). Matrix Computations. Baltimore, London: The Johns Hopkins University Press.
  7. ^ Dieudonné, Jean A. (1969). Foundations of Modern Analysis (enlarged and corrected printing). Academic Press. Theorem (8.12.2), p. 180. ISBN 978-1443724265.