


How To Find Number Of Linearly Independent Eigenvectors

Eigenvectors corresponding to distinct eigenvalues are linearly independent. As a consequence, if all the eigenvalues of a matrix are distinct, then their corresponding eigenvectors span the space of column vectors to which the columns of the matrix belong.
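This fact is easy to check numerically: if a matrix has all distinct eigenvalues, the matrix whose columns are its eigenvectors has full rank. A minimal sketch with NumPy, using a hypothetical triangular matrix chosen only for illustration (the matrices in this lecture are not reproduced here):

```python
import numpy as np

# Hypothetical 3x3 matrix with three distinct eigenvalues (2, 3 and 5 on the diagonal).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns are eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

# The eigenvalues are distinct ...
assert len(set(np.round(eigenvalues, 8))) == 3
# ... so the eigenvector matrix has full rank: the eigenvectors are linearly independent.
assert np.linalg.matrix_rank(eigenvectors) == 3
```

The rank check is the numerical counterpart of linear independence: a set of K vectors in a K-dimensional space is linearly independent exactly when the matrix they form has rank K.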

If there are repeated eigenvalues, but they are not defective (i.e., their algebraic multiplicity equals their geometric multiplicity), the same spanning result holds.

However, if there is at least one defective repeated eigenvalue, then the spanning fails.

These results will be formally stated, proved and illustrated in detail in the remainder of this lecture.

Table of Contents


  1. Independence of eigenvectors corresponding to different eigenvalues

  2. Independence of eigenvectors when no repeated eigenvalue is defective

  3. Defective matrices do not have a complete basis of eigenvectors

  4. Solved exercises

    1. Exercise 1

    2. Exercise 2

We now deal with distinct eigenvalues.

Proposition Let A be a $K\times K$ matrix. Let [eq1] ( $M\leq K$ ) be eigenvalues of A and choose [eq2] associated eigenvectors. If there are no repeated eigenvalues (i.e., [eq3] are distinct), then the eigenvectors [eq4] are linearly independent.

Proof

The proof is by contradiction. Suppose that [eq4] are not linearly independent. Denote by I the largest number of linearly independent eigenvectors. If necessary, re-number eigenvalues and eigenvectors, so that [eq6] are linearly independent. Note that $I\geq 1$ because a single vector trivially forms by itself a set of linearly independent vectors. Moreover, $I<M$ because otherwise [eq7] would be linearly independent, a contradiction. Now, $x_{I+1}$ can be written as a linear combination of [eq8] : [eq9] where [eq10] are scalars and they are not all zero (otherwise $x_{I+1}$ would be zero and hence not an eigenvector). By the definition of eigenvalues and eigenvectors we have that [eq11] and that [eq12]
By subtracting the second equation from the first, we obtain [eq13] Since [eq14] are distinct, [eq15] for $i=1,\ldots ,I$ . Furthermore, [eq16] are linearly independent, so that their only linear combination giving the zero vector has all zero coefficients. As a consequence, it must be that [eq17] . But we have already explained that these coefficients cannot all be zero. Thus, we have arrived at a contradiction, starting from the initial hypothesis that [eq4] are not linearly independent. Therefore, [eq4] must be linearly independent.

When $M=K$ in the proposition above, then there are K distinct eigenvalues and K linearly independent eigenvectors, which span (i.e., they form a basis for) the space of K-dimensional column vectors (to which the columns of A belong).

Example Define the $3\times 3$ matrix [eq20] It has three eigenvalues [eq21] with associated eigenvectors [eq22] which you can verify by checking that [eq23] (for $k=1,\ldots ,3$ ). The three eigenvalues $\lambda _{1}$ , $\lambda _{2}$ and $\lambda _{3}$ are distinct (no two of them are equal to each other). Therefore, the three corresponding eigenvectors $x_{1}$ , $x_{2}$ and $x_{3}$ are linearly independent, which you can also verify by checking that none of them can be written as a linear combination of the other two. These three eigenvectors form a basis for the space of all $3\times 1$ vectors, that is, a vector [eq24] can be written as a linear combination of the eigenvectors $x_{1}$ , $x_{2}$ and $x_{3}$ for any choice of the entries $\alpha$ , $\beta$ and $\gamma$ .
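Writing an arbitrary vector as a combination of a basis of eigenvectors amounts to solving a linear system whose coefficient matrix has the eigenvectors as columns. A sketch with a hypothetical eigenvector basis (the example's actual eigenvectors are not reproduced here):

```python
import numpy as np

# Hypothetical eigenvector basis: columns stand in for x1, x2, x3 above.
V = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

v = np.array([4.0, -1.0, 7.0])     # an arbitrary 3x1 vector

# Since V has full rank, V c = v always has a unique solution c.
coeffs = np.linalg.solve(V, v)

# v is recovered as the linear combination of eigenvectors with those coefficients.
assert np.allclose(V @ coeffs, v)
```

The existence of a unique solution for every right-hand side is exactly what it means for the three columns of V to form a basis.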

We now deal with the case in which some of the eigenvalues are repeated.

Proposition Let A be a $K\times K$ matrix. If none of its repeated eigenvalues is defective (i.e., the geometric multiplicity of each repeated eigenvalue equals its algebraic multiplicity), then there exists a set of K linearly independent eigenvectors of A.

Proof

Denote by [eq25] the K eigenvalues of A and by [eq26] a list of corresponding eigenvectors chosen in such a way that $x_{j}$ is linearly independent of $x_{k}$ whenever there is a repeated eigenvalue [eq27] . The choice of eigenvectors can be performed in this manner because the repeated eigenvalues are not defective by assumption. Now, by contradiction, suppose that [eq28] are not linearly independent. Then, there exist scalars [eq29] not all equal to zero such that [eq30] Denote by $M$ the number of distinct eigenvalues. Without loss of generality (i.e., after re-numbering the eigenvalues if necessary), we can assume that the first $M$ eigenvalues are distinct. For $m=1,\ldots ,M$ , define the sets of indices corresponding to groups of equal eigenvalues [eq31] and the vectors [eq32] Then, equation (1) becomes [eq33] Denote by $J$ the following set of indices: [eq34] The set $J$ must be non-empty because [eq35] are not all equal to zero and the previous choice of linearly independent eigenvectors corresponding to a repeated eigenvalue implies that the vectors $u_{m}$ in equation (2) cannot be made equal to zero by appropriately choosing positive coefficients $c_{k}$ . Then, we have [eq36] But, for any $j\in J$ , $u_{j}$ is an eigenvector (because eigenspaces are closed with respect to linear combinations). This means that a linear combination (with coefficients all equal to 1) of eigenvectors corresponding to distinct eigenvalues is equal to 0. Hence, those eigenvectors are linearly dependent. But this contradicts the fact, proved previously, that eigenvectors corresponding to distinct eigenvalues are linearly independent. Thus, we have arrived at a contradiction. Hence, the initial claim that [eq37] are not linearly independent must be wrong. As a consequence, [eq38] are linearly independent.

Thus, when there are repeated eigenvalues, but none of them is defective, we can choose K linearly independent eigenvectors, which span the space of K-dimensional column vectors (to which the columns of A belong).

Example Define the $3\times 3$ matrix [eq39] It has three eigenvalues [eq40] with associated eigenvectors [eq41] which you can verify by checking that [eq23] (for $k=1,\ldots ,3$ ). The three eigenvalues are not distinct because there is a repeated eigenvalue [eq43] whose algebraic multiplicity equals two. However, the two eigenvectors $x_{1}$ and $x_{2}$ associated to the repeated eigenvalue are linearly independent because they are not a multiple of each other. As a result, also the geometric multiplicity equals two. Thus, the repeated eigenvalue is not defective. Therefore, the three eigenvectors $x_{1}$ , $x_{2}$ and $x_{3}$ are linearly independent, which you can also verify by checking that none of them can be written as a linear combination of the other two. These three eigenvectors form a basis for the space of all $3\times 1$ vectors.
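The multiplicity check in this example can be carried out numerically: the geometric multiplicity of an eigenvalue $\lambda$ is the dimension of the null space of $A-\lambda I$, i.e., $K$ minus the rank of $A-\lambda I$. A sketch with a hypothetical diagonal matrix that, like the example, has a repeated but non-defective eigenvalue:

```python
import numpy as np

# Hypothetical 3x3 matrix: eigenvalue 2 is repeated (algebraic multiplicity 2).
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])

# Geometric multiplicity of 2 = dim of the null space of (A - 2I) = 3 - rank(A - 2I).
geo_mult = 3 - np.linalg.matrix_rank(A - 2 * np.eye(3))
assert geo_mult == 2   # equals the algebraic multiplicity: not defective

# Hence the eigenvectors still form a basis of the space of 3x1 vectors.
_, eigenvectors = np.linalg.eig(A)
assert np.linalg.matrix_rank(eigenvectors) == 3
```

The same rank computation applied to any matrix and eigenvalue tells you immediately whether that eigenvalue is defective.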

The last proposition concerns defective matrices, that is, matrices that have at least one defective eigenvalue.

Proposition Let A be a $K\times K$ matrix. If A has at least one defective eigenvalue (whose geometric multiplicity is strictly less than its algebraic multiplicity), then there does not exist a set of K linearly independent eigenvectors of A.

Proof

Thus, in the unlucky case in which A is a defective matrix, there is no way to form a basis of eigenvectors of A for the space of K-dimensional column vectors to which the columns of A belong.

Example Consider the $2\times 2$ matrix [eq44] The characteristic polynomial is [eq45] and its roots are [eq46] Thus, there is a repeated eigenvalue ( [eq47] ) with algebraic multiplicity equal to 2. Its associated eigenvectors [eq48] solve the equation [eq49] or [eq50] which is satisfied for $x_{11}=0$ and any value of $x_{21}$ . Hence, the eigenspace of $\lambda _{1}$ is the linear space that contains all vectors $x_{1}$ of the form [eq51] where $\alpha$ can be any scalar. In other words, the eigenspace of $\lambda _{1}$ is generated by a single vector [eq52] Hence, it has dimension 1 and the geometric multiplicity of $\lambda _{1}$ is 1, less than its algebraic multiplicity, which is equal to 2. This implies that there is no way of forming a basis of eigenvectors of A for the space of two-dimensional column vectors. For instance, the vector [eq53] cannot be written as a multiple of the eigenvector $x_{1}$ . Thus, there is at least one two-dimensional vector that cannot be written as a linear combination of the eigenvectors of A.
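A defective matrix with this structure can be checked numerically in the same way. The matrix below is a hypothetical stand-in (the example's entries are not reproduced here), chosen so that, as in the example, its only eigenvalue is repeated and its eigenspace is spanned by the single vector $(0,1)$:

```python
import numpy as np

# Hypothetical defective 2x2 matrix: the only eigenvalue is 2, with
# algebraic multiplicity 2 but a one-dimensional eigenspace.
A = np.array([[2.0, 0.0],
              [1.0, 2.0]])

# Geometric multiplicity of 2 = 2 - rank(A - 2I).
geo_mult = 2 - np.linalg.matrix_rank(A - 2 * np.eye(2))
assert geo_mult == 1   # strictly less than the algebraic multiplicity 2: defective

# Every eigenvector solves (A - 2I) x = 0, so it must be a multiple of (0, 1):
x = np.array([0.0, 1.0])
assert np.allclose(A @ x, 2 * x)
```

Since every eigenvector is a multiple of $(0,1)$, no vector with a nonzero first entry can be written as a linear combination of eigenvectors, which is exactly the spanning failure described above.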

Below you can find some exercises with explained solutions.

Exercise 1

Consider the $2\times 2$ matrix [eq54]

Try to find a set of eigenvectors of A that spans the set of all $2\times 1$ vectors.

Solution

The characteristic polynomial is [eq55] and its roots are [eq56] Since there are two distinct eigenvalues, we already know that we will be able to find two linearly independent eigenvectors. Let's find them. The eigenvector [eq57] associated to $\lambda _{1}$ solves the equation [eq58] or [eq59] which is satisfied for any couple of values $x_{11},x_{21}$ such that [eq60] or [eq61] For example, we can choose $x_{21}=2$ , so that $x_{11}=-3$ and the eigenvector associated to $\lambda _{1}$ is [eq62] The eigenvector [eq63] associated to $\lambda _{2}$ solves the equation [eq64] or [eq65] which is satisfied for any couple of values $x_{12},x_{22}$ such that [eq66] or [eq67] For instance, we can choose $x_{12}=1$ , so that $x_{22}=1$ and the eigenvector associated to $\lambda _{2}$ is [eq68] Thus, $x_{1}$ and $x_{2}$ form the basis of eigenvectors we were searching for.
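The defining relation $Ax=\lambda x$ makes such a solution easy to verify by direct multiplication. The matrix and eigenvalues below are hypothetical choices consistent with the eigenvectors found in the solution, $(-3,2)$ and $(1,1)$, since the exercise's actual entries are not reproduced here:

```python
import numpy as np

# Hypothetical 2x2 matrix, assumed for illustration; suppose its
# eigenvalues are lambda_1 = -1 and lambda_2 = 4.
A = np.array([[1.0, 3.0],
              [2.0, 2.0]])

x1 = np.array([-3.0, 2.0])   # candidate eigenvector for lambda_1 = -1
x2 = np.array([1.0, 1.0])    # candidate eigenvector for lambda_2 = 4

# Verify the eigenvalue equations A x = lambda x directly.
assert np.allclose(A @ x1, -1 * x1)
assert np.allclose(A @ x2, 4 * x2)

# The two eigenvectors are linearly independent, so they form a basis of R^2.
assert np.linalg.matrix_rank(np.column_stack([x1, x2])) == 2
```

Checking $Ax=\lambda x$ entry by entry is always the cheapest way to confirm an eigenvector computed by hand.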

Exercise 2

Define [eq69]

Try to find a set of eigenvectors of A that spans the set of all column vectors having the same dimension as the columns of A.

Solution

The characteristic polynomial is [eq70] where in the step marked $A$ we have used the Laplace expansion along the third row. The roots of the polynomial are [eq71] Hence, [eq72] is a repeated eigenvalue with algebraic multiplicity equal to 2. Its associated eigenvectors [eq73] solve the equation [eq74] or [eq75] This system of equations is satisfied for any value of $x_{22}$ and $x_{12}=x_{32}=0$ . As a consequence, the eigenspace of $\lambda _{2}$ contains all the vectors $x_{2}$ that can be written as [eq76] where the scalar $x_{22}$ can be arbitrarily chosen. Thus, the eigenspace of $\lambda _{2}$ is generated by a single vector [eq77] Hence, the eigenspace has dimension 1 and the geometric multiplicity of $\lambda _{2}$ is 1, less than its algebraic multiplicity, which is equal to 2. It follows that the matrix A is defective and we cannot construct a basis of eigenvectors of A that spans the space of $3\times 1$ vectors.
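The defect found in this exercise can be reproduced numerically with the rank formula for geometric multiplicity. The matrix below is a hypothetical stand-in with the same structure as the solution describes: the eigenvalue 2 is repeated with algebraic multiplicity 2, and its eigenspace is spanned by the single vector $e_2=(0,1,0)$ (the exercise's actual entries are not reproduced here):

```python
import numpy as np

# Hypothetical defective 3x3 matrix, assumed for illustration.
A = np.array([[2.0, 0.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

# Geometric multiplicity of the repeated eigenvalue 2 = 3 - rank(A - 2I).
geo_mult = 3 - np.linalg.matrix_rank(A - 2 * np.eye(3))
assert geo_mult == 1   # < algebraic multiplicity 2, so A is defective

# The eigenspace of 2 is generated by the single vector e2 = (0, 1, 0):
e2 = np.array([0.0, 1.0, 0.0])
assert np.allclose(A @ e2, 2 * e2)
```

Because the two eigenvalues together contribute only 1 + 1 = 2 independent eigenvectors, no set of eigenvectors of this matrix can span the space of $3\times 1$ vectors.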

Please cite as:

Taboga, Marco (2021). "Linear independence of eigenvectors", Lectures on matrix algebra. https://www.statlect.com/matrix-algebra/linear-independence-of-eigenvectors.

Source: https://www.statlect.com/matrix-algebra/linear-independence-of-eigenvectors
