## Definition

The word “eigen” comes from German, where it means “own” or “characteristic.” The name is apt: the eigenvectors of a matrix capture its characteristic directions, and they describe the trend that vectors acted on by the matrix ultimately follow.^{1}

## Who

If you learned about eigenvalues and eigenvectors in a linear algebra course, the presentation may have been very dry and mathematical, leaving you with little sense of how useful eigenvectors and eigenvalues are. You are not alone. Here is what some have said about learning linear algebra:

I am teaching Mathematics in an engineering college. We just teach students how to solve the problems but not why it is essential and where it is applied in the real life. But this video is just more than what I expect to teach to my students. Thank you for this wonderful video and visual demonstration on eigenvectors.^{3}

In my freshman year of college, Linear Algebra was part of the first topics taken in Engineering Mathematics. I always skipped the section of Eigenvectors and Eigenvalues, due to poor understanding and didn’t see much use of it. In my recent research, I’ve come to see the practical application of them.^{4}

I wish I had this in college. I struggled with this subject so much.^{5}

## What

Eigenvectors and eigenvalues show us the general trend of how a system changes. The eigenvalues give the magnitude of the system’s rate of change, and the eigenvectors give the direction in which that change takes place.^{1}

Eigenvectors are the vectors that, when multiplied by a matrix (a linear transformation), result in another vector with the same direction, scaled forward or in reverse by a scalar multiple; that scalar multiple is the eigenvalue. In simpler words, the eigenvalue can be seen as the scaling factor for the eigenvector. Here is the formula, called the eigenvalue equation:

Ax = λx

In the above equation, the matrix A acts on the vector x, and the outcome is another vector Ax with the same direction as the original x but stretched or shrunk, forward or in reverse, by the scalar multiple λ. The vector x is called an eigenvector of A, and λ is called its eigenvalue. Pictorially, when A acts on a typical (non-eigen) vector x, the new vector Ax points in a different direction than x; an eigenvector is exactly a vector for which this does not happen.^{2}
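The eigenvalue equation is easy to check numerically. Below is a minimal NumPy sketch, with an arbitrary 2×2 matrix chosen only for illustration:

```python
import numpy as np

# A hypothetical symmetric 2x2 matrix, used only for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

# For each eigenpair, A @ x equals lambda * x: the matrix only
# scales the eigenvector, it never rotates it.
for lam, x in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ x, lam * x)

print(eigenvalues)  # contains 3.0 and 1.0 (order may vary)
```

For every other vector, A @ x and x point in different directions; only the eigenvectors pass the `allclose` check above.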

**What are Eigenvectors?**

We know that vectors have both magnitude and direction when plotted on an XY (2-dimensional) plane. For the purposes of this article, a linear transformation of a vector is its multiplication by a matrix, which in general changes both the vector’s magnitude and its direction.

**What are Eigenvalues?**

They’re simply the constants by which the eigenvectors are stretched or shrunk along their span when transformed linearly. Think of eigenvectors and eigenvalues as a summary of a large matrix.^{4}

Generally speaking, eigenvalues and eigenvectors allow us to “reduce” a linear operation to separate, simpler problems. For example, if a stress is applied to a “plastic” solid, the deformation can be dissected into “principal directions”: those directions in which the deformation is greatest. Vectors in the principal directions are the eigenvectors, and the percentage deformation in each principal direction is the corresponding eigenvalue.^{6}

To explain eigenvalues, we first explain eigenvectors. Almost all vectors change direction, when they are multiplied by A. Certain exceptional vectors x are in the same direction as Ax. Those are the “eigenvectors”. Multiply an eigenvector by A, and the vector Ax is a number λ times the original x.

**The basic equation is Ax = λx. The number λ is an eigenvalue of A.**

The eigenvalue λ tells whether the special vector x is stretched or shrunk or reversed or left unchanged—when it is multiplied by A. We may find λ = 2 or 1/2 or −1 or 1. The eigenvalue λ could be zero! Then Ax = 0x means that this eigenvector x is in the nullspace.^{7}
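These cases, including λ = 0, are easy to verify numerically. Here is a minimal NumPy sketch with a matrix deliberately constructed to be singular, so that one eigenvalue is zero and its eigenvector lies in the nullspace:

```python
import numpy as np

# A hypothetical singular matrix: the second row is twice the
# first, so the determinant is 0 and one eigenvalue must be 0.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# The eigenvector paired with lambda = 0 satisfies Ax = 0x = 0,
# i.e. it lies in the nullspace of A.
zero_index = int(np.argmin(np.abs(eigenvalues)))
x = eigenvectors[:, zero_index]
assert np.allclose(A @ x, np.zeros(2))
```

The other eigenvalue of this matrix is 5, so its eigenvector is stretched rather than annihilated.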

The **eigenvector** and **eigenvalue** represent the “axes” of the transformation.^{8}

## Why

We often want to transform our data to reduce the number of features while preserving as much variance (i.e., the differences among our samples) as we can. Often, you’ll hear folks refer to principal component analysis (PCA) and singular value decomposition (SVD), but we can’t appreciate how these methods work without first understanding what eigenvectors and eigenvalues are.^{9}

The eigenvectors of a linear transform are those vectors that remain pointed in the same directions. For these vectors, the effect of the transform matrix is just scalar multiplication. For each eigenvector, the eigenvalue is the scalar that the vector is scaled by under the transform.

Peter Barrett Bryan

See Theoretical Knowledge Vs Practical Application.
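To make the PCA connection concrete, here is a minimal NumPy sketch (the data and its shape are made up for illustration): the eigenvectors of the covariance matrix are the principal directions, each eigenvalue is the variance along its direction, and projecting onto the top eigenvector reduces two features to one.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical two-feature data, stretched so that most of the
# variance lies along a single direction.
data = rng.normal(size=(200, 2)) @ np.array([[3.0, 1.0],
                                             [0.0, 0.5]])

# The eigenvectors of the covariance matrix are the principal
# directions; each eigenvalue is the variance along its direction.
cov = np.cov(data, rowvar=False)
eigenvalues, eigenvectors = np.linalg.eigh(cov)  # eigh: cov is symmetric

# Keep only the direction with the largest eigenvalue: two
# features are reduced to one while preserving the most variance.
top_direction = eigenvectors[:, np.argmax(eigenvalues)]
reduced = data @ top_direction
print(reduced.shape)  # (200,)
```

The variance of the reduced data equals the largest eigenvalue, which is exactly the “preserve as much variance as we can” guarantee described above.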

## How

I don’t show you how to compute eigenvalues and eigenvectors. Many of the **References**, **Additional Reading** websites, and **YouTube** videos will assist you with that. As some professors say: “It is intuitively obvious to even the most casual observer.”

**Application**

Principal Component Analysis 4 Dummies: Eigenvectors, Eigenvalues and Dimension Reduction. 2013. https://georgemdallas.wordpress.com/2013/10/30/principal-component-analysis-4-dummies-eigenvectors-eigenvalues-and-dimension-reduction/.

**References**

^{1} Eigenvectors and Eigenvalues: A deeper understanding. 2021. https://medium.com/analytics-vidhya/eigenvectors-and-eigenvalues-a-deeper-understanding-c715f8ded4c7.

^{2} Kumar, Ajitesh. 2020. “Why & When To Use Eigenvalues & Eigenvectors? – Data Analytics”. *Data Analytics*. https://vitalflux.com/why-when-use-eigenvalue-eigenvector/.

^{3} Real life example of Eigen values and Eigen vectors. 2021. *youtube.com*. https://www.youtube.com/watch?v=R13Cwgmpuxc.

^{4} “Understanding the Role of Eigenvectors and Eigenvalues in PCA Dimensionality Reduction”. 2019. *Medium*. https://medium.com/@dareyadewumi650/understanding-the-role-of-eigenvectors-and-eigenvalues-in-pca-dimensionality-reduction-10186dad0c5c.

^{5} Eigenvectors and eigenvalues | Chapter 14, Essence of linear algebra. 2021. *youtube.com*. https://www.youtube.com/watch?v=PFDu9oVAE-g.

^{6} “Practical Uses for Eigenvalues”. 2009. *Physics Forums | Science Articles, Homework Help, Discussion*. https://www.physicsforums.com/threads/practical-uses-for-eigenvalues.312625/.

^{7} Strang, Gilbert. *Introduction to Linear Algebra*, 5th Edition. 2021. *math.mit.edu*. http://math.mit.edu/~gs/linearalgebra/.

^{8} “An Intuitive Guide To Linear Algebra – BetterExplained”. 2021. *betterexplained.Com*. https://betterexplained.com/articles/linear-algebra-guide/.

^{9} Bryan, Peter Barrett. “Eigen Intuitions: Understanding Eigenvectors And Eigenvalues”. 2022. *Medium*. https://towardsdatascience.com/eigen-intuitions-understanding-eigenvectors-and-eigenvalues-630e9ef1f719.

**Additional Reading**

Ahmed, Mansoor. 2021. “Eigen Decomposition”. *Technologies In Industry 4.0*. https://www.technologiesinindustry4.com/2021/12/eigen-decomposition.html.

Eigendecomposition is very important in linear algebra. It is a factorization of a matrix into a canonical form, representing the matrix in terms of its eigenvalues and eigenvectors. Only diagonalizable matrices may be factorized in this manner. The decomposition is called spectral decomposition when the matrix being factorized is normal or real symmetric.
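As a quick illustration of that factorization, here is a minimal NumPy sketch with an arbitrary diagonalizable matrix chosen only for this example:

```python
import numpy as np

# A hypothetical diagonalizable matrix; V holds the eigenvectors
# as columns and w holds the eigenvalues.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
w, V = np.linalg.eig(A)

# Eigendecomposition: A = V @ diag(w) @ inv(V).
reconstructed = V @ np.diag(w) @ np.linalg.inv(V)
assert np.allclose(reconstructed, A)
```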

“An Intuitive Guide to Linear Algebra”. 2021. *betterexplained.com*. https://betterexplained.com/articles/linear-algebra-guide/.

“Linear Algebra”. 2021. *MIT OpenCourseWare*. https://ocw.mit.edu/courses/mathematics/18-06-linear-algebra-spring-2010/index.htm.

This is a basic subject on matrix theory and linear algebra. Emphasis is given to topics that will be useful in other disciplines, including systems of equations, vector spaces, determinants, eigenvalues, similarity, and positive definite matrices.

“Eigenvectors And Eigenvalues Explained Visually”. 2021. *Explained Visually*. https://setosa.io/ev/eigenvectors-and-eigenvalues/.

Eigenvalues/vectors are instrumental to understanding electrical circuits, mechanical systems, ecology and even Google’s PageRank algorithm. Let’s see if visualization can make these ideas more intuitive.

Cheever, Erik (Swarthmore College). 2021. “Eigenvalues and Eigenvectors”. *lpsa.swarthmore.edu*. https://lpsa.swarthmore.edu/MtrxVibe/EigMat/MatrixEigen.html.

“Introduction To Eigenvalues And Eigenvectors (Video) | Khan Academy”. 2021. *Khan Academy*. https://www.khanacademy.org/math/linear-algebra/alternate-bases/eigen-everything/v/linear-algebra-introduction-to-eigenvalues-and-eigenvectors.

“Lecture 21: Eigenvalues And Eigenvectors | Video Lectures | Linear Algebra | Mathematics | MIT Opencourseware”. 2021. *ocw.mit.edu*. https://ocw.mit.edu/courses/mathematics/18-06-linear-algebra-spring-2010/video-lectures/lecture-21-eigenvalues-and-eigenvectors/.

If the product *Ax* points in the same direction as the vector *x*, we say that *x* is an *eigenvector* of *A*. Eigenvalues and eigenvectors describe what happens when a matrix is multiplied by a vector. In this session we learn how to find the eigenvalues and eigenvectors of a matrix.

“Eigenvalues And Eigenvectors”. 2021. *Medium*. https://towardsdatascience.com/eigenvalues-and-eigenvectors-378e851bf372.

I have learned about eigenvalues and eigenvectors in University in a linear algebra course. It was very dry and mathematical, so I did not get, what it is all about. But I want to present this topic to you in a more intuitive way and I will use many animations to illustrate it.

Free Linear Algebra textbook. 2021. https://joshua.smcvt.edu/linearalgebra/.

“Eigenvector And Eigenvalue”. 2021. *mathsisfun.com*. https://www.mathsisfun.com/algebra/eigenvalue.html.

Sterling, Mary. *Linear Algebra For Dummies*. Wiley Publishing, Inc., 2009.

“Visualizing Eigenvalues And Eigenvectors”. 2019. *Medium*. https://towardsdatascience.com/visualizing-eigenvalues-and-eigenvectors-e2f9e3ac58d7.

Eigenvalues and Eigenvectors are a very important concept in Linear Algebra and Machine Learning in general. In my previous article, I’ve been introducing those concepts in terms of Principal Components Analysis, providing practical examples. In this article, I’m going to dwell more on the maths behind those concepts, providing a geometric interpretation of what I’m about to explain.

“Eigen-WHAT?” 2021. *Medium*. https://medium.com/@dpedrazatrivino2/eigen-what-583125ddc984.

If you are reading this, you probably had a linear algebra class and remember your professor mentioning two strange words: *Eigenvector* and *Eigenvalue*. Maybe you remember the equation. However, WHAT ARE THEY?

Ming, Albert. “Part 1: Matrix Definitions”. 2022. *Medium*. https://albertming88.medium.com/part-1-matrix-definitions-7bb61c846d95.

Ming, Albert. “Part 2: Vectors”. 2022. *Medium*. https://albertming88.medium.com/part-2-vectors-bacbca5ad20.

Ming, Albert. “Part 3: Simple Matrices And Transformations”. 2022. *Medium*. https://albertming88.medium.com/part-3-simple-matrices-and-transformations-f90f43c99992.

Ming, Albert. “Part 4: Determinants”. 2022. *Medium*. https://albertming88.medium.com/part-4-determinants-b5a9ed88e40c.

Ming, Albert. “Part 5: Revisiting Systems Of Linear Equations”. 2022. *Medium*. https://albertming88.medium.com/part-5-revisiting-systems-of-linear-equations-ebe2beef6cd3.

Ming, Albert. “Part 6: Eigenvectors And Eigenvalues”. 2022. *Medium*. https://albertming88.medium.com/part-6-eigenvectors-and-eigenvalues-d94f507c4962.