[HN Gopher] What is an eigenvalue?
       ___________________________________________________________________
        
       What is an eigenvalue?
        
       Author : RafelMri
       Score  : 164 points
       Date   : 2022-11-08 10:29 UTC (12 hours ago)
        
 (HTM) web link (nhigham.com)
 (TXT) w3m dump (nhigham.com)
        
       | raydiatian wrote:
        | A far easier-to-digest primer on eigenvalues is available from
        | 3Blue1Brown [+]. His presentation format is undeniably
        | approachable, so much so that I think you could probably use it
        | to teach linear algebra and eigenvectors to 9-year-olds.
       | 
       | [+] https://youtu.be/PFDu9oVAE-g
        
       | mpaepper wrote:
       | I explained it in a more coding oriented style here:
       | https://www.paepper.com/blog/posts/eigenvectors_eigenvalues_...
        
       | [deleted]
        
       | hdjjhhvvhga wrote:
       | For a completely different approach, see this answer:
       | 
       | https://www.reddit.com/r/explainlikeimfive/comments/1avwm7/c...
        
       | oifjsidjf wrote:
       | I was blown away in my Digital Signal Processing (DSP) class that
       | eigen "values" exist for certain systems in the form of "waves".
       | 
        | Basically you put a wave made from multiple sine and/or cosine
        | waves through some function f(x) and the output is STILL a
        | wave, though its amplitude and phase might change.
        | 
        | Technically, if I remember correctly, this applies to all
        | complex exponentials, since those can be rewritten in the form
        | e^(ix) = cos(x) + i*sin(x).
        | 
        | This formula also beautifully shows how rotations and the
        | complex exponentials are connected.
        | 
        | So basically you don't just have eigenvalues and eigenvectors:
        | you also have eigenFUNCTIONS (sine and cosine above are the
        | eigenfunctions of f(x)).
        | 
        | DSP basically revolves around functions that don't "corrupt"
        | wave-like inputs (wave in -> wave out).
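A minimal numeric sketch of this wave-in/wave-out property. Everything here (the FIR filter taps and the test frequency) is an assumed toy example, not from the comment: a complex exponential passed through a linear time-invariant filter comes out as the same exponential, scaled by the filter's frequency response, which is exactly the eigenvalue.

```python
import numpy as np

# Sketch: pass a complex exponential through an LTI (FIR) filter and
# check that, past the startup transient, the output is the same signal
# scaled by a complex constant -- i.e. e^{i*w*n} is an eigenfunction.
h = np.array([0.25, 0.5, 0.25])        # assumed toy FIR impulse response
w = 0.3                                 # angular frequency (rad/sample)
n = np.arange(200)
x = np.exp(1j * w * n)                  # complex exponential input

y = np.convolve(x, h)[: len(n)]         # filter the signal

# Frequency response H(w) = sum_k h[k] e^{-i w k}; this is the eigenvalue.
H = np.sum(h * np.exp(-1j * w * np.arange(len(h))))

# After the filter's startup transient, y[n] = H * x[n] exactly.
assert np.allclose(y[len(h):], H * x[len(h):])
```

The magnitude of `H` is the gain and its argument the phase shift, matching the "amplitude and phase might change" observation.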
        
         | WastingMyTime89 wrote:
         | I'm not sure I understand but it seems to me you are just
         | talking about eigenvalues in C.
         | 
         | That's interesting but not particularly remarkable because
         | eigenvalues are defined for linear transformations of any
         | vector space over a field.
        
         | mrfox321 wrote:
         | https://en.wikipedia.org/wiki/Spectrum_(functional_analysis)
        
         | bobbylarrybobby wrote:
         | If you're into eigenfunctions, pick up any textbook on quantum
         | mechanics. The hamiltonian is a linear operator whose
         | eigenfunctions are the stationary states of the system
         | ("stationary" because an isolated system in a stationary state
         | will never leave that state) and whose eigenvalues are the
         | observable values of the energy of the system. In general,
         | there is a correspondence between observable quantities and
         | Hermitian operators on wavefunctions: "measurement" is the
         | application of a Hermitian operator to the wavefunction, and
         | the values you may observe are the eigenvalues of the operator.
         | So, for instance, energy is quantized in some systems because
         | their hamiltonian has discrete eigenvalues.
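A small numeric illustration of that last point, under assumed toy units (hbar = m = 1): discretizing the particle-in-a-box Hamiltonian gives a matrix whose eigenvalues approximate the quantized energy levels E_n = n^2 pi^2 / 2.

```python
import numpy as np

# Sketch (assumed units hbar = m = 1): finite-difference version of the
# 1-D particle-in-a-box Hamiltonian H = -(1/2) d^2/dx^2 on [0, 1].
# Its eigenvalues approximate the discrete energies E_n = n^2 * pi^2 / 2
# and its eigenvectors approximate the stationary states sin(n*pi*x).
N = 500
dx = 1.0 / (N + 1)
main = np.full(N, 1.0 / dx**2)          # -(1/2) * (-2/dx^2) diagonal
off = np.full(N - 1, -0.5 / dx**2)      # -(1/2) * (1/dx^2) off-diagonal

H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
energies = np.linalg.eigvalsh(H)        # sorted ascending

exact = np.array([n**2 * np.pi**2 / 2 for n in (1, 2, 3)])
assert np.allclose(energies[:3], exact, rtol=1e-4)
```

The discreteness of `energies` is the matrix analogue of "energy is quantized because the Hamiltonian has discrete eigenvalues."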
        
         | sverona wrote:
         | In fact functions are just infinite-dimensional vectors. Almost
         | all of the theory goes through unchanged. This is the basic
         | idea of functional analysis.
        
           | constantcrying wrote:
           | Notably, one _very_ important part which does not go through
           | is that for mappings between infinite dimensional spaces
           | linearity does not imply continuity. (E.g. a series of
           | functions bounded by a constant can have arbitrarily large
           | derivatives)
           | 
           | A large part of functional analysis is dealing with that fact
           | and its implication for PDEs.
        
           | OJFord wrote:
           | Or rather size-of-their-domain-dimensional?
        
             | ravi-delia wrote:
             | If you want, but you can do better. I believe, for
             | instance, that at least continuous functions on the reals
             | have a countable basis. Might even be as strong as
             | measurable, not sure about that. That's how, for instance,
              | Fourier transforms work.
        
               | constantcrying wrote:
                | The Fourier transform essentially means that every L^2
                | function (one whose square has a finite integral) is
                | completely "described" by an l^2 sequence (a sequence
                | of numbers whose sum of squares converges), which is
                | about the greatest piece of magic in mathematics. One
                | _very_ important point here is that the term "function"
                | is somewhat of a lie (since the result couldn't be true
                | if it weren't).
                | 
                | The result for measurable _functions_ (not almost
                | functions) shouldn't be true (I think). I am not even
                | sure it is true for L^1 almost functions.
        
         | [deleted]
        
         | c-baby wrote:
         | Maybe I'm missing what's interesting about this, but a function
         | like f(z) = 5z + 2 would output a wave with changed amplitude
         | and phase when z = sin(x). That doesn't seem that interesting
         | to me, so f(z) must have some other interesting properties?
        
           | oifjsidjf wrote:
            | Honestly I forgot the details, but basically the ENTIRE
            | field of DSP stands on this fact.
            | 
            | Basically there exist some functions into which you can
            | feed sound waves and the output is guaranteed to still be a
            | sound wave.
            | 
            | If you fed in a sound wave and the function corrupted it,
            | you would not be able to do any digital signal processing,
            | since the output must be a wave.
            | 
            | Sound (wave) in -> sound (wave) out, guaranteed to always
            | hold.
        
             | lisper wrote:
             | > there exist some functions into which you can feed in
             | sound waves and the output is guaranteed to still be a
             | sound wave.
             | 
             | That in and of itself does not seem like a particularly
             | insightful observation. It's just _obvious_ that such
             | functions exist. I can think of three of them off the top
             | of my head: time delay, wave addition, and multiplication
             | by a scalar. There must be something more to it than that.
        
               | brendanclune wrote:
               | In math, the obvious things aren't always true and the
               | true things are often not obvious.
               | 
               | Trivially, the identity f(x) = x satisfies the guarantee
               | as well. What amounts to insightful observation is the
               | definition and classification of these functions. In
               | exploring their existence in various forms, we can begin
               | to understand what properties these functions share.
               | 
               | So the interesting part is not that this class of
               | function _exists_, because of course it does! Your
               | intuition has led you to three possible candidates. But
               | if we limit ourselves to only the functions that satisfy
               | the condition _wave-in implies wave-out_ what do they
               | look like as a whole? What do these guarantees buy us if
               | we _know_ the result will be a wave? For example, f(g(x))
               | is also guaranteed to be _wave-in-wave-out_. Again, maybe
               | obvious, but it's a building block we can use once we've
               | proved it true.
        
               | c-baby wrote:
        
             | canadianfella wrote:
        
           | geysersam wrote:
           | It's even worse than you describe it!
           | 
           | f needs to be linear, but the function in your example is not
           | linear.
           | 
           | However, there are quite interesting linear functions.
           | Example: f(x(t)) = x(t-2) + 4dx/dt - \int_0^t 2x(s) ds
        
             | c-baby wrote:
             | 5z + 2 is linear?
        
               | lp251 wrote:
               | affine, not linear. describes a line that doesn't go
               | through the origin. that pesky shift breaks linearity
               | 
               | 5(2z) + 2 != 2(5z + 2)
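A two-line check of the distinction above (the function is from the thread; the test points are arbitrary):

```python
# f(z) = 5z + 2 is affine, not linear: it fails both homogeneity
# f(c*z) == c*f(z) and additivity f(a+b) == f(a)+f(b).
f = lambda z: 5 * z + 2

assert f(2 * 1) != 2 * f(1)          # 12 != 14: homogeneity fails
assert f(1 + 1) != f(1) + f(1)       # 12 != 14: additivity fails too

g = lambda z: 5 * z                  # drop the shift and linearity holds
assert g(2 * 1) == 2 * g(1)
assert g(1 + 1) == g(1) + g(1)
```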
        
         | topaz0 wrote:
         | Small correction: eigenfunctions are analogous to eigenvectors,
         | not eigenvalues. In fact they _are_ eigenvectors, in the sense
         | that they are vectors in a vector space of functions (or some
         | restricted set of functions, e.g. continuous functions or
         | periodic functions).
        
       | [deleted]
        
       | [deleted]
        
       | adamnemecek wrote:
        | All these points fail to mention that eigenvalues are
        | fundamentally about self-relationship.
        | 
        | Lawvere's fixed point theorem is, I think, the best formulation
        | of the idea:
       | https://ncatlab.org/nlab/show/Lawvere%27s+fixed+point+theore...
       | 
       | I've been putting together a brain dump on the topic
       | 
       | https://github.com/adamnemecek/adjoint/
       | 
       | Join the discord
       | 
       | https://discord.gg/mr9TAhpyBW
        
       | kiernanmcgowan wrote:
        | Eigen-things can also be thought of as "fixed values" of a
        | "thing" transformation.
        | 
        | For example, the eigenfunction of the derivative is e^x, since
        | when you run the derivative on e^x you get.... e^x
        
         | Tainnor wrote:
          | They don't have to be fixed; a scalar multiple is allowed
          | too. e^(ax) is an eigenfunction of the derivative as well,
          | with eigenvalue a.
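A quick numeric check of this eigenfunction property, using a finite-difference derivative (the value of a and the grid are arbitrary choices):

```python
import numpy as np

# Numeric sketch: d/dx e^{a x} = a * e^{a x}, so e^{a x} is an
# eigenfunction of the derivative operator with eigenvalue a.
a = 0.7
x = np.linspace(0.0, 1.0, 10001)
fx = np.exp(a * x)

dfdx = np.gradient(fx, x)            # central finite differences

# Away from the endpoints the ratio (df/dx) / f is just the eigenvalue a.
assert np.allclose(dfdx[1:-1] / fx[1:-1], a, atol=1e-4)
```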
        
         | marviel wrote:
         | After watching the 3blueonebrown video linked below in the
         | comments, I'm inclined to agree with you -- nice way of putting
         | it.
        
       | [deleted]
        
       | AdamH12113 wrote:
       | Does anyone know of an example of a simple physical system where
       | eigenvalues have a physical interpretation? The examples I know
       | of are all in quantum mechanics, which is a bit abstract for me.
        
         | 6gvONxR4sf7o wrote:
         | Eigenvalues of covariance matrices are a famous example. You
         | can get PCA from it.
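A minimal sketch of that connection, with synthetic data (the spread directions and magnitudes are assumptions of the example): the covariance matrix's eigenvectors are the principal directions, and its eigenvalues the variances along them.

```python
import numpy as np

# PCA from the covariance matrix: eigenvectors = principal directions,
# eigenvalues = variances along them. Toy 2-D data, mostly spread along
# the direction (1, 1) with standard deviation 3.
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])
theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
X = X @ R.T                              # rotate the cloud by 45 degrees

C = np.cov(X, rowvar=False)              # 2x2 covariance matrix
evals, evecs = np.linalg.eigh(C)         # ascending eigenvalues

# Dominant eigenvector points along (1, 1) (up to sign); top eigenvalue
# is close to the planted variance 3^2 = 9.
v = evecs[:, -1]
assert abs(abs(v @ np.array([1.0, 1.0]) / np.sqrt(2)) - 1.0) < 0.01
assert 8.0 < evals[-1] < 10.0
```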
        
         | cat_man wrote:
         | There are a couple of other comments that have mentioned
         | oscillation modes, vibrations, etc. The first 7 pages of this
         | series on sound synthesis might help give an idea of where
         | these might come from:
         | 
         | https://drive.google.com/file/d/12SM0SAOvMq166gc8B1b81Y_S7HP...
         | 
         | The third page in particular shows a plot of "amplitude" versus
         | "frequency" to show the "harmonic spectrum of a sawtooth wave".
          | The "frequencies" correspond to the modes of vibration (i.e.,
          | sine waves of different frequency), which are the
          | "eigenvectors" in this case, and the frequencies themselves
          | come from the "eigenvalues". The "amplitudes" are the
          | relative contributions of those vibrations to the overall
          | sound.
         | 
         | The article is talking purely about constructing sounds via
         | synthesis, so there's not necessarily a linear system
         | associated with it, but there is a connection. Wave equations
         | represented by linear partial differential equations can often
         | be analyzed as a linear system that has these "modes of
         | vibration" (i.e., series of orthogonal sinusoids at different
         | frequencies). If you were to, for example, model a plucked
          | string (like a guitar), you can model the solution as a
          | weighted sum of eigenvectors (in this case, "modes of
          | vibration" or sinusoids of different frequencies). The
          | "weights" are the expansion coefficients of those modes, and
          | they determine the spectrum and ultimately the timbre of the
          | sound produced.
         | 
         | That might seem more involved, because it's an infinite-
         | dimensional linear system (i.e., the vectors are functions on a
         | interval, rather than finite lists of numbers). It turns out,
         | though, that the finite-dimensional discretization of an
         | infinite-dimensional linear system (i.e., a partial-
         | differential equation approximated by a finite-dimensional
         | linear system) will sometimes have eigenvectors / eigenvalues
         | that have similar features as the infinite-dimensional case.
         | For example, there are certain finite-difference operators that
         | can be written in matrix form whose eigenvectors will work out
         | to be sampled sinusoids.
         | 
         | I'm not totally sure of the history, but I think a lot of the
         | interest in eigenvectors / eigenvalues as a topic in matrix
          | theory originated from this area (i.e., numerical solutions for
         | partial-differential equations that were used to model physical
         | systems).
        
           | CamperBob2 wrote:
           | Wow, that's an awesome introduction to music synthesis.
           | Bookmarking for future referral to others.
        
         | [deleted]
        
         | idiotsecant wrote:
         | Get a sheet of rubber. Grab it in both hands and stretch it.
         | Inspect your sheet and find a line on the sheet that you could
         | draw in with a marker and when you stretched the sheet the line
         | would grow and shrink, but would not change what it was
         | pointing at (probably a line from one of your hands to the
         | other, in this simple example) That is an eigenvector of your
         | sheet stretching transformation. The eigenvalue is how hard
         | you're stretching the sheet.
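The rubber-sheet picture as a 2x2 matrix (the stretch direction and factor here are arbitrary choices for illustration):

```python
import numpy as np

# A "stretch" that doubles lengths along the direction (1, 1) and leaves
# the perpendicular direction alone -- the marker line along (1, 1) only
# grows, never turns.
u = np.array([1.0, 1.0]) / np.sqrt(2)
A = np.eye(2) + (2.0 - 1.0) * np.outer(u, u)   # stretch by 2 along u

evals, evecs = np.linalg.eig(A)

assert np.allclose(sorted(evals), [1.0, 2.0])  # "how hard you stretch"
assert np.allclose(A @ u, 2.0 * u)             # same direction, doubled

# The perpendicular line (1, -1) is an eigenvector too, untouched.
v = np.array([1.0, -1.0]) / np.sqrt(2)
assert np.allclose(A @ v, v)
```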
        
           | einpoklum wrote:
           | I'll upvote any post beginning with "get a sheet of rubber"
           | :-)
        
             | olddustytrail wrote:
             | Except the "how to have sex with a leopard" post, because
             | describing rubber as "protection" in such circumstances is
             | really stretching.
        
         | antegamisou wrote:
          | Most are introduced to the interplay between physics and
          | linear algebra through the study of the mass-spring system,
          | where the type (real or complex), sign and magnitude of the
          | eigenvalues determine its behavior and stability. For
          | example, complex eigenvalues with positive real part indicate
          | an oscillation whose amplitude diverges, i.e. an unstable
          | system.
        
         | johnbcoughlin wrote:
         | The speed of sound is the eigenvalue of a particular matrix
         | (the "flux Jacobian") in the Euler equations, the 5-component
         | system of partial differential equations that describe gas
         | dynamics.
        
         | bodhiandphysics wrote:
          | Take an airplane... its dynamics are described by a system of
          | differential equations. We want to know if it's stable! If
          | the eigenvalues of the dynamics are real and greater than 1,
          | it's unstable. If the eigenvalues are complex and have a
          | modulus greater than 1, it will oscillate unstably. If one is
          | equal to one, it will cause everyone to vomit.
        
         | _spduchamp wrote:
         | I read an interview with Australian wire music composer Alan
         | Lamb that a stringed instrument with multiple overtones
         | vibrating on the string can be analyzed by breaking down the
         | vibration into eigenvalues, but I've never found any reference
         | material that explain that. I'm wondering if he was referring
         | to FFT.
        
           | alanbernstein wrote:
           | Complex exponentials are the eigenfunctions of the Fourier
           | transform. In other words, frequency component values are the
           | eigenvalues.
           | 
           | https://en.m.wikipedia.org/wiki/Eigenfunction#Vibrating_stri.
           | ..
        
             | contravariant wrote:
             | That makes no sense, the Fourier transform of a complex
             | exponential is a delta function.
        
               | alanbernstein wrote:
               | Hmm, you're right, that should have been obvious. Thanks
               | for the correction.
        
           | sfpotter wrote:
           | See my other reply.
        
           | auxym wrote:
            | If you discretize the string into a bunch of tiny masses,
            | linked together by a bunch of tiny springs, you can build a
            | mass matrix M (diagonal) and a stiffness matrix K (element
            | ij = stiffness of the spring that links mass i and mass j).
            | 
            | I can't remember the next part exactly (you can look it up
            | in a textbook), but you form M^-1 K, or solve the
            | generalized problem K v = w^2 M v, and the eigenvalues are
            | the squares of the natural frequencies of the string. The
            | eigenvectors represent the mode shapes, i.e. the
            | displacement of each mass element.
            | 
            | The same technique is used in Finite Element Analysis to
            | find the modes and mode shapes of complex structures (a car
            | frame, a bridge, etc.)
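A minimal numeric sketch of this discretized string, assuming the standard formulation in which the eigenvalues of M^-1 K are the squared natural frequencies (masses, spring constants and the chain length are toy choices):

```python
import numpy as np

# N equal masses m joined by identical springs k, both ends fixed --
# a discretized string. M is diagonal, K is tridiagonal.
N, m, k = 50, 1.0, 1.0
M = m * np.eye(N)
K = k * (2 * np.eye(N) - np.eye(N, k=1) - np.eye(N, k=-1))

# Eigenvalues of M^-1 K are the squared natural frequencies w_n^2;
# eigenvectors are the mode shapes (sampled sinusoids).
w2, modes = np.linalg.eigh(np.linalg.solve(M, K))
freqs = np.sqrt(w2)

# Known closed form for this chain: w_n = 2*sqrt(k/m)*sin(n*pi/(2*(N+1)))
n = np.arange(1, N + 1)
exact = 2 * np.sqrt(k / m) * np.sin(n * np.pi / (2 * (N + 1)))
assert np.allclose(np.sort(freqs), np.sort(exact))

# The lowest mode shape is (up to sign) a sampled half-sine.
half_sine = np.sin(n * np.pi / (N + 1))
half_sine /= np.linalg.norm(half_sine)
assert np.allclose(np.abs(modes[:, 0]), np.abs(half_sine), atol=1e-8)
```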
        
         | bodhiandphysics wrote:
         | Also in computer science.. a web sites page rank is the
         | eigenvalue of the connectivity matrix.
        
           | cscheid wrote:
           | That's not true. The page rank is read from the eigenvector,
           | and is the value associated with the given vertex (ie web
           | page). There are as many page rank values as there are web
           | pages, but only one eigenvector from which to read: the
           | dominant eigenvector of the transition matrix, which is the
           | one with the largest eigenvalue. So, only a single eigenvalue
           | for the entire pagerank computation.
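A tiny sketch of that single-eigenvector picture via power iteration (the three-page link graph and damping factor 0.85 are assumptions of the example, not Google's actual data):

```python
import numpy as np

# Toy web: page 0 -> 1, 2;  page 1 -> 2;  page 2 -> 0.
links = {0: [1, 2], 1: [2], 2: [0]}
n, d = 3, 0.85                         # damping factor as in PageRank

# Column-stochastic transition matrix with damping.
P = np.zeros((n, n))
for src, dsts in links.items():
    for dst in dsts:
        P[dst, src] = 1.0 / len(dsts)
G = d * P + (1 - d) / n * np.ones((n, n))

# Power iteration converges to the dominant eigenvector (eigenvalue 1);
# the per-page scores are that one vector's entries.
r = np.full(n, 1.0 / n)
for _ in range(100):
    r = G @ r

assert np.allclose(G @ r, r)          # eigenvector with eigenvalue 1
assert abs(r.sum() - 1.0) < 1e-12     # scores form a distribution
assert r[2] == max(r)                 # page 2 has the most inbound weight
```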
        
             | bodhiandphysics wrote:
             | You're right!!! Acch...
        
         | contravariant wrote:
         | If i recall correctly you can represent a harmonic oscillator
         | as a linear differential equation with a 2x2 matrix. The
         | imaginary part of the eigenvalues of this matrix correspond to
         | the angular frequency of the oscillator.
         | 
          | I like this example because it gives a physical meaning to
          | both eigenvalues and imaginary numbers. It also shows where
          | the connection between sine, cosine and the complex powers of
          | e comes from (since you can show that all three solve the
          | differential equation).
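That 2x2 system written out numerically (the choice of angular frequency is arbitrary): x'' = -w^2 x becomes d/dt [x, v] = A [x, v], and the eigenvalues of A are purely imaginary with imaginary part w.

```python
import numpy as np

# Harmonic oscillator x'' = -w^2 x as a first-order 2x2 system.
w = 2.0
A = np.array([[0.0,   1.0],
              [-w**2, 0.0]])

evals = np.linalg.eigvals(A)

# Eigenvalues are +/- i*w: zero real part (no damping), and the
# imaginary part is the angular frequency of the oscillation.
assert np.allclose(np.sort_complex(evals), [-1j * w, 1j * w])
assert np.allclose(evals.real, 0.0)
```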
        
         | hn_throwaway_99 wrote:
         | Don't know if this counts as a "physical" system, but Google's
         | original PageRank algorithm famously uses eigenvectors and
         | eigenvalues:
         | https://math.stackexchange.com/questions/936757/why-is-pager...
        
           | williamcotton wrote:
           | I have a brief overview of eigenvectors as a 2D shear
           | transformation in this overview of PageRank:
           | 
           | https://web.archive.org/web/20130728183938/williamcotton.com.
           | ..
        
             | hn_throwaway_99 wrote:
             | Oooh, this is great! Thanks very much.
        
         | FabHK wrote:
         | Take a linear map from some space to itself, and ask:
         | 
         | What lines (through the origin) are mapped back to themselves?
         | Those are the eigenvectors, and the amount by which they're
         | elongated or shortened are the eigenvalues.
         | 
         | So, if we talk about 3d space, and we rotate things - the
         | rotation axis is unchanged. That's an eigenvector (with
         | eigenvalue 1).
         | 
         | If we mirror things - any vector in the mirror plane remains
         | unchanged, that's an eigenvector (with eigenvalue 1), the
         | vector perpendicular to the mirror is unchanged, but flipped,
         | so that's an eigenvector (with eigenvalue -1).
         | 
         | If we dilate everything along the x axis by a factor of 2, say,
         | then the x axis is an eigenvector (with eigenvalue 2), while
         | the y and z axis and any vector in that plane is an eigenvector
         | (with eigenvalue 1). Any other vector is "tilted", so not
         | mapped to itself, so not an eigenvector.
        
           | MichaelZuo wrote:
           | What does 'through the origin' mean in a physical system?
        
             | pbhjpbhj wrote:
             | It means it doesn't matter where it is: you can choose the
             | origin, ie the point you measure from, it is arbitrary. Or
             | another way of saying that is you can move the system to a
             | different set of coordinates and it works in the same way.
             | 
             | ... which means it's probably an imaginary physical system.
             | 
             | Maybe a good physical example is a piece of cloth that
             | warps in 2D, and shrinks, when washed? Eigenvectors would
             | describe the warping (skew, say) and eigenvalues the
             | shrinkage relative to the original warp and weft.
             | 
             | Steve Brunton on YouTube has really good videos on
             | eigenvectors & eigenvalues in context of matrix algebra
             | (and then applied to simultaneous differential equations);
             | https://youtube.com/watch?v=ZSGrJBS_qtc .
        
               | MichaelZuo wrote:
               | Okay, so that explains 'the origin'.
               | 
               | Does 'through the origin' imply motion through 'the
               | origin'?
        
             | mdup wrote:
             | It means the eigenvalues will only give you information
             | about the system relatively to the center of that system.
             | 
             | Before describing any system, it's up to you (your
             | "convention") to assert where is the zero-point of your
             | world and in which directions the axes (x,y,z) are
             | pointing.
             | 
             | For instance, in the real world you can choose your 3D
             | coordinate system such that your mirror, as a physical
             | system, keeps the origin untouched (0,0,0) -> (0,0,0). If
             | you decide the origin is a point on the mirror, the
             | equations will be linear: mirror(X) = AX. However if you
             | setup the origin some point far from the mirror, like the
             | center of your eyes, the equations are no longer linear,
             | but affine: mirror(X) = AX+B. Looking at the values of the
             | "AX" part of the system would reveal you the mirroring
             | plane, but now shifted by an offset of "+B" -- the distance
             | between the mirror and your eyes -- because your choice of
             | coordinates was not leaving the origin intact.
        
             | DennisP wrote:
             | When you're rotating something, the axis of rotation.
             | That's the point that doesn't change in rotation ("maps to
             | itself").
        
             | theGnuMe wrote:
             | Center of mass; the object itself.
        
         | montecarl wrote:
         | I think a system of springs is a good example. I think having a
         | bunch of springs hooked together is a bit abstract so let's
         | instead think of a molecule and model the bonds between the
         | atoms as springs. If you were to squeeze this molecule together
         | or try to pull it apart and then let go, it would vibrate in
         | some complex way. By complex I mean that it wouldn't just
         | bounce back along the direction that you compressed or
         | stretched it.
         | 
         | However, if you write down the matrix of spring constants for
         | the system and solve for the eigenvalues and eigenvectors of
         | this system you can do something special. If you compress or
         | stretch the molecule along the direction of the one of the
         | eigenvectors then let go, the molecule will continue to vibrate
         | along that same direction. The motion will not spread out to
         | all other degrees of freedom. It will also vibrate with a
         | frequency given by the eigenvalue of that eigenvector.
         | 
         | Additionally, any complex vibration of the system can be broken
         | down into a combination of these independent vibrational modes.
         | This is a simple fact because the eigenvectors form an
         | orthogonal basis for the space.
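A toy version of this molecule picture (three equal masses on a line joined by two equal "bond" springs, all constants set to 1 for illustration): displacement along an eigenvector produces a restoring force along that same direction, so the motion stays in that mode.

```python
import numpy as np

# Spring-constant matrix of a linear "triatomic molecule" with unit
# masses and two unit bond springs (free ends).
k = 1.0
K = k * np.array([[ 1, -1,  0],
                  [-1,  2, -1],
                  [ 0, -1,  1]], dtype=float)

w2, modes = np.linalg.eigh(K)            # w2 = squared mode frequencies

# Stretch the molecule along one vibrational mode: the restoring
# acceleration -K v points back along v itself, so the motion never
# leaks into the other degrees of freedom.
v = modes[:, 2]                          # highest-frequency mode
assert np.allclose(K @ v, w2[2] * v)

# Squared frequencies for this chain: 0 (rigid translation), k, and 3k.
assert np.allclose(w2, [0.0, k, 3 * k])

# The eigenvectors form an orthogonal basis, as the comment says.
assert np.allclose(modes.T @ modes, np.eye(3))
```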
        
         | sampo wrote:
         | > Does anyone know of an example of a simple physical system
         | where eigenvalues have a physical interpretation?
         | 
         | Oscillation modes in mass-spring systems. Here is a simple one
         | with 2 masses and 3 springs, so the matrix is only 2-by-2.
         | 
         | https://math24.net/mass-spring-system.html
         | 
         | With more than 2 masses, you don't need to arrange the masses
         | on a line, but you can have a 2d or 3d arrangement, with
         | interconnecting springs. I am sorry I failed to find an example
         | image.
         | 
         | The theory is explained, for example, around page 479 in this
         | Thornton and Marion Classical Dynamics textbook. But you need
         | to read about Lagrangian mechanics (chapter 7) before it makes
         | sense.
         | 
         | https://eacpe.org/app/wp-content/uploads/2016/11/Classical-D...
        
         | xhkkffbf wrote:
          | Think of a fun-house mirror that, for the sake of this
          | example, makes you look twice as tall but 20% skinnier. This
          | can be modeled by a two-by-two matrix with eigenvalues of 2
          | and 0.8. (Indeed, it will have them on the diagonal, which
          | makes it easier to study.)
        
         | syrrim wrote:
         | The vibration of a bell (say) could be modelled by a matrix,
         | with a state vector to represent position, velocity, and
         | acceleration, and the matrix modelling the differential
         | equations describing their evolution over time. The
         | eigenvectors represent a basis of the system, so that we can
         | describe any potential state vector as a sum of eigenvectors.
         | If we do so, then each step of the system can be modelled by
         | multiplying each of these eigenvectors by its corresponding
          | eigenvalue. If an eigenvalue happens to be complex, then we
          | can describe it in phasor form as the product of an amplitude
          | and an angle. The amplitude tells us how it will decay (or
          | amplify) over time. The angle tells us the frequency of
          | oscillation, and thus the note that the bell will typically
          | sound.
        
         | qntty wrote:
          | The natural frequencies of a mechanical system are
          | eigenvalues of its equations of motion.
        
           | _spduchamp wrote:
           | Any references to help me unpack what you just said there?
        
             | sfpotter wrote:
             | Things that vibrate have natural modes of vibration. A
             | particular vibrational pattern can be decomposed into a
             | time-varying linear combination of these modes. The modes
             | of vibration are eigenfunctions and the frequencies at
             | which they vibrate are the square root of the corresponding
             | eigenvalues.
             | 
             | You can look up a vibrating drum head (circular membrane)
             | for a simple example.
        
             | bumby wrote:
             | _Theory of Vibration with Applications_ by William Thompson
             | and Marie Dillon Dahleh.
             | 
             | Say you have two cars linked, with some spring constant;
             | 
             | | --^^-- [c1] --^^-- [c2] --^^--|
             | 
             | where '^^' is a spring and '|' is a wall.
             | 
             | The motion of these cars can be written using the spring
             | forces in the system or, alternately, as the harmonic
             | motion of the undamped system with some natural frequency.
             | 
             | Setting this up as two simultaneous equations (one for each
             | car) and solving for the roots give you the eigenvalues.
             | The natural frequency is the square root of the eigenvalue.
             | In other words, the eigenvalues help you define the natural
             | frequencies which can be used to characterize the motion of
             | the cars in the more complicated spring-mass system.
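A numeric sketch of the two-car setup above, assuming equal masses m and equal spring constants k (values chosen for illustration):

```python
import numpy as np

# Two equal cars between walls, three equal springs. The stiffness
# matrix per unit mass for the displacements [x1, x2] is:
m, k = 1.0, 1.0
A = (k / m) * np.array([[ 2.0, -1.0],
                        [-1.0,  2.0]])

w2 = np.linalg.eigvalsh(A)            # eigenvalues = squared frequencies
freqs = np.sqrt(w2)                   # natural frequency = sqrt(eigenvalue)

# Cars swinging together: w = sqrt(k/m). Swinging against each other
# (the middle spring stretches twice as much): w = sqrt(3k/m).
assert np.allclose(w2, [k / m, 3 * k / m])
assert np.allclose(freqs, [1.0, np.sqrt(3.0)])
```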
        
             | wolfi1 wrote:
             | Landau/Lifshitz: "Mechanics" has a chapter on small
             | oscillations
        
               | pvg wrote:
               | That's more of a further packing than an unpacking.
               | Although that totally should be an expression for things
               | that go on for too long: "Can you pack this for me
               | please"
        
         | planede wrote:
         | Moment of inertia tensor [1], and principal axes of rotation
         | [2].
         | 
          | Principal axes are the axes around which a (torque-free)
          | rigid body can rotate without "wobbling". These axes are
          | orthogonal to each other. If a rigid body has moments of
          | inertia I_1 < I_2 < I_3, then rotation around the first and
          | third axes is stable and rotation around the second axis is
          | unstable.
         | 
         | [1]
         | https://en.wikipedia.org/wiki/Moment_of_inertia#Inertia_tens...
         | 
         | [2]
         | https://en.wikipedia.org/wiki/Moment_of_inertia#Principal_ax...
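A numeric sketch of the inertia-tensor eigendecomposition, using an assumed toy body (four unit point masses forming a rectangle in a tilted plane):

```python
import numpy as np

# Four unit point masses at the corners of a 4x2 rectangle, then tilted
# about the x axis so the principal axes are not the coordinate axes.
pts = np.array([[ 2.0,  1.0, 0.0],
                [-2.0, -1.0, 0.0],
                [ 2.0, -1.0, 0.0],
                [-2.0,  1.0, 0.0]])
theta = 0.5
R = np.array([[1, 0, 0],
              [0, np.cos(theta), -np.sin(theta)],
              [0, np.sin(theta),  np.cos(theta)]])
pts = pts @ R.T

# Inertia tensor I = sum m * (|r|^2 * Id - r r^T), unit masses.
I = sum(np.dot(r, r) * np.eye(3) - np.outer(r, r) for r in pts)

moments, axes = np.linalg.eigh(I)       # principal moments and axes

# The principal moments don't depend on the tilt: for the untilted
# rectangle they are sum(y^2) = 4, sum(x^2) = 16, sum(x^2 + y^2) = 20.
assert np.allclose(moments, [4.0, 16.0, 20.0])
# The principal axes are mutually orthogonal, as stated above.
assert np.allclose(axes.T @ axes, np.eye(3))
```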
        
         | atty wrote:
         | When dealing with the Schrodinger equation, the eigenvalues are
         | the energy levels of the quantum system.
        
         | ok123456 wrote:
         | Markov probability matrix where the entries are probabilities
         | of some physical event happening.
         | 
          | The eigenvectors will be the long term stable state
          | probabilities.
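A minimal sketch with an assumed toy two-state chain (sunny stays sunny 90% of the time, rainy stays rainy 50%): the eigenvector belonging to eigenvalue 1 gives the long-run state probabilities.

```python
import numpy as np

# Column-stochastic Markov matrix: column j holds the transition
# probabilities out of state j. States: 0 = sunny, 1 = rainy.
P = np.array([[0.9, 0.5],
              [0.1, 0.5]])

evals, evecs = np.linalg.eig(P)
i = np.argmin(np.abs(evals - 1.0))     # pick the eigenvalue-1 eigenvector
pi = np.real(evecs[:, i])
pi = pi / pi.sum()                     # normalize into probabilities

# Long-run fractions: 5/6 sunny, 1/6 rainy -- the same answer you get
# by just running the chain for a long time.
assert np.allclose(pi, [5 / 6, 1 / 6])
assert np.allclose(np.linalg.matrix_power(P, 50)[:, 0], pi)
```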
        
           | litoE wrote:
           | Not quite. For a Markov probability matrix, 1 is always an
           | eigenvalue, and all other eigenvalues are less than or equal
           | to 1. For each eigenvalue that is equal to 1 you get a long
           | term stable state probability. These distributions contain
           | disjoint subsets of the states, and the system will converge
           | to one of those subsets, depending on the initial state. The
           | eigenvalues that are strictly less than 1 do not add any
           | information to the long term state of the system. See
           | Stochastic Processes and Their Applications, V4 (1976) pages
           | 253-259. I wrote it while still in grad school.
        
           | cscheid wrote:
           | The values associated with each vertex on the _dominant
           | eigenvector_ (the eigenvector associated with the dominant
           | eigenvalue) are the long-term stable state probabilities.
           | That's from a single eigenvector, not "the eigenvectors".
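(A small numpy sketch of that point, using a made-up 2-state chain: the eigenvector for eigenvalue 1, normalized to sum to 1, is exactly what repeated application of the chain converges to.)

```python
import numpy as np

# Hypothetical 2-state Markov chain; columns are "from" states, so each
# column of P sums to 1 and P @ p evolves a probability vector p.
P = np.array([[0.9, 0.2],
              [0.1, 0.8]])

vals, vecs = np.linalg.eig(P)
i = np.argmax(vals.real)               # dominant eigenvalue, equal to 1
pi = vecs[:, i].real
pi /= pi.sum()                         # normalize into a probability vector

print(pi)                              # long-run state probabilities

# Iterating the chain from any start converges to the same distribution:
p = np.array([1.0, 0.0])
for _ in range(200):
    p = P @ p
print(p)                               # ~pi
```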
        
             | ok123456 wrote:
             | Yeah each one is an eigenmode of the system. That's what I
             | meant.
        
         | crdrost wrote:
         | So my actual favorite first example is to do this with
         | Fibonacci numbers as a linear recurrence relation, but that's
         | not really a "physical" interpretation. Let me give you my
         | favorite physical one:
         | 
         | The essence of special relativity is that acceleration is a bit
         | weirder than you think. In particular when you accelerate by
         | amount _a_ in some direction _x_ , even after accounting for
         | the usual Doppler shifts you will find that clocks separated
          | from you by that coordinate appear to tick at the rate 1 + _a
          | x_ /c^2 seconds per second, where c^2 is a fundamental constant.
         | Clocks ahead of you tick faster, clocks behind you tick slower
         | (and indeed appear to slow down and approach a 'wall of death,'
          | more technically called an 'event horizon,' at a distance
          | c^2/ _a_ ). (This effect is called the 'relativity of simultaneity,'
         | and it is in some sense the _only_ real prediction of special
         | relativity, as the rest of this comment will show--the other
         | effects of 'time dilation' and 'length contraction' are second-
         | order and can be derived from this first-order effect.)
         | 
         | This means that the transformation equations for moving into a
         | neighboring reference frame are not the ones that Galileo and
          | Newton proposed,
          | 
          |     t' = t
          |     x' = x - v t
          | 
          | but slightly modified (to first order in v, so only
          | considering small velocity changes) to
          | 
          |     t' = t - (v/c^2) x
          |     x' = x - v t
         | 
          | where c is the fundamental constant converting time into
          | units of distance. How do we generalize and get
         | the full solution? We can do it by looking in the eigenvector
         | basis. Consider new coordinates _p_ = _x_ - c _t_ and _q_ = _x_
         | + c _t_ , given any ( _x_ , _t_ ) you can find a unique ( _p_ ,
         | _q_ ) which describes it and if you want to get back those
         | values you would say _x_ = ( _p_ + _q_ )/2, _t_ = ( _q_ - _p_
         | )/(2 c). But feed these magical coordinates that come from
         | eigenvectors into the above transform and it "diagonalizes",
          | 
          |     p' = (1 + v/c) p
          |     q' = (1 - v/c) q
         | 
         | and therefore if you want to make a big change in "velocity" c
         | ph (here instead ph turns out to be "rapidity") out of N
         | smaller changes, you can repeat this transform N times with
         | little boosts by v/c = ph/N, and you will stitch together the
         | full Lorentz transform out of little first-order Lorentz
          | transforms:
          | 
          |     p' = (1 + ph/N)^N p -> e^{ph} p
          |     q' = (1 - ph/N)^N q -> e^{-ph} q
          | 
          | (in the limit N -> infinity)
         | 
         | Transforming back and using the hyperbolic sine and cosine,
         | sinh(x) = (e^x - e^{-x})/2, cosh(x) = (e^x + e^{-x})/2, the
          | full formula is
          | 
          |     w' = w cosh(ph) - x sinh(ph)
          |     x' = x cosh(ph) - w sinh(ph)
         | 
         | where _w_ = c _t_ is a simple time-in-units-of-meters
          | coordinate. Usually we denote cosh(ph) = g, sinh(ph) = g b,
          | which gives this the more familiar form you'll find in
          | textbooks, and the identity cosh^2(x) = 1 + sinh^2(x) gives a
          | formula g = 1/sqrt(1 - b^2) for the latter... but this
          | 'rapidity form' is
         | in some ways more elegant. Anyway, point stands, from the
         | "first-order" transform you can derive the "full" transform
         | just by building any large velocity change out of an infinite
         | number of infinitesimal velocity changes, and this is the
         | source of the factor g which describes time dilation and length
         | contraction.
         | 
         | Okay, now for physical interpretation. You asked what physical
         | meaning these eigenvalues and eigenvectors of the Lorentz
          | transformation have, and the answer is this: the eigenvectors
          | (1, 1) and (1, -1) of the Lorentz matrix represent _light rays_
         | , the p/q description we came up with above was a description
         | of spacetime in terms of light-ray coordinates where we
         | identify an event at a particular place and time with the light
         | rays that it casts, announcing that the event has happened, in
         | the +x and -x directions. On the negative side, these are also
         | the last light rays that were able to touch the event before it
         | happened, so represent "everything it could have possibly known
         | about" -- there is a space between these two "light cones"
         | which is its "relativistic present," the things that anything
         | which was there at the event cannot know about until the
         | future.
         | 
          | The eigenvalues, exp(ph) = sinh(ph) + cosh(ph) = g + g b =
          | sqrt[(1 + b)/(1 - b)] and exp(-ph) = sqrt[(1 - b)/(1 + b)], are
         | the Relativistic Doppler shifts of those light rays. Indeed one
         | can read them as e.g. exp(-ph) = 1/g * 1/(1 + b) , here 1/(1 +
         | b) is the standard Doppler shift formula from nonrelativistic
         | physics and 1/g is the decrease in frequency due to time
         | dilation.
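(All of this is easy to check numerically; a short sketch with an arbitrary illustrative rapidity value:)

```python
import numpy as np

ph = 0.3                               # rapidity, arbitrary illustrative value
g, gb = np.cosh(ph), np.sinh(ph)
L = np.array([[g, -gb],                # boost acting on (w, x) = (ct, x)
              [-gb, g]])

vals, vecs = np.linalg.eig(L)
print(np.sort(vals))                   # exp(-ph), exp(ph): the Doppler factors
print(vecs)                            # columns proportional to (1, 1), (1, -1)
```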
        
         | simplotek wrote:
         | > Does anyone know of an example of a simple physical system
         | where eigenvalues have a physical interpretation?
         | 
          | Yep, vibration modes. Vibration frequencies correspond to the
          | eigenvalues, while the shape that the structural system
          | exhibits when subjected to said vibration corresponds to the
          | eigenvector.
          | 
          | If a structural system is modelled as a linear elastic system,
          | it's possible to apply an eigendecomposition of that system
          | and represent it in terms of linear combinations of its
          | vibration modes/eigenvectors, and consequently we can get very
          | accurate representations by using only a handful of these
          | eigenvectors.
         | 
          | You know swing sets? We would start to swing back and forth
          | just by moving our legs at a particular frequency, and without
          | much effort we could move more and more. It turns out the
          | frequency we moved our legs at was the system's vibration
          | frequency/eigenvalue for the vibration mode/eigenvector
          | representing us swinging back and forth.
        
           | dr_dshiv wrote:
           | Does this relate to the normal modes or eigenmodes of a
           | system?
           | 
           | Actually, trying to understand how eigenmodes and
           | eigenfrequencies -- which I understand well -- relate to
           | eigenvalues and eigenvectors.
        
             | simplotek wrote:
             | > Does this relate to the normal modes or eigenmodes of a
             | system?
             | 
              | Yes. For an undamped harmonic oscillator, the eigenvalues
              | give the (squared) vibration frequencies and the
              | eigenvectors give the vibration modes.
             | 
             | One major class of structural analysis techniques is modal
             | analysis, which determines the vibration modes and
             | corresponding frequencies of specific structural systems
             | subjected to particular boundary conditions.
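(A toy sketch of modal analysis, not from the comment itself: two equal masses joined by three equal springs between fixed walls. With unit masses, the eigenvalues of the stiffness matrix are the squared natural frequencies and the eigenvectors are the mode shapes.)

```python
import numpy as np

# Two unit masses coupled by three unit springs (wall-mass-mass-wall).
# With M = I, the equations of motion reduce to x'' = -K x, so the
# eigenvalues of the stiffness matrix K are the squared natural
# frequencies omega^2 and the eigenvectors are the mode shapes.
k = 1.0
K = np.array([[2*k, -k],
              [-k, 2*k]])

w2, modes = np.linalg.eigh(K)
print(np.sqrt(w2))   # natural frequencies: sqrt(k) and sqrt(3k)
print(modes)         # mode shapes: in-phase (1,1) and out-of-phase (1,-1)
```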
        
         | [deleted]
        
       | billfruit wrote:
       | I do think the term "eigenvalue" is rather opaque and should be
       | replaced by more plain-English terminology that readily conveys
       | its meaning.
        
         | Tainnor wrote:
         | It used to be called "proper value" in English (you can still
         | find that in old textbooks), but the (semi-)German word has
         | basically entirely replaced it.
        
         | st_goliath wrote:
         | As a native German speaker, I don't understand your problem.
         | It's very much not opaque, plain terminology that easily
         | conveys meaning. ;-)
         | 
         | Perhaps, we should compromise and name it after Leonhard Euler?
         | That should clear up the confusion.
        
         | NotYourLawyer wrote:
         | "Characteristic value." I guess that's a little better
         | actually.
        
         | frozenlettuce wrote:
          | In Romance languages you have
          | autovettore/autovetor/autovector, as in "self-vector"
        
         | hdjjhhvvhga wrote:
         | Can you propose one?
        
           | billfruit wrote:
           | Linear scaling factor?
           | 
           | Scale of aspect?
           | 
           | Aspect factor?
           | 
           | Scale along Axis?
           | 
           | Axial scaling factor?
           | 
           | Natural scaling?
           | 
           | Propensity?
           | 
           | Leaning factor?
           | 
           | In geography for example(quoting from Wikipedia):
           | 
           | "In physical geography and physical geology, aspect (also
           | known as exposure) is the compass direction or azimuth that a
           | terrain surface faces."
        
         | gpsx wrote:
         | You know what's another one from physics whose name has nothing
         | to do with the actual meaning - "Gedanken experiment"
        
           | gmfawcett wrote:
           | Huh? It literally translates to "thought experiment" in
           | English, which is exactly what it means.
        
             | jffry wrote:
             | I think you may have gotten whooshed by the joke - both
             | "eigenvector" and "gedankenexperiment" are mashups of a
             | German word and an English word
        
               | Tainnor wrote:
               | "Gedankenexperiment" is a fully German word, though, not
               | a mashup.
        
         | jxy wrote:
         | Try to name the bones you used to type this sentence?
        
       | vmilner wrote:
       | 3blue1brown on this:
       | 
       | https://m.youtube.com/watch?v=PFDu9oVAE-g&vl=en
        
         | raydiatian wrote:
         | Literally all you need.
        
           | vmilner wrote:
           | I think anyone starting a lin alg course could do a lot worse
           | than watch all his "Essence of Linear Algebra" series before
           | starting - then watch the relevant (c. 15 min) episodes as
           | you take each lecture.
        
         | jackconsidine wrote:
          | I was learning principal component analysis a few years back,
          | which uses eigenvectors to reduce feature dimensions while
          | minimizing information loss (sorry if I butchered this).
         | 
         | I was really struggling to grok what Eigenvectors and
         | Eigenvalues were and found this video to be the best intuition
         | primer. I wish I had 3b1b when I was in high school and college
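(The PCA connection can be sketched in a few lines of numpy, on made-up toy data: the eigenvectors of the covariance matrix are the principal directions, and each eigenvalue is the variance captured along its direction.)

```python
import numpy as np

rng = np.random.default_rng(0)
# Correlated 2-D toy data: mostly spread along one direction.
X = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0], [1.0, 0.5]])
Xc = X - X.mean(axis=0)                 # center the data

# Eigenvectors of the covariance matrix are the principal components;
# eigenvalues are the variance captured along each one.
cov = np.cov(Xc.T)
vals, vecs = np.linalg.eigh(cov)        # ascending order
pc1 = vecs[:, -1]                       # direction of largest variance

Z = Xc @ pc1                            # project onto the top component
print(vals[-1] / vals.sum())            # fraction of variance retained
```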
        
       | hackandthink wrote:
       | Machine Learning (LDA):
       | 
       | "By finding eigenvectors we'll find axes of new subspace where
       | our life gets simpler: classes are more separated and data within
       | classes has lower variance."
       | 
       | https://medium.com/nerd-for-tech/linear-discriminant-analysi...
        
         | _gmax0 wrote:
         | Also, if you come from a computing background, I think
         | Eigenfaces is a great, illustrative use of eigenvalues.
         | 
         | https://en.wikipedia.org/wiki/Eigenface
        
       | [deleted]
        
       | Waterluvian wrote:
       | Something that frustrates me, and maybe I'm just confessing my
       | stupidity, is the extra layer of indirection in any discipline
       | when things are named after people and not the thing's
       | characteristics.
       | 
       | My doctor once told me "if you learn enough Latin, a lot of names
       | in medicine will hint at what they are, so you have less to
       | memorize."
       | 
       | I find that these names often lend a sense of complexity to
       | concepts that turn out to be rather simple. In high school this
       | really contributed to my struggles.
       | 
       | Edit: apparently Eigen isn't a person's name so I sure picked an
       | embarrassing moment to bring this up.
        
         | 323 wrote:
         | But you assume that there is one word to describe the
         | characteristics.
         | 
         | If such a word doesn't exist, you might as well name it after a
         | person instead of trying to invent a new word.
        
         | [deleted]
        
         | ravi-delia wrote:
         | I recently embarked on a journey to come up with a math
         | vocabulary for Toki Pona, a lovely little artistic conlang
         | which deserves better than what I'm doing to it. In Toki Pona,
          | words are built up from simpler ones to describe a thing as it
         | is. A friend is 'jan pona', a person who is good (to me, the
         | speaker). So I've had to come up with names which describe math
         | topics.
         | 
         | It's awful.
         | 
         | You know how many same-xs there are?! Eigenvalue, eigenvector,
         | homomorphism, isomorphism, hom _e_ omorphism, homotopic. Which
          | one gets to actually be "same shape"? Worse is when well-
          | meaning mathematicians use descriptive names anyway. Open and
         | closed are not mutually exclusive, giving rise to the awful
         | clopen (and don't pretend like ajar helps. an ajar door is an
         | open door!). Groups, rings, and fields all sort of bring to
         | mind the objects they describe, but only after you know the
         | archetypal examples. Math is the study of giving the same name
         | to different things, and that gives rise to more names than
         | there are short descriptions.
         | 
         | So do you know what I did? Whenever I could, I used a real
         | person's name. It freed up a limited vocabulary, and gave
         | enough wiggle room to translate most undergrad math without too
         | much loss. I suspect a similar thing is in play with math.
         | Maybe the category theory people have abstractions to usefully
         | describe "same-functions" without confusion. But in general,
         | things are named poorly because it's genuinely a hard task.
        
         | jpmattia wrote:
         | > _is the extra layer of indirection in any discipline when
         | things are named after people and not the thing's
         | characteristics._
         | 
          | "Eigen" in German has the same root as English "own":
          | "Eigenvalue" is Germanglish for "own/inherent value", so it
          | meets your spec of naming a thing after its characteristics,
          | as long as "naming" is allowed to be in multiple languages.
        
           | filmor wrote:
           | It doesn't mean "same". It means "own" in the sense of
           | "inherent" or "characteristic".
        
             | jpmattia wrote:
             | Fair enough, edited.
        
         | DocTomoe wrote:
         | A common language fosters research and common understanding.
         | 
         | In IT, that language is English. In diplomacy, before
         | interpreters were plentiful, that language was French. And in
         | many classical, medieval-era sciences, that language was Latin
          | (as a commonly-understood language that came from its ease of
          | being learned by Romance-language speakers and being rather
         | relevant in the (then church-run) universities).
         | 
         | So, there's no indirection intended. It's just an artefact of
         | the past - an artefact that helps Chinese, Spanish and American
         | doctors communicate (in broad strokes) even today.
        
         | constantcrying wrote:
          | It is sometimes very hard to name things well. The name either
          | becomes so unspecific that it is useless, or it gets so long
          | that nobody will use it.
         | 
         | This gets worse the "deeper" the math goes, but _for me_ it
         | never was a real problem, as you usually learn the definition
         | together with the name.
        
           | cogman10 wrote:
           | You see this sort of thing crop up in chemistry.
           | 
           | For really simple compounds, names are more or less settled
           | and consistent (with some exceptions).
           | 
            | But as soon as your compound starts to get more complex
            | (think organic chemistry), all of a sudden it becomes nigh
            | impossible to consistently name things. There are tons of
           | compounds with the same chemical formula that are regionally
           | named differently. Even worse, there are tons of compounds
           | with the same chemical formula that are actually different
           | things due to how the compound is arranged. (Good ole carbon
           | chains).
        
         | martin_balsam wrote:
         | But it is named after its characteristic, albeit in German
        
           | Waterluvian wrote:
           | Well... boy did I pick the wrong example to bring this up
           | with. Alas, I'll leave my shame here for all to see.
        
         | sfpotter wrote:
         | If you learn a lot of math, a lot of names will hint at what
         | they are so you have less to memorize. :-)
        
           | Waterluvian wrote:
           | Except when some smart jerk discovered like eight different
           | things!
        
             | jpmattia wrote:
             | And then we have Grothendieck's prime (57), just to keep
             | life interesting.
        
             | BlueTemplar wrote:
             | Mandatory not-Euler's :
             | 
             | https://en.wikipedia.org/wiki/List_of_things_named_after_Le
             | o...
        
             | jks wrote:
             | My favorite is the "Lemma that is not Burnside's". Also
             | known as the orbit-counting theorem, the Polya-Burnside
             | lemma, the Cauchy-Frobenius lemma, and of course Burnside's
             | lemma.
        
             | ragnese wrote:
             | Or when a smart jerk discovered a thing, and then
             | discovered another thing based on the first thing:
             | https://en.wikipedia.org/wiki/Ramond%E2%80%93Ramond_field
        
       | ProjectArcturis wrote:
       | Who is this explainer aimed at? If you can understand the first
       | sentence, you probably already know what an eigenvalue is.
        
         | [deleted]
        
         | 3qz wrote:
        
         | Jyaif wrote:
         | Right, but it's great to refresh your memory about eigenvalues.
        
         | CamperBob2 wrote:
         | The thing about Higham is that he's sort of a one-man Wikipedia
         | of linear algebra. Many of the terms that he uses also have
         | their own pages that (eventually) break the concepts down into
         | comprehensible terms.
         | 
         | See https://nhigham.com/index-of-what-is-articles/ for a useful
         | listing. Or, in an alternative form,
         | https://github.com/higham/what-is . Notice that if you go all
         | the way back up the rabbit hole you'll find user-friendly
         | articles like "What is a matrix?" that clearly define the terms
         | used farther down.
         | 
         | I really dig Higham's pedagogic style, in case it's not
         | obvious.
        
         | techwizrd wrote:
         | Often, papers or terse textbooks will list a definition like
         | the first sentence without the added detail below. I think this
         | is great for undergraduate students or folks who'd like to
         | refresh their memory a bit on eigenvalues, how they're derived,
         | and what they may imply. I certainly found it helpful.
        
       | Kalanos wrote:
       | You lost us at lambda
        
       ___________________________________________________________________
       (page generated 2022-11-08 23:00 UTC)