[HN Gopher] The Fourier Transform, explained in one sentence (2014)
       ___________________________________________________________________
        
       The Fourier Transform, explained in one sentence (2014)
        
       Author : signa11
       Score  : 392 points
       Date   : 2023-01-15 12:54 UTC (10 hours ago)
        
 (HTM) web link (blog.revolutionanalytics.com)
 (TXT) w3m dump (blog.revolutionanalytics.com)
        
       | philippejara wrote:
        | Just like "A monad is just a monoid in the category of
        | endofunctors", it'll only make sense and click once you already
        | understand the concept; at least the Haskell one was satirical.
        | These simple phrases are, I admit, a fun experiment in how far
        | you can compact concepts. It is sometimes genuinely useful to
        | coin new words for common phenomena in a given field, for use
        | among people who already know the concepts (hence why you pretty
        | much need to "expand" almost every couple of words of the phrase
        | for the layman to understand).
       | 
       | There is value in doing this expansion to teach the laymen, and
       | the author does make the expansion somewhat in his post to be
       | fair.
        
       | motohagiography wrote:
       | There should be a competition for pithy, colour coded layman
       | explanations of theorems. This is everything I needed to know
       | about it, especially after watching the wavelets video posted to
       | HN a few weeks ago: https://www.youtube.com/watch?v=jnxqHcObNK4
        
       | hprotagonist wrote:
       | "most of the time, you can take apart a continuous signal into
       | the sum of a bunch of sine waves. The fourier transform tells you
       | how tall and how shifted each of those sine waves are for a given
       | signal, for as many of those sine wave components as you care to
       | calculate. If you want those summed sine waves to usefully model
       | your input, you'll need to add up sine waves that are wiggling at
       | least two times as fast as the most rapidly changing feature you
       | want to capture is. "
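The comment above can be sketched numerically: build a signal from two sines, sample it faster than twice the highest frequency present, and read the height of each sine back off the FFT. A minimal numpy sketch (the test signal and the amplitude scaling are illustrative, not from the comment):

```python
import numpy as np

# Build a test signal: 3 Hz sine (amplitude 1.0) plus 8 Hz sine
# (amplitude 0.5), sampled well above twice the highest frequency.
fs = 64                    # samples per second
t = np.arange(fs) / fs     # one second of time stamps
x = 1.0 * np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 8 * t)

# The FFT gives, for each frequency bin, how tall (amplitude) and how
# shifted (phase) the corresponding sine wave is.
X = np.fft.rfft(x)
amps = 2 * np.abs(X) / fs  # scale to recover the original amplitudes

print(amps[3])  # amplitude of the 3 Hz component
print(amps[8])  # amplitude of the 8 Hz component
```

With frequencies that land exactly on bins, the amplitudes come back as 1.0 and 0.5 up to floating-point error.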
        
       | onos wrote:
       | I see this as a confusing explanation of the algorithm for
       | calculating coefficients, not as a line that explains what the
       | Fourier transform is.
       | 
        | As for a single line on intuition, can you beat "a method of
        | decomposing a function into a sum of sines and cosines"?
        
         | hgsgm wrote:
         | Then you have to get into why you are talking about sines and
         | cosines, adding unnecessary complexity to the explanation.
         | 
          | The OP is better. It's the frequency that matters for
          | intuition; the shape of the periodic function is a technical
          | detail.
        
           | cousin_it wrote:
           | Sines and cosines are the best choice here because they're
           | the "elementary periodic function" for a given frequency: the
           | sum of two (possibly scaled and shifted) sinusoids at some
           | frequency is also a (scaled and shifted) sinusoid at the same
           | frequency. For other shapes of periodic function it's not
           | true: for example, if you sum two triangle waves of the same
           | frequency but offset from each other, you'll get a shape
           | that's more complex than a triangle wave. Same for two square
           | waves etc. So the most natural way to decompose a signal is
           | into a sum of sinusoids rather than triangles or squares etc.
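The closure property described above is easy to check numerically: add two scaled, shifted sinusoids of the same frequency and verify the sum is again a single sinusoid. A small numpy sketch (the amplitudes 1.0/0.7 and phase 1.2 are arbitrary choices):

```python
import numpy as np

t = np.linspace(0.0, 1.0, 1000, endpoint=False)
w = 2 * np.pi * 5  # 5 Hz angular frequency

# Sum of two scaled, shifted sinusoids at the same frequency.
s = 1.0 * np.sin(w * t) + 0.7 * np.sin(w * t + 1.2)

# The sum is itself a sinusoid A*sin(w*t + phi); A and phi come from
# adding the phasors 1.0*e^{i*0} + 0.7*e^{i*1.2}.
z = 1.0 + 0.7 * np.exp(1j * 1.2)
A, phi = np.abs(z), np.angle(z)
print(np.allclose(s, A * np.sin(w * t + phi)))  # True
```

Repeating the experiment with two offset triangle or square waves fails this check, which is exactly the point of the comment.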
        
             | PaulDavisThe1st wrote:
             | Minor note: "sinusoids" (as you say yourself in your final
             | sentence) is enough. No need to mention sines & cosines as
             | distinct types.
        
       | Etheryte wrote:
       | A good example of an explanation that only makes sense once you
       | already know the thing being explained.
        
         | [deleted]
        
       | mejutoco wrote:
       | If anyone wants to understand the Fourier transform, I can
       | recommend the book "Who is Fourier: a mathematical adventure". It
       | is supposed to be a children's book. It was very clear and
       | intuitive.
       | 
       | https://www.goodreads.com/book/show/6969906-who-is-fourier
        
       | kristopolous wrote:
       | I found that animation and visuals are really helpful here for
        | people who struggle with it. There are plenty of tutorial videos
        | on YouTube.
       | 
       | An epistemological hurdle that really helped me was realizing
       | these things like frequency and time domain aren't "real" as much
       | as they are extremely useful mathematical models for manipulating
       | things.
        
       | symmetricsaurus wrote:
        | Instead of energy, it should say amplitude (and phase, since
        | it's an inherently complex thing).
        
       | benreesman wrote:
       | IMHO an under-explored bit of intuition is _crops_ in the
       | frequency domain.
       | 
       | If you want to try this for yourself grab a megabyte of photos of
       | "conventionally attractive" selfies and a megabyte of selfies
       | chosen at random and gzip them.
       | 
       | I'm personally not all that symmetrical so I don't compress all
       | that well. But I suspect you'll find that "pretty" people
       | compress better.
       | 
       | Same experiment with pop music vs serious jazz, same result.
       | 
       | I have a pet theory that everything from caffeine to alcohol is
       | cropping in the Fourier domain.
       | 
       | It's a fun experiment, try it out!
        
         | amuresan wrote:
         | What do you mean by crop in frequency domain? Low pass
         | filtering i.e. removing higher frequency components?
        
           | benreesman wrote:
           | I hope I adequately advertised this as personal metaphysics,
           | certainly I don't have figures that I can share.
           | 
           | Asymmetry, blemish, teeth a different color than the corpus
           | all consume entropy in the Shannon sense.
           | 
           | In a very real sense, movie stars are the most boring people
           | to take still photos of.
        
         | hgsgm wrote:
          | I don't believe that, especially because JPEG is _already
          | compressed using Fourier analysis_ before you gzip it with
          | _non-Fourier compression_.
        
           | benreesman wrote:
           | I was being a bit flip when I said gzip, a lossless mechanism
           | is unlikely to give much insight!
           | 
           | I should have said JPEG or 264 or something: a DCT is what I
           | meant.
           | 
           | Thank you for the correction.
        
       | KineticLensman wrote:
       | > "To find the energy at a particular frequency, spin your signal
       | around a circle at that frequency, and average a bunch of points
       | along that path"
       | 
       | You're going to need a longer sentence...
       | 
        | [Edit] If this sentence genuinely did what it said on the tin,
        | the rest of the article - which gives a lot more detail -
        | wouldn't be necessary.
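Taken literally, though, the quoted sentence does compute something. A rough numpy sketch of "spin the signal around a circle at that frequency, and average the points along that path" (the function name `energy_at` and the test signal are mine, not the article's):

```python
import numpy as np

def energy_at(signal, k):
    """The quoted sentence, literally: spin the signal around a circle
    at frequency k (multiply each sample by a point moving around the
    unit circle), then average the points along that path."""
    n = len(signal)
    spun = signal * np.exp(-2j * np.pi * k * np.arange(n) / n)
    return abs(spun.mean())

# A tone with exactly 10 cycles across a 1024-sample window.
x = np.sin(2 * np.pi * 10 * np.arange(1024) / 1024)
print(energy_at(x, 10))  # large: the signal lives at this frequency
print(energy_at(x, 11))  # ~0: nothing at the neighboring frequency
```

This is the DFT up to a 1/N normalization; the averaging only picks out frequencies actually present in the signal.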
        
       | raydiatian wrote:
       | https://youtu.be/spUNpyF58BY
       | 
       | The last Fourier Transform explanation you will ever need.
        
       | rurtrack wrote:
        | Ever seen those EQ lines jumping around as the music plays?
        | That is a direct result of drawing the information the FFT
        | gives.
       | 
       | There are many interesting applications, for example, voice
       | recognition https://towardsdatascience.com/understanding-audio-
       | data-four...
       | 
        | It is really amazing.
        
         | nixpulvis wrote:
         | Maybe it's just because I'm sorta an audio guy, but this is how
         | I first understood it.
        
       | shreyshnaccount wrote:
       | I find this kind of math translated to natural language very
       | useful to visualise things. It adds tremendous value for someone
        | like me. The comments saying that if the sentence were enough
        | you wouldn't need the article come off in a very "ha, gotcha"
        | way, which is counter to both the people who want to understand
        | it and the spirit of HN.
        
       | fortituded0002 wrote:
        | I love the color coded approach for this, but two problems for
        | me:
        | 
        | 1. I can't tell the difference between the pink and the purple
        | in the formula. The text is fine, but the formula isn't.
        | 
        | 2. I have no idea what "signal" means in "spin your signal", so
        | the single sentence explanation is completely useless to me.
       | 
       | This bums me out a bit because I think there's something really
       | great here.
        
       | meindnoch wrote:
       | Or just say that the (complex) amplitude at a particular
       | frequency is the dot product of the signal and a (complex)
       | sinusoid of that frequency.
       | 
       | Then you'll immediately understand how the Fourier transform can
       | be just a change of basis (a matrix multiplication in the
       | discrete case).
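That change-of-basis reading can be checked directly: build the matrix whose rows are the conjugated complex sinusoids, multiply, and compare against a library FFT. A small numpy sketch (assuming numpy's sign convention for the forward transform):

```python
import numpy as np

n = 8
# DFT matrix: entry (k, j) is the conjugated complex sinusoid of
# frequency k sampled at time j, so the matrix-vector product takes
# the dot product "signal . sinusoid" for every frequency at once.
F = np.exp(-2j * np.pi * np.outer(np.arange(n), np.arange(n)) / n)

x = np.random.default_rng(0).standard_normal(n)
print(np.allclose(F @ x, np.fft.fft(x)))  # True: same change of basis
```

In the discrete case the whole transform really is just this one matrix multiplication.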
        
       | voidhorse wrote:
       | I think the best semi-intuitive, non rigorous explanation I've
       | seen of the Fourier transform is still one that first explained
       | signal correlation in the time domain and then described the
       | transform as basically performing correlation on the signal for
       | all the possible sines at different frequencies. Essentially
       | you're just testing for the presence of individual sine waves (of
       | different frequencies) within the signal.
       | 
          | Unfortunately I can't remember where I saw this explanation.
          | It might've been in "The Scientist and Engineer's Guide to
          | Digital Signal Processing".
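That correlation picture fits in a few lines of numpy. A sketch (the function name `presence` and the test signal are illustrative; correlating against both a sine and a cosine handles arbitrary phase):

```python
import numpy as np

def presence(x, freq, fs):
    """Correlate the signal against a sine and a cosine at `freq`;
    the combined magnitude measures how strongly that frequency is
    present, regardless of its phase."""
    t = np.arange(len(x)) / fs
    c = np.dot(x, np.cos(2 * np.pi * freq * t))
    s = np.dot(x, np.sin(2 * np.pi * freq * t))
    return np.hypot(c, s) * 2 / len(x)

fs = 100
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 7 * t + 0.4)  # 7 Hz component, arbitrary phase
print(presence(x, 7, fs))   # ~1.0: the 7 Hz sine is in there
print(presence(x, 20, fs))  # ~0.0: no 20 Hz component
```

Sweeping `freq` over all the bins is exactly "testing for the presence of individual sine waves within the signal".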
        
         | tonmoy wrote:
         | To me the sentence in the OP is the same as the sentence you
         | posted (your sentence is easier to parse since I already know
         | the meaning of correlate)
        
         | teldau wrote:
         | I agree, FFT is simply a correlation of a bunch of sine
         | waveforms.
         | 
          | The extra tidbit - the sine waveforms are orthogonal to each
          | other (their cross correlation is zero), so the transform can
          | be inverted. (In math terms: they form a basis set. I'm mostly
          | referring to the STFT here, which is the 'common' FFT use, not
          | the infinite FFT.)
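The orthogonality-implies-invertibility claim is easy to verify numerically. A small numpy sketch using the standard complex-sinusoid basis (nothing STFT-specific; the window length 16 is arbitrary):

```python
import numpy as np

n = 16
k = np.arange(n)
# One complex sinusoid basis vector per frequency bin.
basis = np.exp(2j * np.pi * np.outer(k, k) / n)

# Distinct frequencies are orthogonal: all cross-correlations vanish,
# leaving n on the diagonal.
G = basis.conj().T @ basis
print(np.allclose(G, n * np.eye(n)))  # True

# That is exactly why the transform inverts: project out coefficients,
# then resynthesize the original signal from them.
x = np.random.default_rng(1).standard_normal(n)
coeffs = basis.conj().T @ x / n
print(np.allclose(basis @ coeffs, x))  # True
```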
        
         | wankle wrote:
         | That's the best definition I've seen either in the OP post or
         | any comment I've read on the page, it's exactly how I think of
         | it.
        
         | ajross wrote:
         | > best semi-intuitive, non rigorous explanation [...] signal
         | correlation in the time domain [...] basically performing
         | correlation on the signal for all the possible sines at
         | different frequencies
         | 
         | That's only going to make sense for someone who understands
         | your jargon usage of "correlate", _and_ who groks that
          | integrals of sine curves are orthogonal under addition. That's
         | precisely the hard part the linked explanation is trying to
         | explain.
         | 
         | If you've already gotten that far then the DFT is just
         | calculator math.
        
           | voidhorse wrote:
           | True! Though tbh, I did not even know that property of
           | integrals, since I've studied this stuff only in the discrete
           | realm.
           | 
           | At the totally basic level, another thing that I think can
           | really trip people up and that a lot of texts are not clear
           | about is that the transform is about moving between
            | _representations_ and that we're not actually transforming
           | anything. Not enough maths texts take the time to explain how
           | the representation connects up with the objects we wish to
           | study and how there are multiple possible ways to represent
           | them, and in some cases, like Fourier, useful ways to
           | "translate" between different representations.
        
       | Misdicorl wrote:
       | IME explanations of the Fourier transform focus far too much on
       | the specific mechanics of it without motivating the reason. I
       | find people often don't even realize the time space and frequency
       | space functions are the same function! I dive in like so
       | 
       | "There are many different ways to write down the number 5. Tally
       | marks, the sigil 5, 4+1, 2.5 times 2, 10/2. Each of them is more
       | or less useful in different circumstances. The same is true of
       | functions! There are many different ways to write the same
       | function. The mechanics of taking a function in one form and
       | writing it in a new form is called a transformation."
       | 
       | Above needs to be said a hundred times. _Then_ you talk about why
       | a frequency space function can be useful. _Then_ you talk about
       | how to transform a time space function to the frequency space
       | representation.
        
       | m3kw9 wrote:
       | We need an explain everything in one sentence website.
        
       | itcrowd wrote:
       | This is the discrete Fourier transform (DFT) not _the_ Fourier
       | transform.
        
         | marosgrego wrote:
         | For the continuous Fourier transform, you would just take a
         | continuous average, i.e. an integral.
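In symbols, the discrete "average of the spun points" and its continuous counterpart (the 1/N factor follows the article's "average" wording; the usual DFT convention drops it):

```latex
% Discrete: average the spun samples
X_k = \frac{1}{N}\sum_{n=0}^{N-1} x_n \, e^{-2\pi i k n / N}

% Continuous: the average becomes an integral over all time
\hat{f}(\xi) = \int_{-\infty}^{\infty} f(t)\, e^{-2\pi i \xi t}\, dt
```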
        
       | queuebert wrote:
        | Title should say "The _Discrete_ Fourier Transform, explained in
        | one sentence".
        
       | prof-dr-ir wrote:
       | Does "blog post learning" ever really work?
       | 
        | I have taught these kind of undergraduate subjects and, _in the
        | context of a course_, the Fourier transform never struck me as
       | something very complicated. First, it feels completely natural to
       | write down a Fourier decomposition for periodic functions; the
       | inverse transform to determine the coefficient is just (real or
       | complex) calculus; and all that is left is to convince the
       | audience of completeness which is best done through a few
       | examples as well as the time-honoured method of "proof by
       | intimidation". (Note that the blog post does not do a better job
       | here.)
       | 
       | By contrast, these posts seem to fulfil a demand where people
       | look for an _isolated_ explanation of concept X, like Fourier
       | transforms, matrix determinants, monoids, etc. But they are
       | necessarily too isolated to really convey the subject. For
       | example, one does not normally teach (complex) Fourier transforms
       | without first dedicating significant time to topics like complex
       | exponentials and integrals of trig functions, and likewise one
        | normally teaches monads only after introducing functors and
       | applicatives.
       | 
       | In other words, without having the necessary background at your
       | fingertips it is hard to really grasp the core concepts. That is
       | why, IMHO, reading these blog posts may feel good but will rarely
       | make things really stick.
        
         | mafuyu wrote:
         | I mean, it's no replacement for learning something rigorously,
         | but I still find little snippets like this helpful for building
         | intuition. Frankly, I'm not really great at rigor myself, and
         | for areas that aren't my areas of expertise (like signals &
         | systems), intuition is all I really have to go off of. I took
         | undergrad signals & systems years ago and probably could not
         | say anything coherent about the Fourier transform today. Still
         | had enough intuition kicking around somewhere to get the SNR of
         | this ADC signal down via oversampling and an IIR filter,
         | though. I remember churning through blog posts for things like
         | PID loops and Kalman filters as a teen, and eventually got to
         | the point where I kinda understood them. But then when it came
         | time to actually learning them, having that small bit of
         | background helped immensely, since I didn't have to build that
         | basic intuition in the span of a single semester course.
        
         | kansface wrote:
         | I have a degree in physics, but have been a working programmer
         | for 15 years. For me, intuition is the only thing that remains
         | and that is slipping, too. This post is like visiting an old
         | friend I haven't talked to in as much time.
        
         | jasode wrote:
         | _> I have taught these kind of undergraduate subjects and, in
         | the context of a course, the Fourier transform never struck me
          | as something very complicated._
         | 
         | It's great that it was "never complicated" for you. In
         | contrast, the author (David Smith) of this blog post _admits
         | that he initially struggled with the Fourier Transform_ -- and
         | wants to share some insights he gained after he understood it.
         | 
         | He has already graduated with a degree in statistics and
         | computer science and also developed add-on software packages
         | for R so he presumably first got exposed to FT within the
         | _context_ of a college course. So he actually did what you
         | suggested. Nevertheless, he wants to share why some
          | _supplementary material_ he didn't initially have might be
         | helpful to others.
         | 
         | Alternative presentations of topics can sometimes flip the
         | "light bulb" in some brains. I don't think it's a useless blog
         | post.
        
           | soheil wrote:
           | > It's great that it was "never complicated" for you.
           | 
            | Please stop making it sound like the commenter was bragging.
            | That's just assuming bad faith in HN users.
           | 
           | The comment was about how the context and prior understanding
           | of the underlying topics is a necessary prerequisite for
           | understanding complex topics built on those and how blog
           | learning is a necessarily contrived method of teaching.
           | 
           | It does not mean it won't work for some. If you happen to
           | have the prior understanding of the prerequisites then it'd
           | be a way to learn some additional insight, but you shouldn't
           | assume the Venn diagram for that audience is much larger than
           | a small % of the readers.
        
           | Schiphol wrote:
           | > It's great that it was "never complicated" for you.
           | 
           | I don't think they meant "not complicated to learn", but
           | "never complicated to teach, once its time had arrived in the
           | natural progression of the course".
        
             | Dudeman112 wrote:
             | >but "never complicated to teach, once its time had arrived
             | in the natural progression of the course"
             | 
             | Which is, of course, true for give or take all subjects out
             | there
        
               | adamisom wrote:
                | Indeed: the original comment was not at all implying
                | anything special about the FT. Rather that, perhaps
                | whether we like it or not, knowledge is not very flat:
                | you really should honor (as in: recognize and treat
                | appropriately; nothing magical or moral intended) the
                | difficulty of many topics by doing prerequisite study,
                | or, if you can't, just accept that you will never have a
                | good understanding.
               | 
               | I don't think it's at loggerheads with just-in-time
               | learning or the real opportunity cost of more extensive
               | study. I think it's just true that hard things are hard
               | whether you like it or not.
        
         | evanwise wrote:
         | I would tend to agree. The only people I saw really struggle
         | with learning Fourier series and the Fourier transform were
         | people who had struggled with more basic concepts and failed to
         | internalize them. This kind of explanation might be helpful to
         | build someone's intuition up a bit but is not a substitute for
         | a proper education on the subject and definitely not some magic
         | key to understanding.
        
         | fifilura wrote:
         | I wish my university courses had started with the intuition
         | behind and utility of the functions we were about to learn. Be
         | it linear algebra, differential equations or Fourier
         | transforms.
         | 
         | This is what all these blog posts excel with.
         | 
         | Of course I got it along the way, but I do believe it would
         | have been easier if the background was something like this.
        
         | jchw wrote:
          | I don't get the point. There really isn't anything inherently
          | special about blog posts as a medium versus, say, chapters in
          | a textbook, other than the obvious things. You can,
         | of course, learn about complex subject matter through a blog
         | post. If you find someone with a similar mental model of things
         | to your own, I think you can in fact learn from blog posts
         | easier, as they can get from zero to understanding with less
         | detours.
         | 
         | On a similar note, I personally believe there are many concepts
         | that have been explained much better by random YouTubers than
         | academic textbooks. Textbooks are more comprehensive almost
         | always, but having an intuitive understanding of something is
         | obviously more important in many cases than simply remembering
         | a long list of bullet points about it. You can always use
         | reference material later. In real life, memorizing facts on
         | their own is rarely useful.
         | 
         | The best answer here is that you should use both, or imo, "all
         | three," of blog posts, video content, and academic
         | textbooks/articles/courses. Taking a variety of approaches
         | helps me get to a mental "click" quicker, personally.
        
         | sidlls wrote:
          | This is why it's as rare as good Carolina barbecue to find a
          | person self-educated in these topics who also truly
          | understands the material. At least in my experience.
        
         | georgeburdell wrote:
         | I agree. Way back when I learned the FT, I happened to be
         | taking a signals and statistics class in the same semester and
         | rationalizing the Fourier transform as the correlation between
         | a function and a sinusoid was an incredibly natural mental leap
         | and didn't require me to hold much additional knowledge in my
          | head.
        
         | anthomtb wrote:
         | Did you miss the first sentence in the post?
         | 
         | > If, like me, you struggled to understand the Fourier
         | Transformation when you first learned about it
         | 
          | Nowhere does the author imply that he is teaching the Fourier
         | Transform to someone who has never heard of it.
         | 
         | I agree with your overall point. Blog posts are not going to
         | teach you difficult mathematics unless they are the length of
         | college textbooks. But a blog post like this is great for
         | reinforcement. If, like me, you are an ECE (Electrical and
         | Computer Engineer) who has written software for the last 15
         | years, an explanation like the one in the post makes a lot of
         | neurons reconnect.
        
         | lattalayta wrote:
         | I feel like I've learned some things from high quality
         | interactive blog posts. These posts from Bartosz Ciechanowski
         | come to mind https://ciechanow.ski/archives/
        
         | osigurdson wrote:
         | I think a lot of people have heard of the Fourier transform and
         | kind of understand how any signal can be represented by a
         | superposition of sine waves. Attempting to improve the
         | intuition of how the algorithm works isn't a bad thing. The
         | casual reader isn't going to invest the time to learn the pre-
         | requisites so the alternative is nothing at all.
        
         | ivanhoe wrote:
         | There's a lot of math & physics subjects that I formally
         | learned and passed exams at Uni, but frankly never really fully
         | understood them. I've found clarifications on many of these
         | subjects later on web, in exactly these types of blogs or
         | youtube videos. All you need is one or two core concepts better
          | explained and suddenly you can connect all the dots, and
          | that's what this type of content is fantastic for.
        
         | j7ake wrote:
         | I find them excellent if you've learned the technical details
         | already in a book or by yourself. Blog posts often are high
         | level and can give alternative perspectives on the same
         | problem.
         | 
         | They're also excellent to motivate a problem so the reader can
         | find the technical details elsewhere.
        
         | avip wrote:
         | I think you're confusing _proving_ with _understanding_. The
         | two are almost orthogonal.
        
           | posix86 wrote:
           | Disagree. If you truly understand something, you must have
           | prooven it to yourself, everything is just a mental help
           | ("donkey bridge" in german) that allows you to remember the
           | statement better - weather it's right or not, you can only
           | know once you've prooven it; and before you did that, it can
           | often happen that your intuition on what's right is actually
           | wrong.
        
             | chongli wrote:
             | I think a lot of people can understand simple statements in
             | math such as Fermat's Last Theorem or the Collatz
             | Conjecture but proving them is an entirely different
             | matter. While it may be the case that proving some
             | statements is sufficient to understanding them, I would say
             | it's _never_ necessary.
        
               | greenpeas wrote:
               | I think you may be confounding different kinds of
                | _understanding_. There's a big difference between
               | understanding what a statement is saying, versus
               | understanding why a statement must hold true. And
               | understanding why a statement holds doesn't necessarily
               | have to be a formal proof, it just means that you have
               | convinced yourself that some concepts that you are
               | familiar with behave in a way to support the statement.
               | It can actually happen that following a _correct_ formal
                | proof doesn't automatically help you understand why
               | something is true. And vice-versa, it may happen that
               | when you try to formalize your understanding, you
               | discover that you missed some crucial details.
               | 
               | But usually _true_ understanding and formal proofs go
               | hand in hand.
               | 
               | P.S: Though as a disclaimer, I don't know maths beyond
               | the undergraduate level, and I'm sure there must be very
               | complicated concepts that mathematicians don't
               | understand, but can reason formally about, as well as
               | concepts that mathematicians feel that they intuitively
               | understand but can't quite prove.
               | 
                | Additionally, there's von Neumann's saying: "Young man,
                | in mathematics you don't understand things. You just get
                | used to them."
        
             | adlpz wrote:
             | That makes no sense to me. Or maybe your definition of
             | "understand" is completely unrelated to mine.
             | 
             | Under your thesis one cannot "understand" anything physical
             | (as in physics, chemistry, etc) as the underlying cannot be
             | "proven", at all, by definition.
        
           | mxkopy wrote:
           | I think you're confusing understanding with intuition. Which
           | is fair; I'd say understanding builds to intuition as it's
           | accessed more. The line is blurry at points.
        
         | antegamisou wrote:
         | At last someone said it, thanks.
         | 
          | It's bothersome how popularized the notion of "I'm one, two,
          | three, ..., n blog posts/YouTube videos away from fully
          | grasping this complex mathematical concept" has become - a
          | concept which not only requires fundamental knowledge I may
          | not be familiar with, but also solving problems/writing code
          | where at first I'll have no idea how it is supposed to be of
          | any use, though over time it will eventually 'click'.
         | 
         | Beautiful animations and witty narrations are nice, but you're
         | in for a (bad) surprise if you think they're adequate. Also
         | nothing beats good old book -> pencil/paper.
        
         | Snowbird3999 wrote:
         | What works for me is learning a single concept in many
         | different ways. A textbook, a youtube video, a friend
         | explaining it to me... Each next source of material (however
          | casual or formal it is) is more likely to make the concept
         | "click", or allow me to understand the concept more
         | efficiently.
         | 
          | To that extent, this blog post seems valuable to me. The
          | author is using language and colors to convey an intuition
          | that I have not quite seen before.
        
         | crispyambulance wrote:
         | > Does "blog post learning" ever really work?
         | 
         | It depends on what the intention is. Is it to promote
         | comprehensive understanding, proof, and the ability to then
         | apply the concept in all situations? Then no.
         | 
          | Is it just to point out something interesting and make a
          | point?
         | Sure. There's nothing wrong with that. It's OK to explore
         | aspects of a subject without a dense canonical treatise.
         | There's a time and place for all sorts of explanations.
        
         | jfarmer wrote:
         | I do think "blog post learning" can work, but agree with you
         | that this ain't it. The "explanation" on the blog post would
         | really only work for someone who more-or-less understood
         | Fourier Series already.
         | 
         | You're also right that explanations can't be isolated. I'd go
         | further and say they never are. People only come to understand
         | new things in terms of things they already understand. There's
         | no "explaining X" in isolation, there's only "explaining X in
         | terms of Y." A good explanation either supplies a suitable Y or
         | makes it clear to the reader what Y they'll be relying on.
         | 
         | So, even if the author tries to isolate, a successful
         | explanation always relies on collateral information. The only
         | question is whether that collateral information is left to
         | chance or intentionally exploited.
         | 
         | What's the "Y" here? Well, the author assumes the reader can
         | make sense of phrases like "energy at a frequency", what it
         | means to "spin your frequency around a circle", "average a
         | bunch of points along a path", etc.
         | 
         | That's...a lot. I'd go so far as to say that a reader who can
         | make sense of that probably already understands more
         | fundamental concepts like vector spaces, bases, and Taylor
         | Series.
         | 
         | And if they understand those things then there are much more
         | straightforward explanations available!
        
       | hasmanean wrote:
       | The DFT is the correlation of a signal with a bunch of regularly
       | spaced apart single-frequency vectors.
       | 
        | The FFT is what you get when applying divide-and-conquer
        | (recursive subproblem division, with shared subcomputations
        | reused) to the DFT.
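The correlation view above is easy to check numerically. A minimal NumPy sketch (the function name and the vectorized form are illustrative choices, not from the comment):

```python
import numpy as np

def dft_by_correlation(x):
    """Naive O(N^2) DFT: correlate the signal with each regularly
    spaced single-frequency vector."""
    N = len(x)
    n = np.arange(N)
    k = n.reshape(-1, 1)
    # Row k is the complex-exponential "probe" at frequency k.
    probes = np.exp(-2j * np.pi * k * n / N)
    return probes @ x  # each output bin is one correlation (dot product)

x = np.random.default_rng(0).standard_normal(64)
# The bank of correlations matches the library FFT exactly.
assert np.allclose(dft_by_correlation(x), np.fft.fft(x))
```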
        
       | dghughes wrote:
       | Many here probably have seen The Engineer Guy's series of videos
       | showing a Fourier Analysis machine.
       | 
        | If not, here it is; it's a great watch.
       | 
       | https://m.youtube.com/watch?v=NAsM30MAHLg
        
       | MontyCarloHall wrote:
       | 3Blue1Brown video that actually animates this:
       | https://youtu.be/spUNpyF58BY
        
         | highdeserthackr wrote:
         | This is very cool.
        
       | blippage wrote:
       | Here's my explanation: think of linear regression. Now ask
       | yourself, what if I applied the idea of correlation/regression to
       | a sine wave. An (audio) signal can be broken down into an
       | addition of sine waves having different frequencies and phases.
       | That, conceptually, is what we're trying to do with the Fourier
        | Transform. Which frequencies is the sound wave composed of,
        | and at what amplitudes? Phase angle is a consideration, but we
        | are usually less interested in that.
       | 
       | OK, that's a paragraph instead of a sentence, but it does
       | motivate the whole raison d'etre of "why would I care anyway?"
        
       | ear7h wrote:
        | Kind of similar is this demonstration on a record player:
       | 
       | https://youtu.be/mRi23ueU7Zk
        
       | ur-whale wrote:
       | For me, the most intuitive explanation of the Fourier transform
       | is geometric and is as follows:
       | 
        | If:
        |   . you have a vector space equipped with a dot product - say R3
        |   . you have a specific orthonormal basis of said vector space -
        |     say (i, j, k) in R3
        |   . you have a vector V - say V in R3
        |   . you want the coordinates of V in that basis
        |   . then all you have to do is take the dot product of V with
        |     each member of the basis (i.e. you *project* V onto each
        |     basis member)
        |   . in other words: V = Sum_over_basis_members[
        |     individual_basis_member * Dot_product[V,
        |     individual_basis_member] ]
       | 
        | And magically, that's all the various Fourier transforms are:
        |   . the discrete Fourier transform (both signal and basis are
        |     discrete and finite, dot product is the usual suspect)
        |   . the Fourier series (basis is discrete and infinite, signal
        |     is a continuous periodic function, dot product is the
        |     continuous version of the standard dot product, i.e. an
        |     integral)
        |   . the Fourier transform (basis is infinite and continuous,
        |     signal is any "well-behaved" function, and the
        |     reconstruction of the "vector" involves a double integral,
        |     one for the dot product and one for the summation over
        |     basis terms)
       | 
        | For example, for the Fourier series:
        |   . The "vector" is any well-behaved periodic function (don't
        |     recall exactly what well-behaved entails, but that's not
        |     the most important part of the story anyway)
        |   . The vector space is the set of well-behaved periodic
        |     functions
        |   . The basis is the family of functions Exp[i*n*x], where the
        |     parameter n = {0, 1, 2, 3, ..., infty}
        |   . The dot product is the cross-correlation between the
        |     function and a basis member, i.e. Integral[x=0, x=2*pi,
        |     function*basis_member]
        |   . You rebuild the function as:
        |       function = Sum_over_basis_members[ basis_member *
        |       Dot_product[function, basis_member] ]
       | And that's it.
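The finite-dimensional version of this recipe can be verified directly. A small NumPy sketch (the randomly rotated basis is just an illustrative choice; any orthonormal basis works):

```python
import numpy as np

rng = np.random.default_rng(1)

# Any orthonormal basis of R3 (here, a random rotation of i, j, k),
# obtained from the QR decomposition of a random matrix.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
basis = Q.T                  # rows are the orthonormal basis vectors
V = rng.standard_normal(3)

coords = basis @ V           # coordinate = dot product with each basis member
V_rebuilt = coords @ basis   # V = sum of (coordinate * basis member)
assert np.allclose(V, V_rebuilt)
```

The discrete Fourier transform is the same computation with complex-exponential rows in place of (i, j, k).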
        
       | necroforest wrote:
       | I got it by first thinking about the _inverse_ Fourier transform,
       | which is just saying that I want to write a signal as a weighted
       | sum of sinusoids:
       | 
        | `x(t) = sum_k w_k sin(2*pi*k*t)`
       | 
       | Then you just ask the question: given x, solve for w.
       | 
        | The next question is "why sinusoids?", and the answer is that
        | for any linear, time-invariant system acting on x:
        | 
        | y(t) = F[x](t)
        | 
        | that system diagonalizes over (complex) sinusoids: in the
        | frequency domain it acts by pointwise multiplication,
        | 
        | Y(f) = H(f) X(f)
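The "given x, solve for w" step can be phrased literally as a regression. A hedged NumPy sketch (the sample count, frequencies, and weights are made-up illustrative values):

```python
import numpy as np

N, K = 200, 5
t = np.arange(N) / N
# Design matrix: one column per sinusoid sin(2*pi*k*t), k = 1..K.
A = np.stack([np.sin(2 * np.pi * k * t) for k in range(1, K + 1)], axis=1)

w_true = np.array([0.0, 3.0, 0.0, 1.5, 0.0])
x = A @ w_true                  # the observed signal: a weighted sum

# "Given x, solve for w": ordinary least squares recovers the weights.
w_est, *_ = np.linalg.lstsq(A, x, rcond=None)
assert np.allclose(w_est, w_true)
```

Because the sinusoid columns are orthogonal over a full period, the least-squares solution reduces to the bank of correlations that defines the Fourier transform.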
        
       | sebzim4500 wrote:
        | This reminds me of an old joke in the Haskell community, where
        | people who struggled to understand Monads would finally get it
        | after a while, and would assume that whatever sentence they
        | heard last was the only one necessary for the explanation.
        
         | bobowzki wrote:
         | Haha! That's a great joke! It can definitely be applied to a
         | lot of situations.
        
         | wholinator2 wrote:
         | What do you mean? A Monad is just a Monoid in the category of
         | Endofunctors.
        
         | dlandau wrote:
         | That was my experience studying statistical physics in Uni. For
         | "some reason" the fourth book I read was the "only one" written
         | in a way that made sense. No relation to the order I read them
         | in, just written better.
        
         | Rayhem wrote:
         | This is very much called out in the Monad Burrito Tutorial
         | Fallacy[1].
         | 
         | [1]: https://byorgey.wordpress.com/2009/01/12/abstraction-
         | intuiti...
        
           | blippage wrote:
           | One of the best explanations of monads I saw was actually
           | using Python.
        
             | monsieurbanana wrote:
             | That sounds interesting. I might search for that python
             | explanation, have an eureka moment, then forget most of it
             | in 24 hours. It's happened a couple of times with monads.
        
           | moffkalast wrote:
           | May be best to take a page from quantum mechanics and say "if
           | you think you understand monads, you don't understand
           | monads".
        
         | mti wrote:
         | In that spirit, my pet "a monad is a monoid in the category of
         | endofunctors, duh"-style one-line explanation of the Fourier
         | transform is that it's just the decomposition in the common
         | (Hilbert) eigenbasis for all translation operators. It makes it
         | surprisingly clear (to some) why it is a both natural and
         | important construction.
        
           | franknstein wrote:
           | Same, but it's only natural after studying inner product
           | vector spaces. Also being comfortable with some calculus is
           | needed to be able to overlook the technicalities of this
           | construction and focus on the actual idea.
        
         | jn5 wrote:
          | If you lose something, you always find it in the last place
          | you search... because why would you keep searching after you
          | found it?
        
           | dec0dedab0de wrote:
           | It took me an embarrassingly long time to realize that it was
           | a joke when people said "It's always the last place you
           | look." Like well into my teens. But ever since I figured it
           | out, I always look at least one more place after finding
           | something.
        
             | quercusa wrote:
             | I think that there are many people who don't recognize it
             | as a joke and pass it on as great wisdom. I was also pretty
             | old when I realized the tautology.
        
               | layer8 wrote:
               | A tautology can still provide an insight by recognizing
               | that two things are really just the same. In that sense I
               | didn't perceive it as a joke, even though there's of
               | course some humor attached to it.
        
               | wrycoder wrote:
               | If you're disorganized, you'll search in random places
               | until you find it. Joke applies.
               | 
               | But if you are organized, you'll start with the most
               | likely place and progress to increasingly less likely
               | places. When you find it, there's no surprise, and no one
               | gets much of a chuckle over your efforts.
               | 
               | If you don't understand the problem space, then saying
               | "That's the last place I would have looked!" is an
               | expression of exasperation about your lack of knowledge.
        
             | selimthegrim wrote:
             | Last is semantically ambiguous here, you're not wrong,
             | could also mean "last place (most unobvious) you would
             | think to look" as well
        
               | narag wrote:
                | Because if it were obvious, you would have already
                | found it.
        
             | ithinkso wrote:
             | Well, at least you were a teen when you realized it. In my
             | case it was 1min ago in this very HN thread...
             | 
             | I always assumed it was kind of a 'life's a bitch' aphorism
        
             | TedDoesntTalk wrote:
             | Do you find it again?
        
         | [deleted]
        
         | wkjagt wrote:
         | I've understood monads multiple times in my life, but each time
         | that understanding was so fragile that it crumbled when I tried
         | explaining to someone else.
         | 
         | I'm currently in a phase where I don't understand them.
        
           | posix86 wrote:
            | This is 100% how I feel too haha. And every time I lose
            | the feeling of understanding them, I question whether I
            | understood them before.
        
           | orwin wrote:
            | My crumbling understanding is tied to my also-crumbling
            | understanding of linear algebra, and hopefully continuing
            | to read this thread will solidify both.
        
           | mannerheim wrote:
           | I think it's one of those things you just have to understand
           | through practice.
        
           | becquerel wrote:
           | Understanding monads isn't particularly useful unless you
           | also understand functors, which are a relatively much easier
           | concept to understand long-term. They're really just functors
           | with a few extra rules that make them a bit more useful.
        
           | johnday wrote:
           | A monad is a computational context, where the nature of that
           | context is determined by two things: the shape of the data
           | structure corresponding to it, and the definition of (>>=)
           | which handles sequencing of two computations in that context.
           | 
           | Anything more specific than that should be handled case-by-
           | case until you build an intuition for how any given monad
           | will behave.
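As a rough illustration of this "data structure plus (>>=)" view, in Python rather than Haskell (the helper names are hypothetical, and the failure context stands in for Haskell's Maybe):

```python
from typing import Callable, Optional, TypeVar

A = TypeVar("A")
B = TypeVar("B")

def bind(x: Optional[A], f: Callable[[A], Optional[B]]) -> Optional[B]:
    """(>>=) for a Maybe-style context: sequence two computations,
    short-circuiting as soon as one fails (returns None)."""
    return None if x is None else f(x)

def parse_int(s: str) -> Optional[int]:
    """A computation that can fail: not every string is an int."""
    try:
        return int(s)
    except ValueError:
        return None

def reciprocal(n: int) -> Optional[float]:
    """Another failing computation: 0 has no reciprocal."""
    return None if n == 0 else 1.0 / n

assert bind(bind("4", parse_int), reciprocal) == 0.25
assert bind(bind("0", parse_int), reciprocal) is None     # fails at step 2
assert bind(bind("oops", parse_int), reciprocal) is None  # fails at step 1
```

Here the "shape" is Optional and the sequencing rule is "stop on None"; a List or IO context would keep the same bind signature but change both.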
        
             | TedDoesntTalk wrote:
             | Is computational context another way of saying scope?
        
               | trenchgun wrote:
               | Not exactly.
               | 
                | Computational context in this sense is like: is this
                | computation of the type that can either fail or
                | succeed? (Maybe monad). Or is this computation of the
                | type that can produce multiple values of the same
                | type? (List monad). Or maybe this computation can
                | produce a value of one type or another (Either monad).
                | Or a computation that can interact with inputs and
                | outputs (IO monad).
                | 
                | So these are computations in different kinds of
                | computational "worlds" in a sense, with different
                | rules on how such a computation in this context
                | composes.
        
               | hackinthebochs wrote:
               | Let me rephrase it in words that I have more intuitive
               | grasp of and see if the translation holds. A monad is a
               | way to pick out the specific constraints of interest and
               | define a class of computations that satisfy those
               | constraints, and how computations within this class
               | compose.
               | 
               | Are the constraints limited to return values or can there
               | be other kinds of constraints?
        
               | [deleted]
        
               | [deleted]
        
         | belter wrote:
          | I always thought the joke was that anybody who finally
          | understands what a Monad is, at that precise moment,
          | automatically loses the ability to explain it to others...
        
         | hprotagonist wrote:
         | https://byorgey.wordpress.com/2009/01/12/abstraction-intuiti...
         | 
         | well, it's so simple: they're just burritos!
        
       | two_handfuls wrote:
       | It's useful to test explanations on people unfamiliar with it. So
       | in this spirit, I want to share: the sentence didn't explain it
       | to me.
       | 
        | After watching 3Blue1Brown's video, I get it. So I can share
        | with you what confused me:
       | 
       | It's this part of the sentence: "average a bunch of points along
        | that path". I know what averaging is, but I don't know what
       | averaging "along a path" means. Also I was (wrongly) expecting
       | the output of this function to be a real number - I just expect
       | "energy" to be a number. Instead, here what we're doing is
       | averaging complex numbers so the output is a complex number. What
       | a complex energy means is left as an exercise to the reader.
       | 
       | Here's my suggestion for a sentence that would have worked better
       | for me:
       | 
       | "To find the energy at a particular frequency, spin your signal
       | around a circle at that frequency, and find the center of mass of
       | the plot."
       | 
       | Presumably then we have to take the distance from the origin to
       | get an amount of energy.
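The proposed sentence translates almost directly into code. A small NumPy sketch (the 50 Hz tone and the sample rate are arbitrary illustrative choices):

```python
import numpy as np

fs = 1000                              # sample rate (arbitrary choice)
t = np.arange(fs) / fs                 # one second, 1000 samples
signal = np.cos(2 * np.pi * 50 * t)    # a pure 50 Hz tone

def center_of_mass(signal, t, freq):
    """Spin the signal around a circle at `freq` revolutions/second
    and take the average (centroid) of the resulting complex points."""
    return (signal * np.exp(-2j * np.pi * freq * t)).mean()

# Centroid sits far from the origin only at the tone's frequency;
# its distance from the origin is the amplitude you read off.
assert abs(center_of_mass(signal, t, 50)) > 0.49
assert abs(center_of_mass(signal, t, 20)) < 1e-9
```

The complex result encodes both pieces of information: its magnitude is the amplitude and its angle is the phase, which is why the output isn't just a real number.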
        
         | julianeon wrote:
         | Thank you, this is the epitome of a helpful comment and it
         | helped me.
        
         | xedrac wrote:
         | 3Blue1Brown's video is the best explanation I've ever seen.
        
       | based2 wrote:
       | src: https://betterexplained.com/articles/colorized-math-
       | equation...
        
       | Moodles wrote:
       | The best explanation I've ever seen of the Fourier transform is
       | from 3Blue1Brown: https://m.youtube.com/watch?v=spUNpyF58BY&vl=en
        
         | rectang wrote:
         | The key insight for me on this topic also came from
         | 3Blue1Brown, but in a different video: it's that e^x is NOT
         | best thought of as repeated multiplication, but instead as the
          | function exp(x) = 1 + x + x^2/2 + x^3/6 + x^4/24 + ...
         | 
         | After being relieved of the burden of that misconception, I was
         | finally able to understand the role of complex numbers in the
         | Fourier Transform.
         | 
         | https://www.youtube.com/watch?v=ZxYOEwM6Wbk&t=439s
        
           | fortran77 wrote:
           | Or even more simply, if you know that e^x on the complex
           | plane rotates you around the origin.
        
             | rectang wrote:
             | People told me that over and over but it didn't help --
             | because it didn't make sense why repeated multiplication
             | would cause rotation!
             | 
             | Later in that video, we see a visualization of the
             | rotation. I was able to grasp how the exp function could
             | yield rotation where I'd never been able to understand why
             | e*e*e*e... did.
             | 
             | https://www.youtube.com/watch?v=ZxYOEwM6Wbk&t=2178s
        
               | nuancebydefault wrote:
                | I think you need to study Euler's formula to
                | understand the relationship between trigonometric
                | functions and exponential functions when calculating
                | with complex numbers. One easy-to-remember formula
                | that connects those functions.
        
               | rectang wrote:
               | Indeed. The title of that 3Blue1Brown video is, "What is
               | Euler's Formula actually saying?"
               | 
               | That "easy to remember" formula had been presented to me
               | countless times. It only made sense once I stopped
               | thinking about it in terms of repeated multiplication.
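The series view really does produce rotation, and it can be checked numerically: summing the power series at an imaginary argument lands on the unit circle. A small Python sketch (the angle and term count are arbitrary safe choices):

```python
import cmath

def exp_series(z, terms=60):
    """exp(z) as its power series 1 + z + z^2/2! + z^3/3! + ...,
    rather than as repeated multiplication."""
    total, term = 0 + 0j, 1 + 0j
    for k in range(terms):
        total += term
        term *= z / (k + 1)   # next term: multiply by z/(k+1)
    return total

theta = 2.0                            # an arbitrary angle in radians
z = exp_series(1j * theta)
assert abs(z - cmath.exp(1j * theta)) < 1e-12
assert abs(abs(z) - 1.0) < 1e-12       # e^(i*theta) stays on the unit circle
assert abs(z.real - cmath.cos(theta)) < 1e-12  # Euler: cos(theta) + i*sin(theta)
```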
        
               | l- wrote:
               | The repeated "many folding", which would be better
               | visualized as tendition, exponentiation ("tendaddition")
               | pattern also breaks with fractions & at the most basic
               | negative numbers:
               | https://www.youtube.com/watch?v=mvmuCPvRoWQ&t=922s Also:
               | https://news.ycombinator.com/item?id=28524792 "log base"
               | would be better named untendaddition similarly division -
               | does not necessary "separate" - untendition & unaddition
               | - does not "draw under". Etymologos to use "given"
               | symbols over the datum names.
        
               | l- wrote:
               | Further, the pattern matched "grouping" definitions
               | (https://news.ycombinator.com/item?id=25278021) names
               | have a better correspondence to the Greek scheme:
               | diation, triation, tetration... while the "Group theory"
               | definitions scheme lending to various geometries
               | (hypercomplex: complex::hyperbolic, split-
               | complex::elliptic, dual::parabolic) would match some
               | other naming scheme.
        
       | ourmandave wrote:
       | Veritasium has an explainer of the Fast Fourier Transform on
       | youtube.
       | 
        | It covers how it's used and its interesting history.
       | 
       | https://www.youtube.com/watch?v=nmgFG7PUHfo
        
       | tel wrote:
       | Honestly, it's not so bad. It's easy to pick any such attempt
       | apart. This is close to my favorite pithy way of explaining it,
       | too, which is to break it down component-wise using the idea of
       | filter banks. It's not a single sentence, but here's what I tend
       | to say:
       | 
       | Any signal--like sounds or electrical signals, or even images--
       | can be thought of as having a certain amount of 'energy' at any
        | choice of frequency. This makes the most sense in music, where
        | we might think of a 3-note chord as having 3 distinct packets
        | of energy at 3 different frequencies.
       | 
       | For any given frequency, we can compute the amount of energy a
       | signal contains at that frequency by comparing the signal with a
       | test signal, a "pure tone" at that frequency. Pure tones are
       | signals that have the unique property of putting _all_ of the
       | energy at exactly one frequency. The Fourier Transform is an
       | equation which packages this idea up, showing us how to represent
       | all of these measurements of energy at all frequencies.
       | 
       | The natural idea of a "pure tone" might be a sine wave. This is
       | what we think of when we think of a musical pure tone and it
       | certainly exists only at a single frequency. But sine waves make
       | for bad comparison signals due to the problem of "phase": two
       | sine waves played together can perfectly support one another and
       | become twice as loud, or they can perfectly interrupt one another
       | and become silence. This happens because sine waves oscillate
       | between positive values and negative values and positive things
       | can cancel out negative things.
       | 
       | When you look at the equation for the Fourier transform you'll
       | see an exponent of a complex number. This is an improved version
       | of a 'pure tone' which avoids the phasing issues of a sine wave.
       | It does this by spinning like a clock hand in two dimensions,
       | remaining always at the same length. This extra dimension lets us
       | preserve enough information so that things never cancel out like
       | with sine waves.
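The phase problem with sine probes, and how the spinning complex probe avoids it, can be shown in a few lines of NumPy (the frequency, sample count, and phase shift are arbitrary illustrative values):

```python
import numpy as np

N = 1000
t = np.arange(N) / N
freq = 30                      # arbitrary test frequency (30 cycles)

def sine_score(x):
    """Compare against a plain sine probe: phase-sensitive."""
    return (x * np.sin(2 * np.pi * freq * t)).mean()

def tone_score(x):
    """Compare against the spinning complex probe: its magnitude is
    phase-invariant, so nothing cancels out."""
    return abs((x * np.exp(-2j * np.pi * freq * t)).mean())

in_phase = np.sin(2 * np.pi * freq * t)
shifted = np.sin(2 * np.pi * freq * t + np.pi / 2)  # same tone, shifted

assert abs(sine_score(in_phase) - 0.5) < 1e-9  # full response
assert abs(sine_score(shifted)) < 1e-9         # same tone reads as zero!
assert abs(tone_score(in_phase) - 0.5) < 1e-9  # complex probe: same answer
assert abs(tone_score(shifted) - 0.5) < 1e-9   # regardless of phase
```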
        
         | teolandon wrote:
          | This is good, but I think it's circular (heh). We want to
         | compare our signal with a "pure tone". What's a pure tone? A
         | sine wave. Why is a sine wave a pure tone? Because when we
         | compare it with a pure tone, it's identical.
        
           | lathyrus_long wrote:
           | That's a question of why Fourier transforms are important
           | though, not just how they're defined and computed. The next-
           | level answer is presumably that sinusoids (or complex
           | exponentials in general) are the eigenfunctions of general
           | linear time-invariant systems, i.e. that if the input to an
           | LTI system is exp(j*w*t), then its output will be
           | A*exp(j*w*t) for some complex constant A. Some other comments
           | here already alluded to that, noting that sinusoids are good
           | for solving linear differential equations (which are LTI
           | systems), or that the sum of two sinusoids of the same
           | frequency shifted in time (which is an LTI operation, since
           | addition and time shift are both LTI) is another sinusoid of
           | that same frequency.
           | 
           | LTI systems closely model many practical systems, including
           | the tuning forks and flutes that give our intuition of what a
           | "pure tone" means. I guess there's a level after that noting
           | that conservation laws lead to LTI systems. I guess there's
           | further levels too, but I'm not a physicist.
           | 
           | That eigenfunction property means we can describe the
           | response of any LTI system to a sinusoid (again, complex
           | exponential in general) at a given frequency by a single
           | complex scalar, whose magnitude represents a gain and whose
           | phase represents a phase shift. No other set of basis
           | functions has this property, thus the special importance of a
           | Fourier transform.
           | 
           | We could write a perfectly meaningful transform using any set
           | of basis functions, not just sinusoids (and e.g. the graphics
           | people often do, and call them wavelets). But if we place one
           | of those non-sinusoidal basis functions at the input of an
           | LTI system, then the output will in general be the sum of
           | infinitely many different basis functions, not describable by
           | any finite number of scalars. This makes those non-sinusoidal
           | basis functions much less useful in modeling LTI systems.
        
             | tel wrote:
             | Hahah, I do try not to dive into eigenfunctions when
             | talking about this, but it is _so_ compelling. Love seeing
             | that exposition :)
        
           | tel wrote:
            | I appreciate the feedback. In person I talk about pure
            | tones more, mostly because I love how pretty complex
            | spirals are.
           | 
           | Here, I just tried to define them "Pure tones are signals
           | that have the unique property of putting all of the energy at
           | exactly one frequency" and then use sine as an example.
           | 
           | Truly, complex rotations are a much better definition, but
           | it's somewhat reasonable to expect people to be familiar with
           | sine waves.
        
         | sam_lowry_ wrote:
         | I like your explanation in that it gives reason to the
         | rotation.
        
         | stevebmark wrote:
         | As someone who still doesn't understand the Fourier Transform,
         | this explanation doesn't help, nor does the article's. As a
         | complete noob, your explanation shows off what you know but
         | doesn't help newcomers learn it.
        
         | glitchc wrote:
         | That's better than the blog post actually (ECE background
         | here).
        
         | [deleted]
        
       | [deleted]
        
       | pid-1 wrote:
        | I'm fond of seeing the Fourier transform as a fancy way of
        | changing basis / coordinates; not sure how mathematically
        | correct that is.
       | 
        | In high school physics some problems got way easier by
        | redefining coordinates as x' and y', solving for those, then
        | going back to x and y. That's what the Fourier transform does,
        | but for functions.
       | 
       | Looking at its formula, we can see it looks like we are
       | projecting a function into a series of complex exponentials, just
       | like we need to project a vector in x' and y' for a change of
       | basis in the euclidian space.
       | 
       | Then, the Fourier coefficients can be thought as "how much my
       | function looks like a complex exponential of this given
       | frequency". Compare the formulas for the Pearson Correlation and
       | Convolution, they are almost the same.
       | 
       | Since the Fourier transform is just a particular case of changing
       | basis, there are many, many other integral transforms that might
       | be useful in different contexts. We can even create our own. But
       | unis teach about the Fourier transform because it is really good
       | for solving differential equations, which turns out to be a
       | popular way of modeling problems in engineering.
        
         | dimatura wrote:
         | Yes, that totally makes sense - you can think of a function as
         | an "infinite-dimensional" vector, which you can express in the
         | (orthogonal) basis of a bunch of sines and cosines of different
         | frequencies.
        
         | necroforest wrote:
          | > I'm fond of seeing the Fourier transform as a fancy way
          | of changing basis / coordinates, not sure how mathematically
          | correct that is.
         | 
         | It's exactly a change of basis
        
         | walnutclosefarm wrote:
         | It's a mathematically correct way of looking at it as long as
         | you're clear that that the bases are functions, and you're
         | considering how a particular mathematical artifact (the signal
         | function) is represented as a sum of basis functions. In the
         | naive signal trace, the basis functions are one-hot functions
         | (zero everywhere except at a single point, where they have
         | value 1), in the Fourier basis, the basis functions are sine
         | waves of amplitude 1. So the signal is the sum of the basis
         | functions multiplied by a unique amplitude for each basis
         | function.
         | 
         | Sine waves aren't the only basis with which you can make this
         | transformation. Your basis can be an arbitrary set of periodic
         | functions as long as it meets certain requirements.
         | Decomposition into wavelet functions is commonly used in
         | seismic signal analysis, for example.
        
         | lathyrus_long wrote:
         | As others note, it's exactly a change in basis. In particular,
         | it's an orthonormal basis, meaning that the dot product
         | (correlation) between a basis function and itself is one, and
         | between any two different basis functions is zero. This gives
         | some intuition for why the forward and inverse transforms look
         | so similar, in the same way that the inverse of an orthonormal
         | matrix (like a rotation matrix) is just its transpose.
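This orthonormality, and the "inverse = conjugate transpose" consequence, can be checked directly with the unitary DFT matrix. A small NumPy sketch (the 1/sqrt(N) normalization is what makes the basis orthonormal; unnormalized conventions differ by constant factors):

```python
import numpy as np

N = 16
n = np.arange(N)
# Rows of F are the normalized complex-exponential basis vectors.
F = np.exp(-2j * np.pi * np.outer(n, n) / N) / np.sqrt(N)

# Orthonormal: each basis vector has unit norm and distinct vectors
# correlate to zero, so F @ F^H is the identity...
assert np.allclose(F @ F.conj().T, np.eye(N))

# ...and the inverse transform is just the conjugate transpose,
# the complex analogue of "inverse of a rotation = its transpose".
x = np.random.default_rng(2).standard_normal(N)
assert np.allclose(F.conj().T @ (F @ x), x)
```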
        
       | munchler wrote:
       | I'm a layman, but I always think of the Fourier Transform as "an
       | algorithm that converts amplitude over time into frequency
       | intensities". I guess that's more of a What than a How, but it
       | still seems good enough for a single sentence.
        
         | thrdbndndn wrote:
         | It's close enough.
         | 
         | But then, how is it different from Laplace transform?
         | 
         | (I have to admit, I actually learned about both during my uni
         | time. But now I totally forgot them all.)
        
           | nuancebydefault wrote:
           | Laplace is a generalisation of Fourier. Fourier can in
           | principle only represent periodic signals. To circumvent this
            | limitation, they invented the windowing of consecutive parts
           | of signals and do the transform on each window. Laplace adds
           | the transient part, that's what makes it more complicated.
        
           | bee_rider wrote:
           | Laplace transform is for when you are going to do something
           | professory or otherwise tricky, Fourier transform is for when
           | you are going to do something engineery. Fast Fourier
           | transform is for when you are going to do something so
           | engineery it becomes tricky again.
        
           | linhns wrote:
            | I'm currently in uni; I learnt about both but have already
            | forgotten what the Laplace transform is about, while I
            | still remember Fourier vividly as I got to play with it a
            | lot.
            | 
            | If you read more into its history, the DFT actually helped
            | a lot during the nuclear arms race, as it can detect any
            | nuclear test that happened anywhere except underground.
        
           | Jensson wrote:
            | The Laplace transform is for exponentials; Fourier is for
            | the sine function (i.e. a complex exponential). So they do
            | the same thing, just with different weight functions.
            | 
            | All of this is just linear algebra, with different linear
            | transforms. A Fourier transform is a linear transform from
            | the point-space basis to the harmonic-oscillation basis.
        
           | [deleted]
        
         | [deleted]
        
       | Bost wrote:
       | That color coding is nice. Not just as a reference to the parts
       | of the math formula, but also within the sentence itself.
        
         | stephendause wrote:
         | Agreed. It's a nice, succinct way of tying the natural language
         | to the components of the formula. I don't know how well it
         | would work in longer form content, but as a one-off or perhaps
         | occasional technique, I think it works really well.
        
       | qez wrote:
       | As someone without a lot of math background, the most confusing
       | part of this is k. Is k a number, like 7.45 or whatever? Or
       | something else? Everything else is just normal mathy stuff.
        
         | torginus wrote:
          | k is the independent variable of the transformed function,
          | playing the role of the 'x'-axis: instead of saying y =
          | f(x), the formula says y = f(k). So yes, you plug in a
          | number for it.
          | 
          | You can see in the formula that k is plugged in as the
          | frequency that the original signal is 'spun around' with.
        
       ___________________________________________________________________
       (page generated 2023-01-15 23:00 UTC)