[HN Gopher] Why not faster computation via evolution and diffracted light?
       ___________________________________________________________________
        
       Why not faster computation via evolution and diffracted light?
        
       Author : tobr
       Score  : 50 points
       Date   : 2021-04-24 19:10 UTC (3 hours ago)
        
 (HTM) web link (interconnected.org)
 (TXT) w3m dump (interconnected.org)
        
        | deckar01 wrote:
        | I think the part they are missing is that a general-purpose
        | computer is required to train and simulate the physical model.
        | The speed at which a trained model executes isn't really a
        | problem in most domains as far as I'm aware; trained models
        | already evaluate in real time on standard hardware. It's also
        | more efficient to push software updates when your model
        | improves rather than retooling your manufacturing process and
        | consigning valuable resources to a landfill.
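        | 
        | To put that concretely, a toy sketch (the model and the
        | numbers are made up):
        | 
        |     import numpy as np
        | 
        |     rng = np.random.default_rng(0)
        |     X = rng.normal(size=(256, 4))
        |     y = (X @ np.array([1.0, -2.0, 0.5, 3.0]) > 0).astype(float)
        | 
        |     # Training is the flexible, general-purpose part:
        |     # gradients, loops, iteration.
        |     w = np.zeros(4)
        |     for _ in range(500):
        |         p = 1 / (1 + np.exp(-(X @ w)))
        |         w -= 0.1 * X.T @ (p - y) / len(y)
        | 
        |     # Inference is a fixed function of the frozen weights:
        |     # the only part you could bake into hardware, and the
        |     # part that is already fast. Updating the model means
        |     # shipping a new w, not new silicon.
        |     def predict(x):
        |         return 1 / (1 + np.exp(-(x @ w))) > 0.5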
        
       | SubiculumCode wrote:
        | Speed and efficiency are often sacrificed to gain flexibility,
        | and this tradeoff is often ignored when you see articles
        | extolling the non-human intelligence of birds, insects, etc.
        | Insects, through evolution, have developed highly efficient
        | and FAST neural circuits that lead to fit behaviors, but the
        | fitness of these circuits and behaviors often masks their
        | inflexibility to a change of context. Change the context, and
        | the lightning-fast reflexes of a fly can render it unfit.
        
       | ganafagol wrote:
       | I have no idea what the author is trying to say.
       | 
        | Given that they "just recently" learned about microcode, I
        | wouldn't hold my breath expecting a profound insight about
        | computer architecture or computer science in general, though.
        
         | tyingq wrote:
         | Seems to be asking for some sort of optical FPGA thing with the
         | hope that light will be immensely faster than electricity.
         | Maybe not knowing that it's not?
         | 
         | Also some hints at wanting this "optical FPGA" thing to be
         | analog, rather than digital. And some notion that the various
         | layers of abstraction (silicon->microcode->asm->code) must be
         | wasting lots of cycles.
         | 
         | I'm not really clear on where he thinks 40 years of instant
         | fast-forward might come from.
        
           | eecc wrote:
           | > Seems to be asking for some sort of optical FPGA thing with
           | the hope that light will be immensely faster than
           | electricity. Maybe not knowing that it's not?
           | 
            | But really it is. Except for interconnects or analog RF
            | stages, ICs aren't built as if they were solid-state
            | waveguides. Indeed, speed is determined by how powerful
            | the charge pumps are at moving electrons, not by the
            | speed of EM propagation.
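            | 
            | A back-of-the-envelope sketch (ballpark values I'm
            | assuming, not measured ones):
            | 
            |     c = 3e8                  # m/s, light speed in vacuum
            |     wire = 1e-3              # a 1 mm on-chip wire
            |     print(wire / (0.5 * c))  # ~6.7e-12 s at ~c/2
            | 
            |     # But a real interconnect behaves as an RC line:
            |     R, C_load = 1e3, 100e-15  # 1 kOhm driver, 100 fF load
            |     print(R * C_load)         # ~1e-10 s: charging time
            |                               # dominates propagation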
        
             | tyingq wrote:
             | He's hoping for "a million times faster", and "that it
             | might be possible, with today's technology".
        
         | karmakaze wrote:
         | On the one hand:
         | 
         | > Could that task be performed by simply the right set of
         | transistors, at the hardware level, no matter how insanely
         | arranged?
         | 
         | > What shortcuts could be taken?
         | 
          | Seems to be suggesting ASICs, as is done with Bitcoin
          | mining. On the other hand, it suggests optical rather than
          | electrical computation, which could be more efficient, i.e.
          | produce less heat.
         | 
         | I was expecting a completely different post about evolution of
         | brains that operate with photons, and another jump to entangled
         | ones.
        
           | e9 wrote:
            | But that's basically already solved with FPGAs, right?
            | You can create any circuit on the fly with code.
        
         | rini17 wrote:
          | They propose to implement an algorithm, such as a trained
          | neural network, as a single-purpose analog computer
          | (whether electronic or optical).
          | 
          | This is a sound idea and a subject of research, but the
          | author himself lists some disadvantages, and I can imagine
          | there are many more. Generally, nonlinear analog systems
          | are VERY fickle.
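          | 
          | A toy way to see the fickleness (random weights standing in
          | for a trained layer; the tolerance values are assumptions):
          | 
          |     import numpy as np
          | 
          |     rng = np.random.default_rng(1)
          |     W = rng.normal(size=(10, 64))     # "trained" weights
          |     x = rng.normal(size=(1000, 64))
          |     clean = (x @ W.T).argmax(axis=1)  # exact digital output
          | 
          |     # Analog parts drift and mismatch; perturb the weights
          |     # slightly and count how often the answer changes.
          |     for tol in (0.001, 0.01, 0.1):
          |         Wn = W * (1 + tol * rng.normal(size=W.shape))
          |         same = ((x @ Wn.T).argmax(axis=1) == clean).mean()
          |         print(f"{tol:.1%} error -> {same:.1%} agreement")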
        
       | [deleted]
        
       | claudiojulio wrote:
        | One day someone dreamed that he could fly like the birds.
        | Today we fly much higher and faster than they do. Scientific
        | achievement always begins with imagination.
        
       | cycomanic wrote:
        | The whole space of analogue computing (circuits) has gotten a
        | big boost in recent years (people had been doing it for
        | years, but it never amounted to much), largely because of the
        | requirements of deep learning, which is essentially large
        | matrix multiplications. I guess many of the recent ML
        | accelerators fall under this category in some way.
       | 
       | There's also been quite a bit of work on using optics (mainly
       | integrated optics) for these tasks, but it's still very open if
       | that will amount to anything.
       | 
        | All that said, I would really be hesitant to call this
        | computing in the traditional sense; it's more like an
        | accelerator card, if anything.
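        | 
        | To spell out "essentially large matrix multiplications", a
        | minimal sketch (the sizes are arbitrary):
        | 
        |     import numpy as np
        | 
        |     # One MLP layer is dominated by a single operation:
        |     x = np.random.randn(512, 4096)   # batch x features
        |     W = np.random.randn(4096, 4096)  # learned weights
        |     y = np.maximum(x @ W, 0)         # matmul + cheap ReLU
        | 
        |     # An analog crossbar or optical mesh evaluates x @ W in
        |     # physics rather than in clocked logic, which is exactly
        |     # why these accelerators target this one op and little
        |     # else.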
        
       | pjc50 wrote:
       | > Let's say you just wanted to perform just one task. Say,
       | recognise a face. Or know whether a number is prime or not. And
       | you didn't care about flexibility at all.
       | 
       | > Could that task be performed by simply the right set of
       | transistors, at the hardware level, no matter how insanely
       | arranged?
       | 
        | For quite a lot of things... sort of, yes. You can do
        | functional reduction. Then you get a chip that does exactly
        | one thing, which works great until you want to change it.
        | Since flexibility to change is incredibly important, we've
        | gone in the opposite direction and use microcontrollers for
        | things that could be done with ASICs.
       | 
       | He misinterprets Thompson though; the strange physical effects of
       | the evolved circuits aren't "hidden", they're just _outside the
       | model_. We rely on simplified models to make behavior
       | predictable, and we design circuits to be as modellable as
       | possible and _not use the properties that are hard to model_.
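        | 
        | To make that concrete with the article's primality example, a
        | toy sketch:
        | 
        |     # "Functional reduction" in miniature: primality of a
        |     # 4-bit input collapses to a fixed truth table, i.e.
        |     # pure combinational logic.
        |     PRIMES = {2, 3, 5, 7, 11, 13}
        |     TABLE = [n in PRIMES for n in range(16)]
        | 
        |     def is_prime_hardwired(n):
        |         return TABLE[n & 0xF]  # no program, no flexibility
        | 
        |     # Widening the input to 5 bits means designing a new
        |     # circuit: exactly the change-inflexibility above.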
        
       | daralthus wrote:
       | I don't get the negativity. This is indeed an emerging field, so
       | just going to throw in a few examples:
       | 
       | - Google using RL for chip layout optimization [1]
       | 
       | - Stanford profs optimizing lenses end to end with image
       | processing [2]
       | 
       | As for the "diffractive deep neural network" paper it is not
       | really a deep network as they don't implement the non-linearities
       | in the optical domain. That still is the hard bit, so maybe the
       | monocle is not realistic, but fortunately there are ways it can
       | still speed up computation.
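        | 
        | The quickest way to see why the missing nonlinearity matters
        | (a sketch): a stack of purely linear layers, which is all
        | passive diffraction gives you, collapses to a single matrix.
        | 
        |     import numpy as np
        | 
        |     rng = np.random.default_rng(0)
        |     layers = [rng.normal(size=(32, 32)) for _ in range(5)]
        |     x = rng.normal(size=32)
        | 
        |     out = x
        |     for L in layers:        # five "diffractive" layers...
        |         out = L @ out
        | 
        |     one = np.linalg.multi_dot(layers[::-1]) @ x  # ...are one
        |     print(np.allclose(out, one))  # True: depth buys nothing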
       | 
        | There is a general consensus that GPUs enabled the current
        | deep learning revolution, and perhaps, as argued by the
        | Hardware Lottery paper [3], backprop is really just the
        | current lucky ticket. So why not try to evolve hardware with
        | ML for other algorithms too?
       | 
       | As for the questions raised, I do want to know your answer to
       | this:
       | 
       | > a question for computer scientists, what single question would
       | you ask if you have a dedicated computer that was [that
       | multiplier] faster?
       | 
       | [1] https://ai.googleblog.com/2020/04/chip-design-with-deep-
       | rein...
       | 
       | [2] https://www.youtube.com/watch?v=iJdsxXOfqvw
       | 
       | [3] https://hardwarelottery.github.io
        
         | tyingq wrote:
         | >I don't get the negativity.
         | 
         | I suppose my response looks negative. What drove that was that
         | the writeup seemed to be "high confidence in something very
         | unusual[1]" coupled with "very little detail on what he thinks
         | would accomplish that[2]".
         | 
         | [1] "Million times faster" "with today's technology" "40 years
         | of performance gain"
         | 
         | [2] I gathered only analog vs digital, light instead of
         | electricity, and some sort of analog/optical FPGA.
         | 
          | I felt negative only in the sense that the confidence
          | seemed very high, but I couldn't see any detail that
          | supported it, or enough detail to search elsewhere,
          | especially given all the assertions that we could do it
          | now.
         | 
         | > a question for computer scientists, what single question
         | would you ask if you have a dedicated computer that was [that
         | multiplier] faster?
         | 
          | Assuming climate change is the largest threat looming in
          | the near future, I suppose better modeling and prediction
          | of what to do, and when.
        
       | benhoyt wrote:
        | I think this is inspirational, but it seems like puffy
        | popular science ("pseudo-science" may or may not be too
        | strong?). I'll quote what I wrote to a friend about this
        | article:
       | 
       | > It was intriguing and piqued my interest. However, it smells a
       | lot like hyped-up crackpot science to me ... if it seems too good
       | to be true, it probably is. There's a lot of vagueness here,
       | "what ifs", etc. Like "What if we could evolve hardware to make
       | use of hidden physics?" Yes, well, what if? What is "hidden
       | physics"? And why doesn't the author try it? :-)
       | 
       | > I looked briefly at his bio, and his background is "new media"
       | and "addressing abstract social and technological ideas to mixed
       | audiences" ... (computer) science, not so much.
       | 
       | > That said, I do like the idea of reducing abstraction layers,
       | getting closer to the hardware, and using seemingly-strange
       | physics for what it's worth. I'm just very skeptical of his
       | framing of it as something which could make everything 1000x
       | faster overnight.
       | 
       | > I remember reading a book years ago by "futurist" Michio Kaku
       | called Visions. At the time I was really inspired by his
       | "visions", which sounded very scientific. However, I think they
       | were probably just vague, well, visions. This writing strikes me
       | as similar.
        
         | wizzwizz4 wrote:
         | > _What is "hidden physics"?_
         | 
         | Physics we don't know about. Our cells are doing that as we
         | speak - though we'd probably call most of it "hidden
         | biochemistry". I think it unlikely that we'd stumble upon such
         | "hidden physics" unless we were evolving objects at close to
          | the atomic scale, but it's _possible_; perhaps there's some
         | way of getting phonons to interact with each other that behaves
         | like Brownian motion, and then you can get metaphonons?
         | 
          | The author probably doesn't try it because the author
          | doesn't know how. I don't try it because I suspect it's not
          | possible with the technology I have access to.
        
       ___________________________________________________________________
       (page generated 2021-04-24 23:01 UTC)