[HN Gopher] Professor's perceptron paved the way for AI 60 years...
       ___________________________________________________________________
        
       Professor's perceptron paved the way for AI 60 years too soon
       (2019)
        
       Author : ilamont
       Score  : 76 points
       Date   : 2022-02-06 19:36 UTC (3 hours ago)
        
 (HTM) web link (news.cornell.edu)
 (TXT) w3m dump (news.cornell.edu)
        
       | tomcat27 wrote:
       | Everyone claims the iron throne
        
       | canjobear wrote:
       | The story of Perceptrons---both the idea and the book of that
       | title---is instructive about how science proceeds in practice.
       | The folklore is that this book killed neural net research, but if
       | you read it you'll find it's not as damning as you might expect.
        | Apparently it circulated widely as a manuscript before
        | publication, and that earlier draft was much more negative in
        | tone; the draft, more than the published book, is what shaped
        | people's perceptions.
        
         | unfocussed_mike wrote:
          | Yea -- I was taught about this in my first-year Cybernetics
          | course in the 1990s; the idea that there was an AI crisis is
          | overcooked, but the book definitely tipped the entire
          | industry towards expert systems.
        
         | kd5bjo wrote:
         | Multilayer neural networks weren't really a viable tool until
         | the backpropagation algorithm for determining internal
         | parameters was developed in 1985 (cf
         | https://apps.dtic.mil/sti/pdfs/ADA164453.pdf )
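          | 
          | For a concrete sense of what that algorithm does, here
          | is a minimal numpy sketch of backprop on XOR (the
          | network size, learning rate, and step count are
          | illustrative, not taken from the report):
          | 
          |     import numpy as np
          | 
          |     rng = np.random.default_rng(0)
          | 
          |     # XOR: not linearly separable, so a single-layer
          |     # perceptron cannot learn it.
          |     X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
          |     y = np.array([[0], [1], [1], [0]], float)
          | 
          |     # One hidden layer of 4 sigmoid units.
          |     W1 = rng.normal(scale=0.5, size=(2, 4))
          |     b1 = np.zeros((1, 4))
          |     W2 = rng.normal(scale=0.5, size=(4, 1))
          |     b2 = np.zeros((1, 1))
          | 
          |     def sigmoid(z):
          |         return 1.0 / (1.0 + np.exp(-z))
          | 
          |     lr = 1.0
          |     for step in range(10000):
          |         # Forward pass.
          |         h = sigmoid(X @ W1 + b1)
          |         out = sigmoid(h @ W2 + b2)
          |         # Backward pass: chain rule on squared error,
          |         # propagated from the output layer to the
          |         # hidden layer.
          |         d_out = (out - y) * out * (1 - out)
          |         d_h = (d_out @ W2.T) * h * (1 - h)
          |         W2 -= lr * h.T @ d_out
          |         b2 -= lr * d_out.sum(0, keepdims=True)
          |         W1 -= lr * X.T @ d_h
          |         b1 -= lr * d_h.sum(0, keepdims=True)
          | 
          |     print(np.round(out, 2))  # approaches [0, 1, 1, 0]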
        
           | enchiridion wrote:
           | What really made the difference was non-linear activations.
           | 
            | Without a non-linearity, depth doesn't buy you anything.
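            | 
            | A quick numpy check of that point (the shapes here
            | are arbitrary): stacking purely linear layers
            | collapses to a single linear map, and the
            | non-linearity is exactly what breaks that collapse.
            | 
            |     import numpy as np
            | 
            |     rng = np.random.default_rng(1)
            |     W1 = rng.normal(size=(3, 5))
            |     W2 = rng.normal(size=(5, 2))
            |     x = rng.normal(size=(10, 3))
            | 
            |     # Two linear layers == one layer with W1 @ W2.
            |     deep = (x @ W1) @ W2
            |     shallow = x @ (W1 @ W2)
            |     print(np.allclose(deep, shallow))    # True
            | 
            |     # A non-linearity in between breaks that.
            |     deep_nl = np.tanh(x @ W1) @ W2
            |     print(np.allclose(deep_nl, shallow)) # False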
        
           | _0ffh wrote:
           | I think you'll find that backpropagation was essentially
           | developed (multiple times) during the 60s in the field of
           | control theory and first implemented in the early 70s.
           | 
            | Ed: To be clear, the idea of using backpropagation to
            | adapt the weights of NNs also dates from the 70s, but it
            | was only rediscovered and applied to MLPs by at least two
            | independent groups/individuals in the 80s.
        
       | 6gvONxR4sf7o wrote:
        | How can an article like this talk about the significance of a
        | single-layer perceptron and not mention the contributions of
        | statistics, like regression models? Binary classification with
        | a single-layer perceptron paved the way, but logistic
        | regression isn't worth mentioning?
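        | 
        | They're close relatives: both learn a linear decision
        | boundary, just with different update rules. A rough numpy
        | sketch on made-up separable data (everything below is
        | illustrative):
        | 
        |     import numpy as np
        | 
        |     rng = np.random.default_rng(2)
        |     X = rng.normal(size=(100, 2))
        |     y = (X[:, 0] + X[:, 1] > 0).astype(float)
        |     Xb = np.hstack([X, np.ones((100, 1))])  # bias column
        | 
        |     # Rosenblatt's rule: step only on misclassified points.
        |     w_p = np.zeros(3)
        |     for _ in range(20):
        |         for xi, yi in zip(Xb, y):
        |             w_p += (yi - float(xi @ w_p > 0)) * xi
        | 
        |     # Logistic regression: gradient descent on the log-loss.
        |     w_l = np.zeros(3)
        |     for _ in range(500):
        |         p = 1 / (1 + np.exp(-(Xb @ w_l)))
        |         w_l -= 0.1 * Xb.T @ (p - y) / len(y)
        | 
        |     acc = lambda w: ((Xb @ w > 0) == y).mean()
        |     print(acc(w_p), acc(w_l))  # both near 1.0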
        
       | Jun8 wrote:
       | The answers here may be of interest:
       | https://ai.stackexchange.com/questions/1288/did-minsky-and-p...
       | 
        | The unfortunate thing was not the _Perceptrons_ book but the
        | fact that Rosenblatt died prematurely soon after its
        | publication. He was very well-equipped to defend and carry on
        | work on NNs, I think.
        
         | neonate wrote:
         | https://web.archive.org/web/20201001082122/https://ai.stacke...
        
       | inetsee wrote:
       | I remember taking a class at UCSD in 1971 or 1972 that touched on
       | Perceptrons (among a lot of other things, it was basically a
       | survey course). I wonder sometimes what the world would be like
       | today if they had realized the importance of hidden layers back
       | then.
        
         | rococode wrote:
          | The data and compute power available in the 70s probably
          | would've limited the usefulness of multilayer networks and
          | led people to try other things (perhaps not even "probably" -
          | that could be what really happened), though maybe the
          | approach could've been revisited with success by the late
          | 90s.
         | 
         | Makes you wonder if there are other abandoned techniques that
         | might be worth circling back to nowadays...
        
       | pfisherman wrote:
        | I think it's often understated just how much the emergence of
        | massive datasets led to the successes of neural networks.
       | 
       | I once heard Daphne Koller say that before big data, neural
       | networks were always the second best way to do anything.
        
         | 6gvONxR4sf7o wrote:
         | Agreed. There's a question of ownership in that too. It's
          | frequently brought up in regard to Copilot. If companies had
         | had to pay to license all the photos or code or writing they
         | train on, a lot of these datasets wouldn't exist, and then
         | neither would the models.
         | 
          | Which is why I wish there were a copyleft analogue for open
          | data: if you train on everyone's public data, your model
          | should have to be just as publicly available.
        
       ___________________________________________________________________
       (page generated 2022-02-06 23:00 UTC)