[HN Gopher] GPT-2 and the Nature of Intelligence
       ___________________________________________________________________
        
       GPT-2 and the Nature of Intelligence
        
       Author : stenlix
       Score  : 21 points
       Date   : 2020-01-25 20:12 UTC (2 hours ago)
        
 (HTM) web link (thegradient.pub)
 (TXT) w3m dump (thegradient.pub)
        
       | wilg wrote:
       | I don't agree with the conclusion here. It's all about the input
       | data.
       | 
       | GPT-2 is trained on words people actually write on the internet,
       | which is an inherently incomplete dataset. It leaves out all the
       | other information an "intelligence" knows about the world. We
       | know what sources are authoritative, we know the context of words
       | from the visual appearance of the page, and we connect it all
       | with data from our past experiences, school, work, friends, our
       | interaction with the world.
       | 
       | How would GPT-2 determine most facts from the input dataset? If
       | the only thing you knew was all the text on the internet, with
       | zero other context, you'd have no way of knowing what is "true",
       | or why that concept is important, or anything else. I bet you'd
       | behave just like GPT-2.
       | 
        | It's a robot that is really good at writing, because that is all
        | it knows. I don't think it can make sense of things on a macro
        | scale, because the input data doesn't contain that information.
        | It seems to do well when the input data does contain relevant
        | information.
        
       | gog-ma-gog wrote:
       | https://nostalgebraist.tumblr.com/post/189965935059/human-ps...
       | for an orthogonal point of view---I feel Marcus is a bit too
       | embroiled in this particular debate to make level-headed
       | criticism on the merits/potential of GPT-2
        
       | zanek wrote:
        | I completely agree with Marcus' assessment of GPT-2 and its ilk.
        | They are simply regurgitating words with zero understanding of
        | their meaning.
       | 
        | It seems that OpenAI and others are peddling this as AI when it's
        | simply a glorified Eliza on steroids.
        
       | the8472 wrote:
       | > Literally billions of dollars have been invested in building
       | systems like GPT-2, and _megawatts of energy_ (perhaps more) have
       | gone into testing them
       | 
       | Huh, seems like the bot that produced the article lacks some
       | understanding about the real world. Maybe it just needs more
       | training until it learns to associate megawatts with power
       | instead of energy.
       | 
        | Meanwhile, GPT-2 completes this sentence as:
       | 
       | > Literally billions of dollars have been invested in building
       | systems like GPT-2, and megawatts of power generation to support
       | this project.
        
       ___________________________________________________________________
       (page generated 2020-01-25 23:00 UTC)