[HN Gopher] Depends upon what the meaning of the word "is" is
       ___________________________________________________________________
        
       Depends upon what the meaning of the word "is" is
        
       Author : feross
       Score  : 31 points
       Date   : 2020-08-07 05:43 UTC (17 hours ago)
        
 (HTM) web link (meaningness.com)
 (TXT) w3m dump (meaningness.com)
        
       | thelazydogsback wrote:
       | I just saw this sign yesterday and was thinking how many ways it
       | could be parsed:
       | 
       | "Big golf factory sale opening"
       | 
        | Seems like at least 8 (2 x 2 x 2):
       | 
       | "Big (golf factory)" vs "(Big golf) factory"
       | 
       | "(factory sale) opening" vs. "factory (sale opening)"
       | 
       | "... opening:V(PresentProgressive)" vs "... opening:N"
       | 
       | Only two of which are semantically likely, and one of which is
       | pragmatically likely - unless you're really rich and in the
       | market for golf factories.
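        | 
        | A toy way to check that count (a sketch assuming NLTK is
        | installed; the grammar below is invented for illustration
        | and covers only the noun readings of "opening"):
        | 
        |     import nltk
        | 
        |     # Deliberately ambiguous grammar: the adjective and
        |     # noun-noun compounding can attach at any level.
        |     grammar = nltk.CFG.fromstring("""
        |         NP -> Adj NP | NP NP | N
        |         Adj -> 'big'
        |         N -> 'golf' | 'factory' | 'sale' | 'opening'
        |     """)
        | 
        |     parser = nltk.ChartParser(grammar)
        |     sign = 'big golf factory sale opening'.split()
        |     for tree in parser.parse(sign):
        |         print(tree)  # one bracketing per parse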
        
       | clairity wrote:
       | bill clinton, is that you?!
       | 
        | but seriously, in contrast to natural language, the article
        | lays out the basics of mathematically based logic, not for
        | its own sake, but as groundwork for the more interesting
        | later sections on context-dependence and reasonableness.
       | 
       | the reason natural language has so much ambiguity is not because
       | our brains couldn't have come up with a rigorously logical
       | language, but because the world is ambiguous and language
       | reflects that.
        
         | foldr wrote:
         | >but because the world is ambiguous and language reflects that.
         | 
         | This doesn't seem like a very satisfying explanation. Take one
         | particular example of a structurally ambiguous sentence of
         | English:
         | 
         | "The company couldn't make the car fast enough".
         | 
         | The two meanings are completely distinct (speed of production
         | vs. speed of the car). There's no fuzziness about this
         | distinction out there in the world. The speed at which a car
         | travels and the speed at which it's made are two completely
         | distinct properties.
        
           | eindiran wrote:
            | That is an interesting example because it looks like
            | semantic ambiguity rather than syntactic ambiguity. But
            | actually it is about structure, as you noted -- something
            | like this:
            | 
            | [S [DP [D The] [N company]] [VP [AuxP [Aux couldn't] [V
            | make]] [DP [D the] [NP [N car] [AdjP [Adv fast] [A
            | enough]]]]]]
            | 
            | vs
            | 
            | [S [DP [D The] [N company]] [V' [VP [AuxP [Aux couldn't]
            | [V make]] [DP [D the] [N car]]] [AdjP [Adv fast] [A
            | enough]]]]
           | 
           | Regarding the meat of your comment, it is quite difficult to
           | banish all ambiguity from natural language for a variety of
           | reasons, but we don't really need to: humans are incredibly
           | good at handling linguistic ambiguity. There has been a lot
           | of fascinating research on the topic: in particular, I
           | recommend reading up about anaphora resolution[0] and garden
           | path sentence repair[1], because the literature includes some
           | info on what is happening in the brain, which is
           | significantly more detailed than what exists for most other
           | types of linguistic ambiguity.
           | 
            | All of this ambiguity in natural language continues to
            | be a huge hurdle for NLP: it turns out that fetching the
            | right information from the context to resolve all the
            | ambiguities that arise in a single conversation is
            | completely non-trivial, despite how easy humans make it
            | look!
           | 
            | An interesting case study in the opposite direction
            | (i.e. attempting to remove ambiguity from natural
            | language) is Ithkuil[2]: a conlang that set out to
            | banish semantic and lexical ambiguity entirely and
            | ended up ridiculously hard to learn or use.
           | 
           | [0] https://en.wikipedia.org/wiki/Anaphora_(linguistics)
           | 
           | [1] https://en.wikipedia.org/wiki/Garden-path_sentence
           | 
           | [2] https://en.wikipedia.org/wiki/Ithkuil
           | 
            | If anyone is curious, you can paste those trees into
            | http://mshang.ca/syntree/ and it will draw them for you.
            | But my tree-drawing skills are very rusty, so they are
            | pretty basic.
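            | 
            | As a local alternative, NLTK can render the same bracket
            | notation as ASCII art (a sketch, assuming NLTK is
            | installed; note the brackets='[]' argument):
            | 
            |     from nltk import Tree
            | 
            |     reading = ("[S [DP [D The] [N company]]"
            |                " [VP [AuxP [Aux couldn't] [V make]]"
            |                " [DP [D the] [NP [N car]"
            |                " [AdjP [Adv fast] [A enough]]]]]]")
            | 
            |     # Parse the labelled bracketing, then draw it.
            |     Tree.fromstring(reading, brackets='[]').pretty_print()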
        
           | joosters wrote:
            | And yet, in the real world, people could use that
            | sentence as part of a discussion without causing any
            | confusion or ambiguity, because sentences don't exist on
            | their own: they have a wider context which can focus
            | their meaning.
           | 
           | Human language is succinct. We don't generally say twenty
           | words when ten would do. If the context made your example
           | sentence clear, why would a speaker need to add any words to
           | clarify it further?
        
         | RedEdward71 wrote:
         | I immediately thought of slick willy as well.
        
         | adrianmonk wrote:
         | Also time constraints. It's possible to be much more precise
         | even with informal language, but it would take forever.
         | 
         | So, for efficiency, words don't really deliver an idea to the
         | listener. Instead, it's assumed that the listener is working
         | toward the idea through their own reasoning, and words fill in
         | only the necessary gaps to help them get there or to help them
         | get there more quickly.
         | 
         | You more or less reverse engineer what their thought process
         | must be, then you do a gap analysis between what they're
         | probably thinking and what you want them to be thinking, and
         | you give them the pieces of info necessary for them to make the
         | leap.
        
       | avindroth wrote:
       | One of the most influential philosophical concepts for me from
       | the last decade was from this very blog called "Nebulosity". It
       | just speaks to the nature of reality that is misinterpreted with
       | overlays of meaning.
       | 
       | -
       | 
       | 'Nebulosity' refers to the insubstantial, amorphous, non-
       | separable, transient, ambiguous nature of meaningness.
       | 
       | From a distance, clouds can look solid; close-up they are mere
       | fog, which can even be so thin it becomes invisible when you
       | enter it.
       | 
       | Clouds often have vague boundaries and no particular shape.
       | 
       | It can be impossible to say where one cloud ends and another
       | begins; whether two bits of cloud are connected or not; or to
       | count the number of clouds in a section of the sky.
       | 
       | If you watch a cloud for a few minutes, it may change shape and
       | size, or evaporate into nothing. But it is impossible to find an
       | exact moment at which it ceases to exist.
       | 
       | It can be impossible to say even whether there is a cloud in a
       | particular place, or not.
       | 
       | [from] https://meaningness.com/nebulosity
        
       | Rumperuu wrote:
       | Related: https://en.wikipedia.org/wiki/E-Prime
        
       | nojs wrote:
       | It's interesting to think how fundamentally impossible it is to
       | parse a sentence without a background corpus of knowledge about
       | the world:
       | 
       | > I dropped the hammer on the table and it smashed
       | 
       | > I dropped the vase on the table and it smashed
       | 
        | Exact same grammatical structure, but switching the noun
        | changes how you parse the sentence. This is why machine
        | learning with huge datasets wins in NLP, translation, etc.
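        | 
        | A sketch of the problem (assuming spaCy and its small
        | English model are installed): both sentences should come
        | out with essentially the same dependency arcs, so resolving
        | "it" takes world knowledge, not grammar.
        | 
        |     import spacy
        | 
        |     nlp = spacy.load("en_core_web_sm")
        |     for noun in ("hammer", "vase"):
        |         doc = nlp(f"I dropped the {noun} on the table"
        |                   " and it smashed")
        |         # The arcs match either way; only the noun differs.
        |         print([(t.text, t.dep_, t.head.text) for t in doc])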
        
         | schoen wrote:
         | As you may know, the AI test based on these ambiguities is
         | called the Winograd Schema Challenge:
         | 
         | https://en.wikipedia.org/wiki/Winograd_Schema_Challenge
        
           | eindiran wrote:
            | There is also the GLUE benchmark, which includes
            | ambiguity handling alongside other NLU tasks:
            | https://gluebenchmark.com/
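            | 
            | For anyone who wants to poke at it, the Winograd-derived
            | slice of GLUE is one call away (a sketch, assuming the
            | Hugging Face datasets library):
            | 
            |     from datasets import load_dataset
            | 
            |     # WNLI recasts Winograd schemas as entailment pairs.
            |     wnli = load_dataset("glue", "wnli")
            |     print(wnli["train"][0])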
        
         | dariusj18 wrote:
          | Also, a lot of humor rests on the intended meaning being
          | the opposite of what you expect.
        
       ___________________________________________________________________
       (page generated 2020-08-07 23:00 UTC)