[HN Gopher] X-Transformers: A fully-featured transformer with experimental features
       ___________________________________________________________________
        
       X-Transformers: A fully-featured transformer with experimental
       features
        
       Author : blackcat201
       Score  : 31 points
       Date   : 2021-05-08 17:38 UTC (5 hours ago)
        
 (HTM) web link (github.com)
 (TXT) w3m dump (github.com)
        
       | krick wrote:
       | That's really cool. Now I need a bunch of pre-trained models for
       | this...
        
       | shayankh wrote:
       | absolutely fucking amazing
        
       | argvargc wrote:
       | Unfortunately for me, I genuinely thought this was going to be a
       | DIY robot build that could disguise itself as something else.
        
         | giords wrote:
         | *for us
        
       | mrfusion wrote:
       | Explain like I'm a first year CS major?
        
          | thesehands wrote:
          | Transformers suffer from a quadratic bottleneck when
          | calculating attention. Much work has gone into investigating
          | where memory can be saved by being more selective about which
          | attention scores to calculate. This repo implements
          | transformers with those noted improvements.
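          | 
          | (A minimal sketch of the bottleneck, in illustrative code
          | that is not from this repo: naive attention materializes an
          | n-by-n score matrix, so memory grows with the square of the
          | sequence length.)
          | 
          |     import torch
          |     import torch.nn.functional as F
          | 
          |     n, d = 4096, 64   # sequence length, head dimension
          |     q, k, v = (torch.randn(n, d) for _ in range(3))
          | 
          |     # The (n, n) score matrix is the quadratic bottleneck:
          |     # doubling n quadruples the memory it needs.
          |     scores = (q @ k.transpose(-2, -1)) / d ** 0.5
          |     out = F.softmax(scores, dim=-1) @ v   # shape (n, d)
          | 
          | (Efficient-attention variants avoid storing that full matrix,
          | or approximate it, which is what much of that work targets.)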
        
         | erik_seaberg wrote:
         | It's a kind of machine learning model:
         | https://medium.com/inside-machine-learning/what-is-a-transfo...
        
        | adontz wrote:
        | I expected to see a 3D model of Optimus Prime.
        
       | fao_ wrote:
       | As others have mentioned, anything obscure like this should
       | literally come with a Wikipedia (or other such) link to explain
       | what it is, what it does. This is the primary problem with small
       | project READMEs, imo. They assume you're already familiar with
       | them and know what the hell they are. Like, take Ironhide:
        | https://github.com/MrMEEE/ironhide
        | 
        |     Optimus Support for Linux Through VirtualGL - PPA version
        |     also available
        | 
        | That's... great. So it's doing something with GL, and it's
        | running on Linux, but uhhh.
        | 
        |     my branch of the original bumblebee project..
       | 
        | What is Optimus? What is Bumblebee? The trick of it is that it
        | links to a blog where neither of these terms is ever explained.
        | Maybe it's just to look impressive on someone's CV? How could I
        | even tell the difference?
       | 
        | Likewise for this project, all you need in the README is one
        | line that's like:
        | 
        |     X-Transformers is a re-implementation of Machine Learning
        |     Transformers that has been built based on experimental
        |     arXiv papers.
       | 
        | It's a one-line fix, but it'll stop people like me from being
        | confused as to whether you're implementing a new HTTP header.
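        | 
        | For reference, a minimal usage sketch (names like
        | TransformerWrapper and Decoder follow the repo's README, but
        | treat this as illustrative rather than exact):
        | 
        |     import torch
        |     from x_transformers import TransformerWrapper, Decoder
        | 
        |     # a small GPT-style, decoder-only language model
        |     model = TransformerWrapper(
        |         num_tokens = 20000,     # vocabulary size
        |         max_seq_len = 1024,
        |         attn_layers = Decoder(
        |             dim = 512,          # model width
        |             depth = 6,          # number of layers
        |             heads = 8           # attention heads per layer
        |         )
        |     )
        | 
        |     tokens = torch.randint(0, 20000, (1, 1024))
        |     logits = model(tokens)      # shape (1, 1024, 20000)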
        
         | nerdponx wrote:
         | I would agree if this were some kind of public release
         | announcement.
         | 
         | Would you say the same about an experimental programming
         | language based on a bunch of recent PLT/CS research?
         | 
          | That's pretty much what this is, but for machine learning.
          | It's not meant to be "for the public"; it's effectively
          | research.
        
           | enchiridion wrote:
           | We're talking 2-3 sentences to explain what's going on.
           | 
            | A researcher releasing work publicly on GitHub is
            | presumably doing so to spread the ideas.
        
             | skybrian wrote:
             | Sure, they have some audience in mind, but not necessarily
             | us. There are a lot of documents that are public but are
             | meant for a specialized audience. If they were meant for
             | the general public, they'd be written very differently.
             | 
             | Sharing a link on Hacker News is effectively taking it out
             | of context and sometimes it's up to us to add that context
             | back in. The author doesn't owe it to us.
        
       ___________________________________________________________________
       (page generated 2021-05-08 23:00 UTC)