[HN Gopher] Explaining RNNs without neural networks
       ___________________________________________________________________
        
       Explaining RNNs without neural networks
        
       Author : parrt
       Score  : 45 points
       Date   : 2020-07-10 19:00 UTC (4 hours ago)
        
 (HTM) web link (explained.ai)
 (TXT) w3m dump (explained.ai)
        
       | parrt wrote:
       | Vanilla recurrent neural networks (RNNs) form the basis of more
       | sophisticated models, such as LSTMs and GRUs. There are lots of
       | great articles, books, and videos that describe the
       | functionality, mathematics, and behavior of RNNs, so don't
       | worry: this isn't yet another rehash. (See below for a list of
       | resources.) My goal is to present an explanation that avoids the
       | neural network metaphor, stripping it down to its essence: a
       | series of vector transformations that produce fixed-size
       | embeddings for variable-length input sequences.
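       | 
       | A minimal numpy sketch of that view (the tanh update and the
       | parameter names W, U, b below are illustrative choices, not
       | taken from the article):
       | 
       |   import numpy as np
       | 
       |   def rnn_embed(x_seq, W, U, b):
       |       # Fold each input vector into a running hidden state h;
       |       # the final h is the embedding of the whole sequence.
       |       h = np.zeros(W.shape[0])
       |       for x in x_seq:
       |           h = np.tanh(W @ h + U @ x + b)
       |       return h
       | 
       |   # Toy usage: 3-step sequence of 4-d inputs -> 5-d embedding.
       |   rng = np.random.default_rng(0)
       |   W = rng.normal(scale=0.1, size=(5, 5))
       |   U = rng.normal(scale=0.1, size=(5, 4))
       |   b = np.zeros(5)
       |   xs = [rng.normal(size=4) for _ in range(3)]
       |   print(rnn_embed(xs, W, U, b))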
        
       ___________________________________________________________________
       (page generated 2020-07-10 23:00 UTC)