[HN Gopher] Show HN: Graph-based AI for longform writing
       ___________________________________________________________________
        
       Show HN: Graph-based AI for longform writing
        
       Hey everyone, I wanted to share a new tool we've created called
       Jotte (https://jotte.ai) which we believe can be a game-changer for
       AI-generated longform writing like novels and research papers.  As
       you may know, current AI models like ChatGPT and GPT-3 have a
       token limit of around 4000 tokens (roughly 3000 words), limiting their
       effectiveness for longer writing tasks. With Jotte, we've developed
       a graph-based approach to summarize information and effectively
       give AI "unlimited" memory.  Jotte remembers recent details like
       the meal a character ate a page ago, while avoiding getting bogged
       down by irrelevant details like the blue curtains mentioned 5
       chapters ago. We've created a proof of concept and would love to
       hear your thoughts on it.  Do you think this approach could lead to
       better longform writing by AI? Let us know in the comments!
        
       Author : Broge
       Score  : 55 points
       Date   : 2023-02-22 19:41 UTC (3 hours ago)
        
 (HTM) web link (jotte.ai)
 (TXT) w3m dump (jotte.ai)
        
       | richdougherty wrote:
       | I would love a tool like this to combine requirements, comments
       | and fragments of code into a larger program.
        
       | howon92 wrote:
       | Congrats on the launch! I'm not your target market but am curious
       | to learn how this gives AI "unlimited" memory. Whenever I try to
       | use GPT-3 API, I'm blocked by the token limit for most practical
       | applications. My two cents for the product itself is it seems
       | more like a tool for developers than novel writers. Have you done
       | any beta testing with your target users?
        
         | Broge wrote:
         | Thanks for the kind words!
         | 
         | The unlimited part comes from the AI knowing just enough
         | context to stay coherent in any situation. Current long-form
         | text techniques usually just summarize the past n tokens, and
         | maybe the previous summary as well. The problem with this is
         | that it quickly loses specifics of anything that happened just
         | outside the window.
         | 
         | What Jotte's graph-based approach does is have weighted
         | summaries, allowing the important information to stay in there
         | much longer.
         | 
         | Definitely agree that the interface is still pretty rough, but
         | we wanted to just get public reception on this sort of thing.
         | We've done some testing on hybrid dev/writers, but it needs a
         | more rigid structure before we even try to test this on pure
         | writers.
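
       The weighted-summary idea Broge describes can be sketched in a few
       lines of Python. Everything below (the node structure, the scoring
       rule, all names) is a hypothetical illustration of the general
       technique, not Jotte's actual implementation: each summary carries an
       importance weight, and the context builder ranks summaries by
       importance discounted by age, so plot-critical facts outlive a plain
       last-n-tokens window while trivia drops out.

```python
# Hypothetical sketch of weighted summaries for long-form context
# (illustrative only -- not Jotte's actual implementation).
from dataclasses import dataclass

@dataclass
class SummaryNode:
    text: str       # condensed summary of a passage
    weight: float   # importance (e.g. plot-critical = high)
    position: int   # where in the document it occurred

def build_context(nodes, current_pos, budget, decay=0.01):
    """Pick summaries by importance x recency until the token budget is spent.

    A plain sliding window keeps only the last n tokens; weighting lets
    important facts survive long after they leave the window.
    """
    def score(n):
        age = current_pos - n.position
        return n.weight / (1.0 + decay * age)

    ranked = sorted(nodes, key=score, reverse=True)
    picked, used = [], 0
    for n in ranked:
        cost = len(n.text.split())  # crude stand-in for a token count
        if used + cost <= budget:
            picked.append(n)
            used += cost
    # restore document order so the prompt reads chronologically
    picked.sort(key=lambda n: n.position)
    return [n.text for n in picked]

nodes = [
    SummaryNode("Blue curtains described.", weight=0.1, position=10),
    SummaryNode("Hero swears revenge on the baron.", weight=0.9, position=20),
    SummaryNode("Hero eats stew at the inn.", weight=0.3, position=490),
]
print(build_context(nodes, current_pos=500, budget=12))
# -> ['Hero swears revenge on the baron.', 'Hero eats stew at the inn.']
```

       With a 12-word budget, the old-but-important revenge summary and the
       recent meal both make the cut, while the old, low-weight curtains
       detail is dropped -- exactly the behavior described in the
       submission.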
        
       | nicolas_17 wrote:
       | What exactly is the use case for "longform writing by AI"?
       | Profiting from mass-produced zero-effort novels?
        
       | Mizza wrote:
       | I am glad to see more stuff with graph based AI here.
       | 
       | I have a running bet with a friend about whether the future is
       | going to be OBM (One Big Model) or LoLM (Lots of Little Models).
       | I'm strongly in the LoLM/graph camp and have been working in
       | that direction as well: https://github.com/Miserlou/Helix
        
         | LesZedCB wrote:
         | helix looks amazing! that's exactly the kind of thing i'm
         | looking to burn through openai credits with.
        
         | Broge wrote:
         | Yeah I feel like for development, OBM is great and super
         | flexible.
         | 
         | But when you actually want to deploy, a lot of tiny, more
         | efficient models would probably be the best bet.
         | 
         | I read somewhere that a company ended up fine-tuning FLAN-T5
         | instead of using GPT-3, which I can imagine saved them lots
         | of $$.
        
         | danielbln wrote:
         | Seeing how rapidly langchain is gaining popularity and
         | development momentum, I would agree. Chaining lots of specific
         | models and tools seems to be the way forward.
        
       ___________________________________________________________________
       (page generated 2023-02-22 23:00 UTC)