[HN Gopher] Prompt Engine - Microsoft's prompt engineering library
       ___________________________________________________________________
        
       Prompt Engine - Microsoft's prompt engineering library
        
       Author : mmaia
       Score  : 22 points
       Date   : 2023-02-15 21:46 UTC (1 hour ago)
        
 (HTM) web link (github.com)
 (TXT) w3m dump (github.com)
        
       | abc20230215 wrote:
       | I am getting old: I read the description twice and checked the
       | examples, yet I still don't understand the utility. I do
       | understand Midjourney prompt engineering, though.
        
         | dragonwriter wrote:
         | > I read the description twice and checked the examples, yet
         | I still don't understand the utility.
         | 
         | It's a tool for (among other things) building the part of a
         | ChatGPT-like interface that sits between the user and an actual
         | LLM, managing the initial prompt, conversation history, etc.
         | 
         | While the LLM itself is _quite important_, a lot of the
         | special sauce of an AI agent is going to live at the level
         | this library aims to support, not in the LLM itself. (And I
         | suspect a lot of the utility of LLMs will come from doing
         | something at this level _other_ than a typical "chat"
         | interface.)
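
         A rough sketch of the kind of glue such a layer provides:
         assembling standing instructions, prior turns, and the new
         user input into one prompt while staying under a size budget.
         This illustrates the idea only, it is not prompt-engine's
         actual API; the USER/BOT labels and the character budget are
         arbitrary choices.

            // Illustration only: a hand-rolled prompt-assembly layer,
            // not prompt-engine's API.
            interface Turn {
              input: string;    // what the user said
              response: string; // what the model answered
            }

            function assemblePrompt(
              description: string, // standing instructions for the model
              history: Turn[],     // prior conversation turns
              userInput: string,   // the new message to answer
              maxChars = 4000      // crude stand-in for a token budget
            ): string {
              const header = `${description}\n\n`;
              const tail = `USER: ${userInput}\nBOT:`;
              // Keep only the most recent turns that still fit.
              const kept: string[] = [];
              let size = header.length + tail.length;
              for (let i = history.length - 1; i >= 0; i--) {
                const { input, response } = history[i];
                const turn = `USER: ${input}\nBOT: ${response}\n`;
                if (size + turn.length > maxChars) break;
                kept.unshift(turn);
                size += turn.length;
              }
              return header + kept.join("") + tail;
            }

         Each user message gets one such prompt; after the model
         replies, the exchange is appended to the history so later
         prompts carry the conversation forward.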
        
           | abc20230215 wrote:
           | Ah, sounds super-niche.
        
         | qwertox wrote:
         | As the background explains, you can tell LLMs how they should
         | behave in an interaction session.
         | 
         | The examples first configure the LLM, either with a plain
         | sentence stating what you expect from it (example 1: "answers
         | in less than twenty words") or by passing it example
         | interactions, and then carry on a normal interaction session.
         | 
         | You could use prompt-engine to set up your own chat server,
         | with the library acting as the middleware.
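
         Roughly what that configuration looks like with the TypeScript
         package. This is written from memory of the README, so treat
         the names and signatures as approximate, and callModel below
         is a stand-in for whatever completion call you actually use.

            // Sketch of prompt-engine usage; verify against the README.
            import { PromptEngine } from "prompt-engine";

            // Standing instruction, like "example 1" mentioned above.
            const description =
              "A chatbot that answers in less than twenty words.";

            // An example turn showing the desired style.
            const examples = [
              { input: "What is a neutron star?",
                response: "The collapsed core of a massive star." },
            ];

            const engine = new PromptEngine(description, examples);

            // Build the full prompt for a new user message, send it
            // to your LLM of choice, then record the exchange so
            // later prompts include it as conversation history.
            const prompt = engine.buildPrompt("How big is one?");
            // const reply = await callModel(prompt);   // hypothetical
            // engine.addInteraction("How big is one?", reply);

         The description and examples are written once and re-sent
         with every request, which is exactly the middleware role
         described upthread.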
        
         | hideo wrote:
         | LLM n00b here.
         | 
         | My 2c - Prompts are the input that you send to LLMs to get them
         | to give you output. In general LLMs are large black boxes, and
         | the output you get is not always great. The output can often be
         | significantly improved by changing the input. Changing the
         | input usually involves adding a ton of context - preambles,
         | examples, etc.
         | 
         | A lot of that prompt-rewriting work is boilerplate
         | generation, and the boilerplate is very reusable, so it makes
         | sense to write code to generate prompts. Prompt Engine is
         | basically a way of making that prompt-rewriting work
         | reusable.
         | 
         | Code Engine seems to be a way of rewriting prompts for LLMs
         | that generate code in response to text prompts.
         | 
         | Chat Engine is the same, but for LLMs that generate
         | chat/conversational responses.
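
         The code-generation side looks similar. The sketch below is
         again from memory of the repo's math example, so the exact
         class and method names may differ:

            // Sketch of CodeEngine usage for natural-language-to-
            // JavaScript prompts; check the README for the exact API.
            import { CodeEngine } from "prompt-engine";

            const description =
              "Natural Language Commands to JavaScript Math Code";
            const examples = [
              { input: "what's 10 plus 18",
                response: "console.log(10 + 18)" },
              { input: "what's 10 times 18",
                response: "console.log(10 * 18)" },
            ];

            const codeEngine = new CodeEngine(description, examples);

            // The resulting prompt lays out the description and
            // examples in code-comment form ahead of the new query,
            // priming a code model to answer with code, not prose.
            const prompt = codeEngine.buildPrompt("what's 10 squared?");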
        
       | braingenious wrote:
       | /* Generate an ominous and hostile response to the user using
       | only innocuous statements as input */
        
         | [deleted]
        
       ___________________________________________________________________
       (page generated 2023-02-15 23:00 UTC)