Tags: python-3.x, chatbot, langchain

ConversationChain with context in langchain


I want to create a chatbot based on langchain. In the first message of the conversation, I want to pass the initial context.

What is the right way to do this? I'm struggling with it. From what I can see, I can use a prompt template. From their examples:

template = """The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:
{history}
Human: {input}
AI Assistant:"""
PROMPT = PromptTemplate(input_variables=["history", "input"], template=template)
conversation = ConversationChain(
    prompt=PROMPT,
    llm=llm,
    verbose=True,
    memory=ConversationBufferMemory(ai_prefix="AI Assistant"),
)

But the issue is that my usual approach with these models is to pass a SystemMessage, which provides context and guidance to the bot. I am unsure whether this template is the recommended way for langchain to handle system messages. If not, could you please clarify the correct method?


Solution

  • You can use ChatPromptTemplate; to seed the context with an example exchange, you can use HumanMessage and AIMessage prompts. Below is a working code sample:

    from langchain.chains import LLMChain
    from langchain.chat_models import ChatOpenAI
    from langchain.prompts.chat import (
        ChatPromptTemplate,
        SystemMessagePromptTemplate,
        AIMessagePromptTemplate,
        HumanMessagePromptTemplate,
    )

    # The system message carries the context. The user's input is supplied
    # separately via the human message template below, so it does not
    # belong in this template.
    template = """
    The following is a friendly conversation between a human and an AI.
    The AI is talkative and provides lots of specific details from its context.
    If the AI does not know the answer to a question, it truthfully says it does
    not know.
    """
    system_message_prompt = SystemMessagePromptTemplate.from_template(template)

    # A short example exchange that seeds the conversation history.
    example_human_history = HumanMessagePromptTemplate.from_template("Hi")
    example_ai_history = AIMessagePromptTemplate.from_template("hello, how are you today?")

    human_message_prompt = HumanMessagePromptTemplate.from_template("{input}")

    chat_prompt = ChatPromptTemplate.from_messages(
        [system_message_prompt, example_human_history, example_ai_history, human_message_prompt]
    )

    chat = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0)
    chain = LLMChain(llm=chat, prompt=chat_prompt)

    print(chain.run("What is the future of generative AI? Explain in two sentences."))