python, langchain, google-cloud-vertex-ai, py-langchain

Callback in LLM chain doesn't get executed


The following code does not do what it is supposed to do:

from langchain.callbacks.base import BaseCallbackHandler
from langchain import PromptTemplate
from langchain.chains import LLMChain
from langchain.llms import VertexAI


class MyCustomHandler(BaseCallbackHandler):
    def on_llm_end(self, event, context):
        print(f"Prompt: {event.prompt}")
        print(f"Response: {event.response}")


llm = VertexAI(
            model_name='text-bison@001',
            max_output_tokens=1024,
            temperature=0.3,
            verbose=False)
prompt = PromptTemplate.from_template("1 + {number} = ")
handler = MyCustomHandler()
chain = LLMChain(llm=llm, prompt=prompt, callbacks=[handler])
response = chain.run(number=2)
print(response)

Based on this documentation and this tutorial, the code should execute the custom handler's on_llm_end callback, but in fact it doesn't. Can anyone please tell me why?


Solution

  • I did some research and found the solution.

    You need to pass the callbacks parameter to the llm itself. In your case, change the code as below:

    callback_handler = MyCustomHandler()
    llm = VertexAI(
        model_name='text-bison@001',
        max_output_tokens=1024,
        temperature=0.3,
        callbacks=[callback_handler],
        verbose=False)
    

    Secondly, change the implementation of on_llm_end as below:

    class MyCustomHandler(BaseCallbackHandler):
        def on_llm_end(self, response, **kwargs):
            print(f"Response: {response}")
    

    This should fix the problem.
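
    For reference, here is a minimal end-to-end sketch that combines both changes (assuming the same text-bison@001 model and langchain version as in the question, plus working Vertex AI credentials). The on_llm_end hook receives an LLMResult, so the handler below reads the generated text from response.generations:

    from langchain.callbacks.base import BaseCallbackHandler
    from langchain import PromptTemplate
    from langchain.chains import LLMChain
    from langchain.llms import VertexAI


    class MyCustomHandler(BaseCallbackHandler):
        def on_llm_end(self, response, **kwargs):
            # response is an LLMResult; the generated text lives in
            # response.generations (a list of lists of Generation objects)
            for generation in response.generations[0]:
                print(f"Response text: {generation.text}")


    callback_handler = MyCustomHandler()
    llm = VertexAI(
        model_name='text-bison@001',
        max_output_tokens=1024,
        temperature=0.3,
        callbacks=[callback_handler],
        verbose=False)
    prompt = PromptTemplate.from_template("1 + {number} = ")
    chain = LLMChain(llm=llm, prompt=prompt)
    response = chain.run(number=2)
    print(response)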