Here is my code:
from langchain_core.prompts import ChatPromptTemplate
from langchain_ollama import ChatOllama
from langchain_core.output_parsers import StrOutputParser
llm = ChatOllama(
    model='llama3.2',
    temperature=0,
)
chat_template = ChatPromptTemplate.from_messages(
    [
        ('system', 'you have to give two line definition of the word given by user'),
        ('human', 'the word is {user_input}'),
    ]
)
message = chat_template.format_messages(user_input='backlog')
llm.invoke(message)
chain = chat_template | llm | StrOutputParser()
chain.invoke({'user_input': 'backlog'})
And it is showing this connection error:
httpx.ConnectError: [WinError 10061] No connection could be made because the target machine actively refused it
How can I fix this? I was trying to create a basic word-meaning chatbot using LangChain.
This means that Ollama is not running on your machine, so nothing is listening on the port that `ChatOllama` tries to connect to and the connection is refused.

After pulling the model with `ollama pull <model name>`, you need to start the server with `ollama serve` (on Windows, launching the Ollama desktop app also starts the server in the background). Then re-run your script.
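If you want to fail fast with a clearer message, you can check whether the server is reachable before building the chain. This is a minimal sketch using only the Python standard library; the helper name `ollama_is_running` is made up for illustration, and the URL assumes Ollama's default port 11434.

```python
import urllib.request
import urllib.error

# Ollama listens on http://localhost:11434 by default; adjust if you changed it.
OLLAMA_URL = "http://localhost:11434"

def ollama_is_running(url: str = OLLAMA_URL) -> bool:
    """Return True if an HTTP server answers at `url` (hypothetical helper)."""
    try:
        with urllib.request.urlopen(url, timeout=2) as resp:
            # Ollama's root endpoint replies with status 200 when it is up.
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # "Connection refused" (WinError 10061 on Windows) lands here.
        return False

if __name__ == "__main__":
    if ollama_is_running():
        print("Ollama is up - your chain should connect.")
    else:
        print("Ollama is not reachable - run `ollama serve` first.")
```

Run this before `chain.invoke(...)` and you get a plain yes/no instead of a raw `httpx.ConnectError` traceback.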