I am trying to use my llama2 model (exposed as an API using Ollama). I want to chat with the llama agent and query my Postgres DB (i.e., text-to-SQL). I was able to find LangChain code that does this with OpenAI, but I haven't found anything that fits my situation.
Any pointers would be of great help.
Code with OpenAI:
# Create connection to postgres
import psycopg2 # Import the library
database = 'postgres'
username = 'postgres'
password = 'password'
server = 'localhost'
port = '5432'
# Establish the connection
conn = psycopg2.connect(
dbname=database,
user=username,
password=password,
host=server,
port=port
)
# LangChain imports for the SQL agent
from langchain.agents import create_sql_agent
from langchain.agents.agent_toolkits import SQLDatabaseToolkit
from langchain.agents.agent_types import AgentType
from langchain.llms import OpenAI
from langchain.sql_database import SQLDatabase
# SQLDatabase opens its own connection from the URI (the psycopg2 conn above is not used by the agent)
db = SQLDatabase.from_uri(
"postgresql://postgres:password@localhost:5432/postgres")
toolkit = SQLDatabaseToolkit(db=db, llm=OpenAI(temperature=0))
agent_executor = create_sql_agent(
llm=OpenAI(temperature=0),
toolkit=toolkit,
verbose=True,
agent_type=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
)
agent_executor.run("Describe the transaction table")
I want to make the above code work with my llama2 model, exposed via an API at localhost:11434/api/generate.
Load your LLM as described here: https://python.langchain.com/docs/integrations/llms/ollama and then use it in place of OpenAI. You'll most likely have to change the prompts to fit llama2's expected format.
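Roughly, a minimal sketch of that swap, assuming Ollama is serving llama2 at the default localhost:11434 and that the LangChain Ollama wrapper is installed (the wrapper calls /api/generate for you, so you don't hit the endpoint yourself):
# Use the Ollama LLM wrapper instead of OpenAI
from langchain.llms import Ollama
from langchain.agents import create_sql_agent
from langchain.agents.agent_toolkits import SQLDatabaseToolkit
from langchain.agents.agent_types import AgentType
from langchain.sql_database import SQLDatabase

# Point the wrapper at the local Ollama server running llama2
llm = Ollama(model="llama2", base_url="http://localhost:11434", temperature=0)

db = SQLDatabase.from_uri(
    "postgresql://postgres:password@localhost:5432/postgres")

toolkit = SQLDatabaseToolkit(db=db, llm=llm)

agent_executor = create_sql_agent(
    llm=llm,
    toolkit=toolkit,
    verbose=True,
    agent_type=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
)

agent_executor.run("Describe the transaction table")
If the agent's output comes back malformed (extra prose around the SQL, missing "Action:" lines), tighten the agent's prompt, e.g. via the prefix/suffix arguments of create_sql_agent, since smaller local models usually need stricter prompting than OpenAI models.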