python, langchain, pydantic-v2

How to fix the warning from Pydantic in langchain: The `dict` method is deprecated; use `model_dump` instead


When I run my RAG chain code with OpenAI from langchain, it gives me a warning like this:

PydanticDeprecatedSince20: The `dict` method is deprecated; use `model_dump` instead. Deprecated in Pydantic V2.0 to be removed in V3.0. See Pydantic V2 Migration Guide at https://errors.pydantic.dev/2.5/migration/
  warnings.warn('The `dict` method is deprecated; use `model_dump` instead.', DeprecationWarning)

There is no place in my code where I could replace `dict` with `model_dump`; I don't even call `dict` anywhere. Any idea how to solve this warning?

Here is my code:

from client_setup import get_client
from langchain_openai import OpenAI
from langchain_community.vectorstores import Weaviate
from langchain.retrievers.weaviate_hybrid_search import WeaviateHybridSearchRetriever
from langchain.prompts import ChatPromptTemplate
from langchain.schema.runnable import RunnablePassthrough
from langchain.schema.output_parser import StrOutputParser


client = get_client()

retriever = WeaviateHybridSearchRetriever(
        client=client, 
        index_name="Material",
        text_key="su",
        attributes=["material", "heat_treatment", "su", "sy"],
        create_schema_if_missing=True,
        )

llm = OpenAI(api_key="", model_name="gpt-3.5-turbo-instruct")


template = """You are an assistant for question-answering tasks. 
Use the following pieces of retrieved context to answer the question. 
If you don't know the answer, just say that you don't know. 
Use three sentences maximum and keep the answer concise.
Question: {question} 
Context: {context} 
Answer:
"""
prompt = ChatPromptTemplate.from_template(template)

#print(prompt)
query = "What heat treatment is used for Steel SAE 1040?"

rag_chain = (
        {"context": retriever,
         "question": RunnablePassthrough()}
        | prompt
        | llm
        | StrOutputParser()
        )


result = rag_chain.invoke(query)
print(result)

Solution

  • I have the same issue. It's a problem with the langchain implementation itself, not with your code. If you are using langchain 0.1.0, as I am, the Pydantic version it pulls in is 2.5.2, but some langchain code (in my case langchain_community/chat_models/openai.py:456) still calls a method from Pydantic's V1 API that V2 has deprecated.

    The langchain developers should be aware of this, and when they eventually move to Pydantic 3.0 they will have to migrate their code accordingly, for us.

    So for now you can safely ignore the warning. The only risk is if, at some point, your own code requires Pydantic 3 while langchain does not yet support it, which would create a version conflict. In the meantime, if you want to hide the warning, see the sketch below.
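
    If you just want to silence this warning while keeping other deprecation warnings visible, you can install a targeted warnings filter before invoking the chain. This is a minimal sketch, not part of the original code; it assumes Pydantic 2.x, where `PydanticDeprecatedSince20` is exported from the top-level package (the message-based filter is an alternative that does not need that import):

        import warnings

        # Option 1: ignore everything Pydantic flags as deprecated since 2.0.
        # Assumes Pydantic 2.x, which exports this warning category.
        from pydantic import PydanticDeprecatedSince20
        warnings.filterwarnings("ignore", category=PydanticDeprecatedSince20)

        # Option 2: ignore only this exact message. `message` is a regex
        # matched against the start of the warning text.
        warnings.filterwarnings(
            "ignore",
            message=r"The `dict` method is deprecated; use `model_dump` instead",
            category=DeprecationWarning,
        )

    Put this before rag_chain.invoke(query); the warning is emitted at call time, when langchain's OpenAI wrapper calls the deprecated method.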