Tags: curl, large-language-model, google-gemini

How to Include Chat History When Using Google Gemini's API


I'm trying to feed chat history to the Google Gemini API using a cURL request. I want to provide both the user's previous input and the model's previous response in the request. Here's the cURL command I'm using:

curl --location 'https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash:generateContent?key=' \
--header 'Content-Type: application/json' \
--data '{
    "contents": [
        {
            "role": "model",
            "parts": [
                {
                    "text": "...model data"
                }
            ]
        },
        {
            "role": "user",
            "parts": [
                {
                    "text": "...user input"
                }
            ]
        }
    ],
    "history": [
        {
            "role": "user",
            "parts": [
                {
                    "text": "Hello, I have 2 dogs in my house."
                }
            ]
        },
        {
            "role": "model",
            "parts": [
                {
                    "text": "Great to meet you. What would you like to know?"
                }
            ]
        }
    ]
}'

I'm not sure if I'm formatting the contents and history sections correctly.

I expected the model to understand the previous exchanges and continue the conversation accordingly.


Solution

  • You have to include your previous chat turns in the request so that Gemini can use the chat history. The simplest option is to put everything into a single "parts" list:

    Example data:

    {
        "contents": [
            {
                "parts": [
                    {"text": "User: Hello"},
                    {"text": "User: What was my previous message?"}
                ]
            }
        ]
    }
    

    Response:

    {
      "candidates": [
        {
          "content": {
            "parts": [
              {
                "text": "Your previous message was: \"Hello\"\n"
              }
            ],
            "role": "model"
          },
          "finishReason": "STOP",
          "avgLogprobs": -0.08453522125879924
        }
      ],
      "usageMetadata": {
        "promptTokenCount": 11,
        "candidatesTokenCount": 9,
        "totalTokenCount": 20
      },
      "modelVersion": "gemini-1.5-flash"
    }
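
    To send that example payload, use the same cURL pattern as in the question; a minimal sketch, with YOUR_API_KEY standing in as a placeholder for your actual key:

    curl --location 'https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash:generateContent?key=YOUR_API_KEY' \
    --header 'Content-Type: application/json' \
    --data '{
        "contents": [
            {
                "parts": [
                    {"text": "User: Hello"},
                    {"text": "User: What was my previous message?"}
                ]
            }
        ]
    }'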
    
    For an interactive (multi-turn) conversation, add each previous turn as a separate entry in the "contents" list, alternating the "user" and "model" roles, so Gemini can follow the chat history:

    Example data:

    {
        "contents": [
            {
                "role": "user",
                "parts": [{"text": "Hello"}]
            },
            {
                "role": "model",
                "parts": [{"text": "Great to meet you. What would you like to know?"}]
            },
            {
                "role": "user",
                "parts": [{"text": "What was my previous message"}]
            }
        ]
    }
    

    Response:

    {
      "candidates": [
        {
          "content": {
            "parts": [
              {
                "text": "Your previous message was \"Hello\".\n"
              }
            ],
            "role": "model"
          },
          "finishReason": "STOP",
          "avgLogprobs": -0.30310314893722534
        }
      ],
      "usageMetadata": {
        "promptTokenCount": 21,
        "candidatesTokenCount": 8,
        "totalTokenCount": 29
      },
      "modelVersion": "gemini-1.5-flash"
    }
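
    Putting this together with the cURL command from the question gives the sketch below. The top-level "history" field is not part of the generateContent request body, so the previous turns go into "contents" instead; YOUR_API_KEY is a placeholder, and the final user turn is just an illustrative follow-up question:

    curl --location 'https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash:generateContent?key=YOUR_API_KEY' \
    --header 'Content-Type: application/json' \
    --data '{
        "contents": [
            {
                "role": "user",
                "parts": [{"text": "Hello, I have 2 dogs in my house."}]
            },
            {
                "role": "model",
                "parts": [{"text": "Great to meet you. What would you like to know?"}]
            },
            {
                "role": "user",
                "parts": [{"text": "How many dogs do I have?"}]
            }
        ]
    }'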