I've made a simple OpenAI API example with function calling. I'm only using function calling to format the response; I'm not calling multiple functions or any external APIs.
When I don't stream the response, I can return the function arguments, which is the data I need.
In my Next.js route handler:
import OpenAI from "openai";
import simpleJsonSchema from "./simpleJsonSchema.json";

export async function POST(request: Request) {
  try {
    const openai = new OpenAI({
      apiKey: process.env["OPENAI_API_KEY"],
    });
    const response = await openai.chat.completions.create({
      model: "gpt-4",
      // stream: true,
      messages: [
        {
          role: "user",
          content: "Give me 5 questions and answers for a pub quiz",
        },
      ],
      tools: [
        {
          type: "function",
          function: {
            name: "get_questions_and_answers",
            description: "Get questions and answers",
            parameters: simpleJsonSchema,
          },
        },
      ],
      tool_choice: {
        type: "function",
        function: { name: "get_questions_and_answers" },
      },
    });
    return Response.json(
      JSON.parse(
        response.choices[0].message.tool_calls?.[0].function.arguments || "",
      ),
    );
  } catch (serverError) {
    console.error({ serverError });
    throw new Error("Failed to generate questions and answers");
  }
}
simpleJsonSchema.json:
{
  "type": "object",
  "properties": {
    "getQuestions": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "Question": { "type": "string" },
          "Answer": { "type": "string" }
        },
        "required": ["Question", "Answer"]
      }
    }
  },
  "required": ["getQuestions"]
}
Response from API:
{"getQuestions":[{"Question":"What is the capital of Australia?","Answer":"Canberra"},{"Question":"Who wrote 'To Kill a Mockingbird'?","Answer":"Harper Lee"},{"Question":"What is the highest peak in the world?","Answer":"Mount Everest"},{"Question":"Who is known as the 'Father of Computers'?","Answer":"Charles Babbage"},{"Question":"What is the largest ocean in the world?","Answer":"Pacific Ocean"}]}
This is fine when developing locally; however, when deployed to Vercel the request sometimes times out. I've tried adding streaming, as this is the recommended solution:
import { OpenAIStream, StreamingTextResponse } from "ai";

// inside the POST handler:
const response = await openai.chat.completions.create({
  model: "gpt-4",
  stream: true,
  messages: [
    {
      role: "user",
      content: "Give me 5 questions and answers for a pub quiz",
    },
  ],
  tools: [
    {
      type: "function",
      function: {
        name: "get_questions_and_answers",
        description: "Get questions and answers",
        parameters: simpleJsonSchema,
      },
    },
  ],
  tool_choice: {
    type: "function",
    function: { name: "get_questions_and_answers" },
  },
});
const stream = OpenAIStream(response);
return new StreamingTextResponse(stream);
However, the response now contains a lot of unnecessary data, and when I try to JSON.parse it on the client I get errors.
Response from API:
{"tool_calls":[ {"id": "call_IhxvzkZ5EsmZpHc6tOznTmzb", "type": "function", "function": {"name": "get_questions_and_answers", "arguments": "{\n \"getQuestions\": [\n {\n \"Question\": \"Question 1\",\n \"Answer\": \"Answer 1\"\n },\n {\n \"Question\": \"Question 2\",\n \"Answer\": \"Answer 2\"\n },\n {\n \"Question\": \"Question 3\",\n \"Answer\": \"Answer 3\"\n },\n {\n \"Question\": \"Question 4\",\n \"Answer\": \"Answer 4\"\n },\n {\n \"Question\": \"Question 5\",\n \"Answer\": \"Answer 5\"\n }\n ]\n}"}}
As far as I can see, the docs only cover using useChat (https://sdk.vercel.ai/docs/api-reference/use-chat), but I have some particular requirements, so I need to handle the fetching and form state myself.
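For context, this is roughly how I accumulate the streamed body on the client (a sketch; the helper name `readStreamToText` and the `/api/quiz` route path are just illustrative):

```typescript
// Sketch: accumulate a streamed Response body into a single string.
// Uses the Web Streams API available in modern browsers and Node 18+.
async function readStreamToText(
  stream: ReadableStream<Uint8Array>,
): Promise<string> {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let accumulated = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // { stream: true } keeps multi-byte characters split across chunks intact
    accumulated += decoder.decode(value, { stream: true });
  }
  return accumulated;
}

// Usage (illustrative route path):
// const res = await fetch("/api/quiz", { method: "POST" });
// const accumulatedText = await readStreamToText(res.body!);
```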
Why am I getting invalid JSON?
Here is a repository which reproduces the issue: https://github.com/jameschetwood/openai-function-calling
This is the response you are getting:
{"tool_calls":[ {"id": "call_HRxqlP3yzeHsoN43tMyZjMlr", "type": "function", "function": {"name": "get_questions_and_answers", "arguments": "{\n \"getQuestions\": [\n {\n \"Question\": \"What is the capital city of France?\",\n \"Answer\": \"Paris\"\n },\n {\n \"Question\": \"Who painted the Mona Lisa?\",\n \"Answer\": \"Leonardo da Vinci\"\n },\n {\n \"Question\": \"What is the largest planet in our solar system?\",\n \"Answer\": \"Jupiter\"\n },\n {\n \"Question\": \"What is the national flower of England?\",\n \"Answer\": \"Rose\"\n },\n {\n \"Question\": \"Which country is famous for its tulips?\",\n \"Answer\": \"Netherlands\"\n }\n ]\n}"}}
I used https://jsoneditoronline.org/ to auto-correct the JSON, and all it adds is "]}". For some reason OpenAI is not sending a complete JSON response, so you have to append it yourself:
accumulatedText += "]}";
Then the response parses correctly.
That said, this fix is very specific to the current behaviour: if OpenAI updates its response API and starts sending complete JSON, appending "]}" unconditionally would itself break parsing. A more robust approach is to parse inside a try/catch:
try {
  const parsed = JSON.parse(accumulatedText);
  console.log({ parsed });
} catch (error) {
  // Handle each failure mode you know about; here, the missing "]}" suffix.
  accumulatedText += "]}";
  const parsed = JSON.parse(accumulatedText);
  console.log("corrected accumulatedText in catch block", { parsed });
}
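Factoring that into a small helper keeps the workaround in one place (a sketch; `parseToolArguments` is a name I made up, and it assumes the only known failure mode is the truncated "]}" suffix):

```typescript
// Sketch: parse the accumulated tool-call arguments, retrying with the
// "]}" suffix only when the first attempt fails.
function parseToolArguments(accumulated: string): unknown {
  try {
    return JSON.parse(accumulated);
  } catch {
    // Assumption: the stream was cut off just before the closing "]}".
    // If this second attempt also fails, the error propagates to the caller.
    return JSON.parse(accumulated + "]}");
  }
}
```

If OpenAI later starts sending complete JSON, the first JSON.parse succeeds and the workaround is never applied.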