The function calling response appears to be quite unreliable: it does not return a complete arguments object. Here is the JSON Schema for a function I'm trying to get the model to call:
export const getCoinPrice: ChatCompletionFunctions = {
  name: "get_coin_price",
  description: "Get the current price for any cryptocurrency",
  parameters: {
    type: "object",
    properties: {
      coin_symbol: {
        type: "string",
        description: "The symbol of the asset to get the price for. Eg: BTC",
      },
    },
    required: ["coin_symbol"],
  },
};
This is then passed to the chat request like so:
const chatRequest: CreateChatCompletionRequest = {
  model: "gpt-3.5-turbo-0613",
  temperature: 0.5,
  n: 1,
  stop: "\n",
  messages: messages,
  function_call: "auto",
  functions: functions,
};
const response = await this.openai.createChatCompletion(chatRequest);
The contents of the message response look like this:
{"role":"user","content":"what is the current price of bitcoin"},{"role":"assistant","content":"","function_call":{"name":"get_coin_price","arguments":"{"}
The arguments field should come back as a complete JSON-encoded object, but I couldn't get it to return anything other than "{".
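Since arguments comes back as a JSON-encoded string, anything that tries to parse the truncated value blows up. Roughly what I mean (a minimal sketch; the price lookup is a hypothetical handler):
const message = response.data.choices[0].message;
if (message?.function_call?.name === "get_coin_price") {
  // JSON.parse throws "Unexpected end of JSON input" when arguments is just "{"
  const args = JSON.parse(message.function_call.arguments ?? "{}");
  // const price = await lookupCoinPrice(args.coin_symbol); // hypothetical handler
}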
It shouldn't be an issue with how I'm passing the messages, since I'm using the types provided by the package. The request is built from these parameters (see the sketch below the list):
input: string,
chatHistory: ChatCompletionRequestMessage[],
functions: ChatCompletionFunctions[]
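Roughly how those parameters feed into the request above (a sketch; the wrapper method name is made up):
async chat(
  input: string,
  chatHistory: ChatCompletionRequestMessage[],
  functions: ChatCompletionFunctions[]
) {
  // append the new user turn to the prior history
  const messages: ChatCompletionRequestMessage[] = [
    ...chatHistory,
    { role: "user", content: input },
  ];
  // messages and functions then go into the CreateChatCompletionRequest shown earlier
}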
I can't seem to find the source of the issue, so any help or suggestions would be greatly appreciated. Thanks!
I had a similar issue. I cross-referenced our setups to see what we had in common, and it turns out that removing the stop rule fixes it. With stop: "\n", generation is cut off at the first newline, and since the model emits the function arguments as multi-line JSON, the arguments string gets truncated right after the opening "{". Without the stop sequence the full arguments come back:
"function_call": {
"name": "get_coin_price",
"arguments": "{\n \"coin_symbol\": \"BTC\"\n}"
}
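Concretely, this is the kind of change that worked for me: the same request as in the question but with stop dropped, after which the arguments string parses cleanly (a sketch, assuming the same openai client as in the question):
const chatRequest: CreateChatCompletionRequest = {
  model: "gpt-3.5-turbo-0613",
  temperature: 0.5,
  n: 1,
  // no `stop` sequence: a "\n" stop would cut off the multi-line arguments JSON
  messages: messages,
  function_call: "auto",
  functions: functions,
};

const response = await openai.createChatCompletion(chatRequest);
const call = response.data.choices[0].message?.function_call;
if (call?.name === "get_coin_price" && call.arguments) {
  const args = JSON.parse(call.arguments); // now a complete JSON object
  console.log(args.coin_symbol); // e.g. "BTC"
}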