I'm trying to stream a chat completion from the OpenAI API and display it to the user in a chat interface. Each chunk of the response is correct on its own, but my state ends up with duplicated words... Where am I going wrong?
Here is my code along with screenshots showing both the UI and console output:
useEffect(() => {
  const doCompletion = async () => {
    if (!newMessage) return;
    try {
      await handleCompletion(messages);
      setNewMessage(false);
    } catch (err) {
      console.log("unable to communicate with model service: ", err);
    }
  };
  doCompletion();
}, [newMessage]);

const handleNewMessage = async (formData: FormData) => {
  const message = formData.get("message") as string;
  addMessage({ id: uuidv4(), data: { role: "user", content: message } });
  setNewMessage(true);
};

const handleCompletion = async (messages: Message[]) => {
  const formattedMessages = messages.map((message: Message) => message.data);
  const stream = await openai.chat.completions.create({
    messages: formattedMessages,
    model: "deepseek-chat",
    stream: true
  });
  for await (const chunk of stream) {
    const content = chunk.choices[0]?.delta?.content;
    console.log("content: ", content);
    setMessages((prev) => {
      let tmpMessages = [...prev];
      const lastMessageIndex = tmpMessages.length - 1;
      if (tmpMessages[lastMessageIndex].data.role !== "assistant") {
        tmpMessages.push({ id: chunk.id, data: { role: "assistant", content: "" } });
        return tmpMessages;
      }
      const prevContent = tmpMessages[lastMessageIndex].data.content;
      console.log("prevContent: ", prevContent);
      const newContent = prevContent! + content!;
      console.log("newContent: ", newContent);
      tmpMessages[lastMessageIndex].data.content = newContent;
      return tmpMessages;
    });
  }
};
Let's try to fix it!
Your problem is that your setMessages updater isn't pure. [...prev] makes only a shallow copy, so tmpMessages[lastMessageIndex].data is the very same object that prev holds, and assigning newContent to it mutates state in place. React may invoke an updater function more than once (StrictMode deliberately double-invokes updaters in development to surface exactly this kind of impurity), and because the mutation persists inside prev across invocations, each chunk gets concatenated twice — hence the duplicated words. There's a smaller bug too: the chunk that creates the assistant message is pushed with content: "", so that chunk's text is silently dropped.

My suggestion is to use a useRef to accumulate the full assistant reply outside of React state, and then assign it into state rather than appending. Assignment produces the same result no matter how many times the updater runs.
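To make the failure mode concrete, here is a minimal, React-free simulation (the `Msg` type and `impureAppend` helper are hypothetical names for illustration, not your exact code). A shallow copy shares the message objects with `prev`, so an updater that mutates them appends the chunk twice when it is invoked twice — which is what StrictMode does in development:

```typescript
// Simulates React invoking a state updater twice (StrictMode, dev builds).
type Msg = { role: string; content: string };

// Impure updater in the style of the question: it shallow-copies the array
// but mutates the last message object, which is still shared with `prev`.
const impureAppend = (chunk: string) => (prev: Msg[]): Msg[] => {
  const tmp = [...prev];
  tmp[tmp.length - 1].content += chunk; // mutation leaks into `prev`
  return tmp;
};

const state: Msg[] = [{ role: "assistant", content: "Hello" }];
const updater = impureAppend(" world");
updater(state); // first invocation
updater(state); // replayed invocation — mutates the same object again

console.log(state[0].content); // "Hello world world" — the chunk is duplicated
```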
import React, { useState, useEffect, useRef } from 'react';

const YourComponent = () => {
  const [messages, setMessages] = useState([]);
  // Accumulates the full assistant reply outside of React state, so each
  // state update can simply assign the latest snapshot.
  const assistantContentRef = useRef('');

  const handleCompletion = async (messages) => {
    const stream = await openai.chat.completions.create({
      messages: messages.map((message) => message.data),
      model: 'deepseek-chat',
      stream: true,
    });
    assistantContentRef.current = ''; // Reset before each new completion
    for await (const chunk of stream) {
      const content = chunk.choices[0]?.delta?.content;
      if (content) {
        assistantContentRef.current += content;
        setMessages((prev) => {
          const tmpMessages = [...prev];
          const lastMessageIndex = tmpMessages.length - 1;
          if (tmpMessages[lastMessageIndex]?.data.role !== 'assistant') {
            // First chunk: start the assistant message with its content
            // already included, so nothing is dropped.
            tmpMessages.push({
              id: chunk.id,
              data: { role: 'assistant', content: assistantContentRef.current },
            });
          } else {
            // Replace the last message with a fresh object instead of
            // mutating the one shared with `prev` — keeps the updater pure.
            tmpMessages[lastMessageIndex] = {
              ...tmpMessages[lastMessageIndex],
              data: {
                ...tmpMessages[lastMessageIndex].data,
                content: assistantContentRef.current,
              },
            };
          }
          return tmpMessages;
        });
      }
    }
  };

  // rest of your code
};
This approach guarantees that your React state always reflects the most recent AI response, preventing duplication. :)
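Alternatively — a sketch under the same assumptions about your Message shape, with `appendChunk` as a hypothetical helper name — you can fix the duplication without a ref by keeping the updater pure: build new objects instead of mutating the ones shared with prev, and seed the assistant message with the first chunk so its content isn't dropped.

```typescript
type Message = { id: string; data: { role: string; content: string } };

// Pure updater: safe to re-invoke because it never mutates `prev`.
const appendChunk = (chunkId: string, content: string) => (prev: Message[]): Message[] => {
  const last = prev[prev.length - 1];
  if (last?.data.role !== "assistant") {
    // Start the assistant message with this chunk's content included.
    return [...prev, { id: chunkId, data: { role: "assistant", content } }];
  }
  // Replace the last message with a new object rather than editing it.
  return [
    ...prev.slice(0, -1),
    { ...last, data: { ...last.data, content: last.data.content + content } },
  ];
};
```

Inside the stream loop you would then call setMessages(appendChunk(chunk.id, content)). Re-running this updater on the same prev yields the same result and leaves prev untouched, so StrictMode's double invocation no longer duplicates words.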