I'm building a mobile app with an integrated chat feature that uses the OpenAI API or llama3 as the model. I've managed to get streaming working. However, I'm running into a small issue when displaying the message that is being streamed: right now the message only shows up once streaming has finished, instead of arriving chunk by chunk like in ChatGPT or any other LLM experience. Here is my current code. Chat.tsx
const {
  messages,
  addMessage,
  removeMessage,
  updateMessage,
  setIsMessageUpdating,
} = useContext(MessagesContext);
<FlatList
  data={messages}
  keyExtractor={(item) => item.id}
  renderItem={({ item }) => (
    <View
      style={styles.messageContainer}
      className={cn("flex justify-end", {
        "items-end": item.isUserMessage,
      })}
    >
      <View
        className={cn(
          "flex flex-row gap-y-2 text-sm max-w-[90%] mx-2 overflow-x-hidden"
        )}
      >
        <View
          className={cn("px-4 py-2 rounded-lg", {
            "bg-primary": item.isUserMessage,
            "bg-secondary": !item.isUserMessage,
          })}
        >
          <Text
            className={cn({
              "text-white": item.isUserMessage,
            })}
          >
            {item.text}
          </Text>
        </View>
      </View>
    </View>
  )}
  contentContainerStyle={styles.messagesList}
/>
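For context, each incoming chunk is applied through `updateMessage` from my `MessagesContext`, and since FlatList only re-renders a row when the item reference changes, that update has to replace the message object immutably. A minimal sketch of that update logic (the `Message` type and `applyChunk` name here are just illustrative, not my actual context code):

```typescript
// Illustrative message shape, mirroring the fields the renderItem reads.
type Message = { id: string; isUserMessage: boolean; text: string };

// Pure version of an immutable updateMessage: returns a NEW array with a
// NEW object for the updated message, so FlatList sees changed references
// and re-renders that row on every chunk.
function applyChunk(
  messages: Message[],
  id: string,
  updater: (prev: string) => string
): Message[] {
  return messages.map((m) =>
    m.id === id ? { ...m, text: updater(m.text) } : m
  );
}
```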
Here is the function that handles sending messages, using React Query:
const { mutate: sendMessage, isPending } = useMutation({
  mutationKey: ["sendMessage"],
  // include message to later use it in onMutate
  mutationFn: async (message: Message) => {
    const response = await fetch(
      `${process.env.EXPO_PUBLIC_API_URL}/api/mobile/virtual-coach`,
      {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
        },
        // `messages` from the context closure may not include the new
        // message yet, so append it explicitly before sending
        body: JSON.stringify({ messages: [...messages, message] }),
      }
    );
    if (!response.ok) {
      throw new Error("Network response was not ok");
    }
    return response.body;
  },
  onMutate(message) {
    addMessage(message);
  },
  onSuccess: async (stream) => {
    if (!stream) throw new Error("No stream");
    // construct new message to add
    const id = customAlphabet(
      "1234567890ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz",
      10
    )();
    const responseMessage = {
      id,
      isUserMessage: false,
      text: "",
    };
    // add new message to state
    addMessage(responseMessage);
    setIsMessageUpdating(true);
    const reader = stream.getReader();
    const decoder = new TextDecoder("utf-8");
    let done = false;
    while (!done) {
      const { value, done: doneReading } = await reader.read();
      done = doneReading;
      // `value` is already a Uint8Array, and it is undefined on the
      // final read; wrapping it in `new Uint8Array([value])` corrupts
      // the bytes, so decode it directly and guard against undefined
      if (value) {
        const chunkValue = decoder.decode(value, { stream: true });
        updateMessage(id, (prev) => prev + chunkValue);
      }
    }
    // clean up
    setIsMessageUpdating(false);
  },
  onError: (_, message) => {
    console.log("error", _);
    removeMessage(message.id);
  },
});
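To rule out the decoding itself, I also tried isolating the read loop into a plain helper, which works anywhere a web-style `ReadableStream` is available (`readStreamInto` is just an illustrative name):

```typescript
// Illustrative helper isolating the read loop. `onChunk` receives each
// decoded text fragment as it arrives; `stream: true` keeps multi-byte
// UTF-8 sequences that are split across chunks intact.
async function readStreamInto(
  stream: ReadableStream<Uint8Array>,
  onChunk: (text: string) => void
): Promise<void> {
  const reader = stream.getReader();
  const decoder = new TextDecoder("utf-8");
  let done = false;
  while (!done) {
    const { value, done: doneReading } = await reader.read();
    done = doneReading;
    // `value` is a Uint8Array, and undefined on the final read
    if (value) {
      onChunk(decoder.decode(value, { stream: true }));
    }
  }
}
```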
I think this is related to FlatList's re-render behavior.
Check this thread on vercel ai: https://github.com/vercel/ai/discussions/655#discussioncomment-9332890
Basically, RN is limited when it comes to displaying streams, and something extra has to be added to display the message as it streams in.
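From what I understand from that thread, RN's built-in fetch buffers the whole response body rather than exposing it as a stream, which would explain why the message only shows up at the end. One workaround people mention is `XMLHttpRequest`, which RN does implement: `onprogress` fires as `responseText` grows, so you can slice off the not-yet-seen suffix on each event. A rough sketch (`xhrStream` and `extractNewText` are hypothetical names, not from my app):

```typescript
// Pure helper: given the full responseText so far and how many
// characters have already been consumed, return only the new tail.
function extractNewText(fullText: string, seen: number): string {
  return fullText.slice(seen);
}

// Rough sketch of streaming over XMLHttpRequest. Each onprogress event
// yields only the text that arrived since the previous event.
function xhrStream(
  url: string,
  body: string,
  onChunk: (text: string) => void
): Promise<void> {
  return new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    let seen = 0;
    xhr.open("POST", url);
    xhr.setRequestHeader("Content-Type", "application/json");
    xhr.onprogress = () => {
      const chunk = extractNewText(xhr.responseText, seen);
      seen = xhr.responseText.length;
      if (chunk) onChunk(chunk);
    };
    xhr.onload = () => resolve();
    xhr.onerror = () => reject(new Error("network error"));
    xhr.send(body);
  });
}
```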