# Streaming Chat

Build your chatbot UI using our streaming chat endpoint, which is fully compatible with Vercel AI SDK hooks such as `useChat`.
## Overview

This endpoint provides a streaming chat mechanism that works seamlessly with the Vercel AI SDK. After creating a token for a user, you can use this endpoint to power real-time chat interfaces built with frameworks like React and the Vercel AI SDK's `useChat` hook.
## Endpoint

- **HTTP Method:** GET
- **URL:** `/chat/{token}/stream`

Replace `{token}` with the actual chat token assigned to the user.
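As a quick illustration, the stream URL can be built from a token like so (a minimal sketch; the `streamUrl` helper is hypothetical and not part of the API):

```typescript
// Hypothetical helper: builds the streaming chat URL for a given token.
function streamUrl(token: string): string {
  return `/chat/${encodeURIComponent(token)}/stream`;
}

console.log(streamUrl("abc123")); // prints "/chat/abc123/stream"
```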
## Features

- **Real-time streaming:** Stream chat messages to the client as they are generated.
- **Vercel AI SDK compatible:** Integrates easily with packages like `ai/react` for a reactive chat interface.
- **Robust error handling:** Supports keeping the last message on error.
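To show what consuming the stream looks like at the lowest level, here is a sketch that reads a streamed response body chunk by chunk with the standard `ReadableStream`/`TextDecoder` web APIs. The stream below is simulated locally for illustration; in practice you would pass `response.body` from a `fetch` call to the endpoint:

```typescript
// Read a streamed body chunk by chunk, decoding bytes to text as they arrive.
// In real usage `stream` would be `response.body` from a fetch to
// `/chat/{token}/stream`; here it is simulated for illustration.
async function readStream(stream: ReadableStream<Uint8Array>): Promise<string> {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let text = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    text += decoder.decode(value, { stream: true });
  }
  return text + decoder.decode(); // flush any buffered bytes
}

// Simulated stream standing in for a streaming chat response.
const simulated = new ReadableStream<Uint8Array>({
  start(controller) {
    const encoder = new TextEncoder();
    controller.enqueue(encoder.encode("Hello, "));
    controller.enqueue(encoder.encode("world!"));
    controller.close();
  },
});

readStream(simulated).then((text) => console.log(text)); // prints "Hello, world!"
```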
## Request

### Path Parameters

| Name | Required | Type | Description |
|---|---|---|---|
| `token` | Yes | string | The unique chat session token assigned to the user |
### HTTP Headers

`accept: application/json`

No body content is required for this GET request.
### Example Request
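A request might look like the following. The base URL and `YOUR_CHAT_TOKEN` are placeholders; substitute your own values. The `-N` flag disables curl's output buffering so streamed chunks appear as they arrive:

```shell
curl -N \
  -H "accept: application/json" \
  "https://api.example.com/chat/YOUR_CHAT_TOKEN/stream"
```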
## Response

The response is a real-time stream of chat messages. Each message contains the following fields:

- `role`: Identifies the sender (e.g., `user`, `assistant`).
- `content`: The text content of the message.
- `timestamp`: ISO 8601 formatted timestamp marking when the message was sent.

## Code Example

Below is an example of how to use this endpoint in a React component with the Vercel AI SDK's `useChat` hook:
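The sketch below assumes the `useChat` hook from `ai/react`; the component wiring and the way the token reaches the component are illustrative, and exact option names may vary by SDK version:

```tsx
"use client";

import { useChat } from "ai/react";

// Illustrative component; `token` would come from your own auth flow.
export function Chat({ token }: { token: string }) {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: `/chat/${token}/stream`,
    keepLastMessageOnError: true, // preserve the last message if the stream errors
  });

  return (
    <div>
      {messages.map((m) => (
        <div key={m.id}>
          <strong>{m.role}:</strong> {m.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Say something..."
        />
      </form>
    </div>
  );
}
```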
## Additional Notes

- **Vercel AI SDK integration:** For more details on integrating with the Vercel AI SDK, visit the Vercel AI SDK documentation.
- **Error handling:** The `keepLastMessageOnError` option ensures that if an error occurs while streaming, the last successful message is preserved until the issue is resolved.
- **Customization:** Feel free to customize the chat UI and error-handling logic according to your specific requirements.