In addition to pagination, the ChatBotKit API supports streaming for certain actions. Streaming allows the caller to receive data in small chunks until the stream is exhausted, which can be an efficient way to pull large quantities of data from the API.

To activate streaming, the caller must supply an Accept header with the value application/jsonl. With this header set, the API delivers its response as a continuous stream of JSON lines (JSONL), one JSON object per line.
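As a minimal sketch of a streaming client, the helper below requests an endpoint with the Accept: application/jsonl header and yields one parsed object per line. The base URL and the use of urllib are assumptions for illustration; any HTTP client that supports incremental reads works the same way.

```python
import json
import urllib.request

API_BASE = "https://api.chatbotkit.com"  # assumed base URL, for illustration only


def stream_jsonl(path, token):
    """Yield parsed JSON objects from a JSONL-streamed ChatBotKit endpoint."""
    req = urllib.request.Request(
        API_BASE + path,
        headers={
            "Authorization": f"Token {token}",
            "Accept": "application/jsonl",  # activates streaming
        },
    )
    with urllib.request.urlopen(req) as resp:
        for raw in resp:  # the response is file-like: one line (bytes) at a time
            line = raw.strip()
            if line:
                yield json.loads(line)
```

Because the generator parses each line as it arrives, the caller can process records long before the full response has been transferred.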

Streaming Example: Conversation List

To stream data using the /v1/conversation/list route, you can make the following HTTP request:

GET /v1/conversation/list HTTP/1.1
Host:
Authorization: Token {your_token_here}
Accept: application/jsonl

This request sets the Accept header to application/jsonl, indicating that the response should be streamed in JSON Lines format. The API sends one JSON line per record until all the data has been delivered.
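Consuming such a response amounts to parsing one JSON object per line. The snippet below simulates a streamed body with io.StringIO; the id and name fields are illustrative, not the documented conversation schema.

```python
import io
import json

# Simulated JSONL response body, as the API might stream it
# (field names here are illustrative, not the actual schema).
raw = io.StringIO(
    '{"id": "conv_1", "name": "Support chat"}\n'
    '{"id": "conv_2", "name": "Sales chat"}\n'
)

conversations = []
for line in raw:
    line = line.strip()
    if line:  # skip any blank keep-alive lines
        conversations.append(json.loads(line))

print(len(conversations))  # 2
```

The same loop works unchanged against a live HTTP response object, since both are iterated line by line.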

Streaming Example: Tokens and Events

Streaming can also be used to receive real-time updates and events from the ChatBotKit conversational AI engine. By calling the /v1/conversation/{conversationId}/complete route with streaming enabled, you can receive tokens and other events related to a specific conversation.

To stream tokens and events, you need to make a POST request to the /v1/conversation/{conversationId}/complete route. The request should include a JSON payload with a text parameter corresponding to the user's input:

POST /v1/conversation/{conversationId}/complete HTTP/1.1
Host:
Authorization: Token {your_token_here}
Content-Type: application/json
Accept: application/jsonl

{
  "text": "User input goes here"
}

This request will initiate the streaming of tokens and events related to the specified conversation. The API will continuously send updates until the conversation is completed or terminated.
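A client typically accumulates the token events into the assistant's reply as they arrive. The sketch below simulates such an event stream; the type, token, and text field names are assumptions for illustration and not the documented event schema.

```python
import io
import json

# Simulated event stream from the /complete route; the "type" and "token"
# field names are illustrative, not the documented schema.
stream = io.StringIO(
    '{"type": "token", "token": "Hello"}\n'
    '{"type": "token", "token": " there"}\n'
    '{"type": "result", "text": "Hello there"}\n'
)

reply = ""
for line in stream:
    event = json.loads(line)
    if event["type"] == "token":
        reply += event["token"]  # tokens arrive incrementally

print(reply)  # Hello there
```

Appending each token as it arrives is what lets a UI render the response progressively instead of waiting for the final result event.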

Streaming can be a powerful alternative to pagination, allowing you to receive data in real time and handle large quantities of information efficiently.