# Next.js SDK Documentation
The ChatBotKit Next.js SDK provides seamless integration of AI-powered chat capabilities into Next.js applications. It offers built-in support for the Edge Runtime, streaming responses, and React components for rapid development.
## Key Features
- Edge Runtime compatibility
- Server-side streaming support
- TypeScript support
- Real-time response streaming
## Installation
```shell
# Install core SDK, Next.js integration, and React components
npm install @chatbotkit/sdk @chatbotkit/next @chatbotkit/react

# or using yarn
yarn add @chatbotkit/sdk @chatbotkit/next @chatbotkit/react
```
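The server-side examples in this guide read the API secret from an environment variable. Assuming the variable name used in those examples, a local setup might look like:

```shell
# .env.local (do not commit this file)
CHATBOTKIT_API_SECRET=your-secret-here
```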
## Server-Side Integration
### Basic API Route Setup
Create a server-side API route to handle chat conversations and stream responses using the ChatBotKit SDK with Next.js Edge Runtime support.
```typescript
// app/api/conversation/route.ts
import { ChatBotKit } from '@chatbotkit/sdk'
import { stream } from '@chatbotkit/next/edge'

// Opt this route into the Edge Runtime
export const runtime = 'edge'

const cbk = new ChatBotKit({
  secret: process.env.CHATBOTKIT_API_SECRET!,
})

export async function POST(req: Request) {
  const { messages } = await req.json()

  return stream(
    cbk.conversation.complete(null, {
      messages,
      model: 'gpt-4',
      temperature: 0.7,
    })
  )
}
```
### Configuration Options
The server-side integration supports various configuration options:
```typescript
type CompleteOptions = {
  messages: Message[]
  model?: string
  temperature?: number
  maxTokens?: number
  stopSequences?: string[]
  context?: string
  stream?: boolean
}
```
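A common pattern is to centralize defaults for these options so every route passes a consistent configuration. The sketch below is illustrative only — the default values shown are assumptions for the example, not the SDK's own defaults — and it restates simplified local copies of the types to stay self-contained:

```typescript
// Simplified local copies of the types above, for a self-contained example
type Message = { id: string; type: 'user' | 'bot'; text: string }

type CompleteOptions = {
  messages: Message[]
  model?: string
  temperature?: number
  maxTokens?: number
  stream?: boolean
}

// Merge caller options over illustrative defaults; caller-supplied values win
export function withDefaults(options: CompleteOptions): CompleteOptions {
  return {
    model: 'gpt-4',
    temperature: 0.7,
    maxTokens: 1024,
    stream: true,
    ...options,
  }
}
```

A route handler could then call `cbk.conversation.complete(null, withDefaults({ messages }))` and override individual fields per request.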
## Best Practices

- **Error Handling**
  - Implement proper error boundaries
  - Handle network errors gracefully
  - Provide feedback for rate limiting
- **Performance**
  - Use Edge Runtime for better performance
  - Implement proper loading states
  - Consider message batching for long conversations
- **User Experience**
  - Show typing indicators
  - Implement retry mechanisms
  - Provide clear feedback for errors
- **Security**
  - Validate user input
  - Implement rate limiting
  - Secure API endpoints
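The rate-limiting advice above can be sketched with a small in-memory sliding-window limiter. This is illustrative only: in-memory state is per-instance, so a real Edge deployment would need a shared store (e.g. Redis or a hosted rate-limit service), and the class and method names here are not part of the SDK.

```typescript
// Illustrative sliding-window rate limiter (in-memory, per-instance only)
export class RateLimiter {
  private hits = new Map<string, number[]>()

  constructor(
    private limit: number, // max requests allowed per window
    private windowMs: number // window length in milliseconds
  ) {}

  // Returns true if the request identified by `key` is allowed right now
  allow(key: string, now = Date.now()): boolean {
    const cutoff = now - this.windowMs
    // Keep only timestamps still inside the window
    const recent = (this.hits.get(key) ?? []).filter((t) => t > cutoff)

    if (recent.length >= this.limit) {
      this.hits.set(key, recent)
      return false
    }

    recent.push(now)
    this.hits.set(key, recent)
    return true
  }
}
```

In a route handler, `key` would typically be the caller's IP address or a user/session identifier, and a `false` result would map to a `429` response.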
## TypeScript Support
The SDK includes comprehensive TypeScript definitions:
```typescript
interface Message {
  id: string
  type: 'user' | 'bot'
  text: string
  metadata?: Record<string, any>
}

interface ConversationManagerOptions {
  endpoint: string
  initialMessages?: Message[]
  onError?: (error: Error) => void
  onStreamStart?: () => void
  onStreamToken?: (token: string) => void
  onStreamComplete?: () => void
}
```
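As a small usage example of the `Message` type, here is a hypothetical helper that caps conversation history (relevant to the message-batching advice above) while preserving the initial welcome message. The helper name and trimming policy are illustrative, not part of the SDK:

```typescript
interface Message {
  id: string
  type: 'user' | 'bot'
  text: string
  metadata?: Record<string, any>
}

// Keep at most `maxMessages` entries: the first message plus the most recent ones
export function trimHistory(messages: Message[], maxMessages: number): Message[] {
  if (messages.length <= maxMessages) return messages
  return [messages[0], ...messages.slice(messages.length - (maxMessages - 1))]
}
```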
## Examples
### Complete Chat Implementation
```typescript
'use client'

import { useConversationManager, AutoTextarea } from '@chatbotkit/react'

export default function Chat() {
  const { thinking, text, setText, messages, submit, error } =
    useConversationManager({
      // Points at the route handler from the Server-Side Integration section
      endpoint: '/api/conversation',
      initialMessages: [
        {
          id: 'welcome',
          type: 'bot',
          text: 'Hello! How can I help you today?',
        },
      ],
    })

  const handleSubmit = async () => {
    if (!text.trim()) return

    await submit()
  }

  return (
    <div className="chat-container">
      <div className="messages">
        {messages.map(({ id, type, text }) => (
          <div key={id} className={`message ${type}`}>
            {text}
          </div>
        ))}
        {thinking && <div className="thinking">Bot is typing...</div>}
        {error && <div className="error">{error.message}</div>}
      </div>
      <div className="input-container">
        <AutoTextarea
          value={text}
          onChange={(e) => setText(e.target.value)}
          onKeyDown={(e) => {
            if (e.key === 'Enter' && !e.shiftKey) {
              e.preventDefault()
              handleSubmit()
            }
          }}
          placeholder="Type your message..."
          maxRows={5}
          className="chat-input"
        />
        <button onClick={handleSubmit} disabled={thinking || !text.trim()}>
          Send
        </button>
      </div>
    </div>
  )
}
```
This documentation provides a comprehensive guide to implementing ChatBotKit in a Next.js application, with particular focus on streaming capabilities and Edge Runtime support.