
AI Chatbot Agent Skill

Chat UI components, streaming responses with AI SDK, context window management, and RAG patterns.

ai · chatbot · streaming · rag

The Skill

Full content, every format. Copy it, download it, or install with one command.

SKILL.md
---
description: Chat UI components, streaming responses with AI SDK, context window management, and RAG patterns.
homepage: https://yepapi.com/skills/ai-chatbot
metadata:
  tags: [ai, chatbot, streaming, rag]
---

# AI Chatbot

## Rules

- Use the Vercel AI SDK for streaming — `useChat()` hook on the client (exported from `@ai-sdk/react`), `streamText()` from the `ai` package on the server
- Stream responses with `text/event-stream` — never buffer full responses before sending
- Message format: `{ role: "user" | "assistant" | "system", content: string }`
- Context window management: trim oldest messages when approaching token limit, always keep system prompt
- RAG pattern: embed user query → vector search → inject top-k results as context before the user message
- Chat UI: auto-scroll to bottom, loading indicator with animated dots, render markdown in assistant messages
- Persist conversation history server-side — use a `messages` table with `chatId`, `role`, `content`, `createdAt`
- System prompt goes first in the messages array — never let user messages override it
- YepAPI chat endpoint: `POST /v1/ai/chat` with `{ messages, model?, stream? }`
- Stop generation button — use `AbortController` to cancel in-flight streams

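The context-window rule above can be sketched as a pure function. This is a minimal sketch: the 4-characters-per-token estimate is a rough heuristic, and the `trimToBudget` / `estimateTokens` names are illustrative, not AI SDK APIs.

```typescript
// Trim to a token budget: always keep system messages, then walk the
// remaining turns newest-first so the most recent context survives.
type Message = { role: "user" | "assistant" | "system"; content: string };

// Rough heuristic: ~4 characters per token (not an exact tokenizer).
const estimateTokens = (text: string): number => Math.ceil(text.length / 4);

function trimToBudget(messages: Message[], maxTokens: number): Message[] {
  const system = messages.filter((m) => m.role === "system");
  const rest = messages.filter((m) => m.role !== "system");
  let total = system.reduce((n, m) => n + estimateTokens(m.content), 0);
  const kept: Message[] = [];
  // Newest-first: stop adding once the budget would be exceeded,
  // which drops the oldest turns first.
  for (let i = rest.length - 1; i >= 0; i--) {
    total += estimateTokens(rest[i].content);
    if (total > maxTokens) break;
    kept.unshift(rest[i]);
  }
  return [...system, ...kept];
}
```

In production you would swap `estimateTokens` for the tokenizer matching your model, but the shape — system prompt pinned, oldest turns evicted — stays the same.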
## Patterns

```tsx
// Client — useChat from the Vercel AI SDK (the hook lives in @ai-sdk/react)
import { useChat } from "@ai-sdk/react";

const { messages, input, handleInputChange, handleSubmit, isLoading, stop } = useChat({
  api: "/api/chat",
});
```

```ts
// Server — streaming response (provider shown with @ai-sdk/openai as one example)
import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";

export async function POST(req: Request) {
  const { messages } = await req.json();
  const result = streamText({
    model: openai(process.env.AI_MODEL ?? "gpt-4o-mini"), // model id from env, per the rules
    messages,
  });
  return result.toDataStreamResponse();
}
```
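The RAG rule (embed query → vector search → inject top-k results) can be sketched for the injection step. This is a sketch under assumptions: retrieval itself is abstracted away, the retrieved chunks arrive as plain strings, and injecting them as a system-role context message just before the latest user message is one common convention, not the only one.

```typescript
// Splice retrieved context into the message array immediately before the
// final (user) message, so the model sees it as fresh grounding.
type Message = { role: "user" | "assistant" | "system"; content: string };

function injectContext(messages: Message[], chunks: string[]): Message[] {
  if (chunks.length === 0 || messages.length === 0) return messages;
  const context: Message = {
    role: "system",
    content:
      "Answer using this context:\n" +
      chunks.map((c, i) => `[${i + 1}] ${c}`).join("\n"),
  };
  const last = messages.length - 1;
  return [...messages.slice(0, last), context, messages[last]];
}
```

The caller runs the embedding and vector search first, then passes the top-k chunk texts here before handing the array to `streamText()`.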

## Avoid

- Sending the entire conversation history without trimming — it will eventually exceed the context window
- Blocking UI while waiting for full response — always stream
- Storing messages only in client state — persist server-side for history
- Hardcoding model names — use environment variables or config
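The last point can be sketched as a tiny resolver. The `AI_MODEL` variable name and the fallback id here are illustrative assumptions, not part of any SDK.

```typescript
// Resolve the model id from configuration instead of hardcoding it,
// with a safe default when the variable is unset or blank.
function resolveModelId(env: Record<string, string | undefined>): string {
  const id = env.AI_MODEL?.trim();
  return id ? id : "gpt-4o-mini"; // fallback id is an example, not a recommendation
}

// Typical call site on the server: resolveModelId(process.env)
```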


Why Use the AI Chatbot Skill?

Without this skill, your AI guesses at AI chatbot patterns. It might hallucinate deprecated APIs, use outdated conventions, or miss best practices entirely. With it, your AI follows a proven ruleset — every suggestion aligns with current standards.

Drop this skill into your project and your AI instantly knows the rules. Better code suggestions, fewer errors, faster shipping.

Try These Prompts

These prompts work better with the AI Chatbot skill installed. Your AI knows the context and writes code that fits.

"Build a chat UI with streaming responses using the Vercel AI SDK"

"Create a RAG chatbot that answers questions from uploaded documents"

"Add a context-aware support chatbot with conversation history and suggested replies"

AI Chatbot skill — FAQ

What does the skill include?

It includes chat UI components, streaming response patterns with AI SDK, context window management, and RAG implementation. Your AI builds production-ready chatbot interfaces with proper state handling.

How do I install it?

Run `npx skills add YepAPI/skills --skill ai-chatbot` in your project root. This copies the skill file into your repo where your AI coding tool can read it automatically.

Does it work with the Vercel AI SDK?

Yes. It provides patterns for the Vercel AI SDK including useChat, streaming responses, and message history management. It also covers RAG patterns for context-aware responses.

Want more skills?

Browse all 110 free skills for builders.