Context7

Context7 is an MCP server that pulls live, version-specific documentation from indexed libraries and injects it directly into your AI coding assistant’s prompt. This means when you ask Cursor or Claude to help you build with Smallest AI, it references current API docs — not stale training data.

Setup

1. Install the Context7 MCP server

Follow the Context7 installation guide to add the MCP server to your preferred AI coding assistant (Cursor, Claude Code, Windsurf, etc.).
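The exact configuration varies by client; see the Context7 installation guide for the current instructions for your tool. As a rough sketch, most MCP clients accept a JSON entry along these lines (assuming the npx-distributed Context7 package, `@upstash/context7-mcp`, and a client config file such as Cursor's `mcp.json`):

```json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"]
    }
  }
}
```

After adding the entry, restart your assistant so it picks up the new MCP server.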

2. Use it in your prompts

Add use context7 to the end of any prompt involving Smallest AI:

Stream audio from Smallest AI Lightning TTS using WebSockets in Python. use context7
Transcribe audio in real time using Smallest AI Pulse STT. use context7

Context7 then fetches the latest Smallest AI documentation and includes it in the context sent to the LLM, so the generated code reflects current endpoints and parameters.

What gets indexed

Context7 indexes the full Smallest AI developer documentation, including:

  • Text to Speech (Lightning) — streaming, WebSocket, HTTP, voice configuration
  • Speech to Text (Pulse) — real-time and pre-recorded transcription, features
  • Authentication, rate limits, and API reference
  • Integration guides (Pipecat, LiveKit, Vercel AI SDK, and more)

Find the Smallest AI library on Context7 at context7.com/llmstxt/smallest_ai_llms_txt.