Getting Started


This guide walks you through installing the SDK, writing your first intelligent agent, and running it.

Prerequisites

An OpenAI API key is required. Set it as an environment variable before running your agent:

$ export OPENAI_API_KEY="your-key-here"
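Because the agent reads the key at startup, it is worth failing fast with a clear message when the variable is unset. A minimal sketch (the `require_api_key` helper is illustrative, not part of the SDK):

```python
import os

def require_api_key() -> str:
    """Return the OpenAI API key, or raise a clear error if it is unset."""
    key = os.getenv("OPENAI_API_KEY")
    if not key:
        raise RuntimeError(
            "OPENAI_API_KEY is not set; export it before starting the agent."
        )
    return key
```

Calling this once at startup turns a confusing mid-session authentication failure into an immediate, actionable error.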

Installation

$ pip install smallestai

Write Your First Agent

Create two files: one for the agent logic, and one to run the application.

1. Create my_agent.py

Subclass OutputAgentNode and implement generate_response() to stream LLM output.

my_agent.py

import os
from smallestai.atoms.agent.nodes import OutputAgentNode
from smallestai.atoms.agent.clients.openai import OpenAIClient

class MyAgent(OutputAgentNode):
    def __init__(self):
        super().__init__(name="my-agent")
        self.llm = OpenAIClient(
            model="gpt-4o-mini",
            api_key=os.getenv("OPENAI_API_KEY")
        )

    async def generate_response(self):
        response = await self.llm.chat(
            messages=self.context.messages,
            stream=True
        )
        async for chunk in response:
            if chunk.content:
                yield chunk.content
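Note that generate_response() is an async generator: it yields text chunks as they arrive instead of waiting for the full completion. The same shape, stripped of SDK specifics (fake_stream below is a stand-in for the LLM client's streaming response, not a real API), looks like this:

```python
import asyncio
from typing import AsyncIterator

async def fake_stream(text: str) -> AsyncIterator[str]:
    # Stand-in for an LLM client's streaming response.
    for word in text.split():
        await asyncio.sleep(0)  # yield control, as real network I/O would
        yield word + " "

async def generate_response() -> AsyncIterator[str]:
    # Same pattern as MyAgent.generate_response: iterate and re-yield chunks.
    async for chunk in fake_stream("Hello from the agent"):
        if chunk:
            yield chunk

async def main() -> str:
    parts = []
    async for chunk in generate_response():
        parts.append(chunk)
    return "".join(parts).strip()

print(asyncio.run(main()))  # Hello from the agent
```

The caller sees partial text as soon as the model produces it, which is what makes low-latency voice output possible.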
2. Create main.py

Wire up AtomsApp with a setup_handler that adds your agent to the session.

main.py

from smallestai.atoms.agent.server import AtomsApp
from smallestai.atoms.agent.session import AgentSession
from my_agent import MyAgent

async def on_start(session: AgentSession):
    session.add_node(MyAgent())
    await session.start()
    await session.wait_until_complete()

if __name__ == "__main__":
    app = AtomsApp(setup_handler=on_start)
    app.run()
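The setup_handler inverts control: you hand AtomsApp an async function, and the app calls it with a fresh session for each connection. A toy sketch of that pattern (App and Session here are simplified stand-ins, not the real AtomsApp and AgentSession classes):

```python
import asyncio

class Session:
    """Toy stand-in for AgentSession."""
    def __init__(self):
        self.nodes = []
        self.started = False

    def add_node(self, node):
        self.nodes.append(node)

    async def start(self):
        self.started = True

class App:
    """Toy stand-in for AtomsApp: stores the handler, invokes it per session."""
    def __init__(self, setup_handler):
        self.setup_handler = setup_handler

    async def handle_connection(self):
        session = Session()
        await self.setup_handler(session)  # your on_start runs here
        return session

async def on_start(session):
    session.add_node("my-agent")
    await session.start()

app = App(setup_handler=on_start)
session = asyncio.run(app.handle_connection())
print(session.started, session.nodes)  # True ['my-agent']
```

This is why your handler receives a session argument rather than creating one itself: the framework owns the session lifecycle.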

Your entry point can be named anything (app.py, run.py, etc.). When deploying, specify it with --entry-point your_file.py.

Want a greeting? Register a handler with @session.on_event inside your setup handler to speak when the user joins. Remember to import SDKSystemUserJoinedEvent; agent here is the MyAgent instance you added to the session:

@session.on_event("on_event_received")
async def on_event(_, event):
    if isinstance(event, SDKSystemUserJoinedEvent):
        agent.context.add_message({"role": "assistant", "content": "Hello!"})
        await agent.speak("Hello! How can I help?")

Adding the greeting to context ensures the LLM knows the conversation has started.
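To see why, consider what the LLM receives on the next turn: the full message history. If the spoken greeting is never recorded, the model believes the conversation opens with the user's first message. A plain-Python sketch of that bookkeeping (the message-list shape mirrors self.context.messages, but this Context class is illustrative, not the SDK's):

```python
class Context:
    """Minimal conversation history, illustrative only."""
    def __init__(self):
        self.messages = []

    def add_message(self, message: dict):
        self.messages.append(message)

ctx = Context()
ctx.add_message({"role": "assistant", "content": "Hello!"})
ctx.add_message({"role": "user", "content": "Hi, I need help."})

# The next LLM call sees the greeting first, so the model knows it already spoke.
print([m["role"] for m in ctx.messages])  # ['assistant', 'user']
```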

Run Your Agent

Once your files are ready, you have two options: run locally during development, or deploy.

For development and testing, run the file directly:

$ python main.py

This starts a WebSocket server on localhost:8080. In a separate terminal, connect to it:

$ smallestai agent chat

No account or deployment needed.

What’s Next?