---
title: Defining Tools
description: Create functions that your agent can call.
---

Tools are Python methods marked with `@function_tool`. The SDK automatically generates the schema the LLM needs based on your function's signature and docstring.

## Basic Tool

The `@function_tool()` decorator marks a method as callable by the LLM.

```python
from smallestai.atoms.agent.tools import function_tool


class MyAgent(OutputAgentNode):
    @function_tool()
    def get_order_status(self, order_id: str):
        """Get the current status of an order.

        Args:
            order_id: The order ID, e.g. "ORD-12345"
        """
        order = self.db.get_order(order_id)
        return {"status": order.status, "eta": order.eta}
```

The LLM sees this as a callable function with a description and typed parameters.

## How It Works

1. **`@function_tool()`** marks the method as a tool
2. **Type hints** tell the LLM what types to pass
3. **Docstring** explains when and how to use the tool
4. **Return value** is sent back to the LLM

## Docstring Format

The LLM reads your docstring to understand when to call the tool.

```python
from typing import Optional


@function_tool()
def search_products(self, query: str, category: Optional[str] = None):
    """Search the product catalog.

    Use when a customer asks about products or pricing.

    Args:
        query: Search terms, e.g. "wireless headphones"
        category: Optional filter, e.g. "electronics"
    """
    return self.catalog.search(query, category=category)
```

| Part             | Purpose                          |
| ---------------- | -------------------------------- |
| First line       | Brief description (shown to LLM) |
| Second paragraph | When to use this tool            |
| Args             | Parameter descriptions           |

## Parameter Types

Type hints define the schema the LLM uses to pass arguments.
```python
@function_tool()
def example(
    self,
    text: str,                # String
    count: int,               # Integer
    price: float,             # Decimal
    enabled: bool,            # Boolean
    items: list,              # Array
    config: dict,             # Object
    priority: str = "normal"  # Optional with default
):
    """Example with various types."""
    pass
```

### Enum Parameters

Constrain values with `Literal` from `typing`.

```python
from typing import Literal


@function_tool()
def set_priority(self, ticket_id: str, level: Literal["low", "medium", "high"]):
    """Set ticket priority.

    Args:
        ticket_id: Ticket ID
        level: Priority level
    """
    return self.tickets.update(ticket_id, level)
```

## Async Tools

Define tools with `async def` for database calls, HTTP requests, or other I/O.

```python
@function_tool()
async def search_database(self, query: str):
    """Search the customer database.

    Args:
        query: Customer name or email
    """
    results = await self.db.search(query)
    return [{"name": r.name, "email": r.email} for r in results]
```

## Accessing Session State

Tools can read `self.` attributes for user data, DB connections, etc.

```python
class MyAgent(OutputAgentNode):
    def __init__(self):
        super().__init__(name="my-agent")
        self.db = Database()
        self.user_tier = "standard"

    @function_tool()
    def get_discount(self, product_id: str):
        """Get available discount for a product.

        Args:
            product_id: Product ID
        """
        # Access agent state
        if self.user_tier == "premium":
            return {"discount": 20}
        return {"discount": 0}
```

## Error Handling

Return `{"error": "message"}` so the LLM can respond gracefully.

```python
@function_tool()
def get_order(self, order_id: str):
    """Get order details.

    Args:
        order_id: Order ID like "ORD-12345"
    """
    try:
        order = self.db.get_order(order_id)
        if not order:
            return {"error": "Order not found"}
        return order.to_dict()
    except Exception:
        return {"error": "Unable to retrieve order"}
```

The LLM will say something like "I couldn't find that order" instead of the conversation breaking.

## Tips

The LLM reads tool descriptions every turn.
Keep descriptions short: fewer tokens per turn means faster responses.

Use clear, verb-based tool names. Good: `get_order_status`, `create_appointment`, `search_products`. Bad: `order`, `handle_booking`.

Tell the LLM when to use tools in your prompt: "Use `get_order_status` when the user asks about an order."
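For intuition about what the SDK generates from your signature and docstring, here is a minimal, hypothetical sketch of schema derivation using only the standard library. It is illustrative, not the SDK's actual implementation; the function names (`build_schema`, `PY_TO_JSON`) are made up for this example.

```python
import inspect
from typing import get_type_hints

# Rough mapping from Python type hints to JSON Schema types.
PY_TO_JSON = {str: "string", int: "integer", float: "number",
              bool: "boolean", list: "array", dict: "object"}


def build_schema(fn):
    """Derive a tool schema from a function's signature and docstring.

    Hypothetical sketch of what a decorator like @function_tool
    automates: type hints become parameter types, parameters without
    defaults become required, and the first docstring line becomes
    the tool description.
    """
    hints = get_type_hints(fn)
    hints.pop("return", None)
    properties, required = {}, []
    for name, param in inspect.signature(fn).parameters.items():
        if name == "self":
            continue
        py_type = hints.get(name, str)
        properties[name] = {"type": PY_TO_JSON.get(py_type, "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)
    doc = inspect.getdoc(fn) or ""
    return {
        "name": fn.__name__,
        "description": doc.split("\n")[0],  # first docstring line
        "parameters": {
            "type": "object",
            "properties": properties,
            "required": required,
        },
    }


def get_order_status(self, order_id: str, verbose: bool = False):
    """Get the current status of an order."""


schema = build_schema(get_order_status)
```

Here `order_id` ends up required (no default) while `verbose` does not, which is why getting type hints and defaults right matters: they are the only contract the LLM sees.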