# Anuma Starter Agent
An interactive CLI chat agent built with the Anuma SDK. Supports streaming responses, model switching, and client-side tool execution via the SDK's `runToolLoop`.
## Getting Started
### Create an Anuma app
Sign in at dashboard.anuma.ai and create an app. This provisions the API account that powers AI responses.
### Clone and install
```sh
git clone https://github.com/anuma-ai/starter-agent.git
cd starter-agent
pnpm install
```

### Save your API key
Copy the API key from the dashboard and save it:
```sh
pnpm agent login --api-key <your-key>
```

## Usage
Start a chat session:
```sh
pnpm agent chat
```

Options:
```
--model <name>      Model to use (default: "openai/gpt-4o")
--system <prompt>   System prompt
--api-url <url>     API base URL
--no-tools          Disable client-side tools
```

### Chat commands
- `/model` — open the model picker (fuzzy search)
- `/model <name>` — switch to a model by name
- `/exit` — quit the session
## Adding tools
Tools live in `src/tools/`. Each tool is a `ToolConfig` object with a function schema and an executor that runs locally when the model calls it.
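Conceptually, client-side execution means the agent matches each tool call emitted by the model to a locally registered executor and runs it in-process. A minimal sketch of that dispatch step (hypothetical names, not the SDK's actual `runToolLoop` implementation):

```ts
// Conceptual sketch only: how a tool call is routed to a local executor.
export type Executor = (args: Record<string, unknown>) => Promise<unknown>;

export interface ToolCall {
  name: string;
  arguments: Record<string, unknown>;
}

export async function dispatchToolCall(
  executors: Map<string, Executor>,
  call: ToolCall,
): Promise<unknown> {
  const executor = executors.get(call.name);
  if (!executor) throw new Error(`Unknown tool: ${call.name}`);
  // The executor runs on the user's machine; its return value is serialized
  // and sent back to the model as the tool result.
  return executor(call.arguments);
}
```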
Create a new file in `src/tools/`:
```ts
import type { ToolConfig } from "@anuma/sdk/server";

export const myTool: ToolConfig = {
  type: "function",
  function: {
    name: "my_tool",
    description: "What this tool does",
    parameters: {
      type: "object",
      properties: {
        arg: { type: "string", description: "Argument description" },
      },
      required: ["arg"],
    },
  },
  executor: async ({ arg }) => {
    // Your logic here — runs on the user's machine
    return { result: "..." };
  },
};
```

Then register it in `src/tools/index.ts`:
```ts
import type { ToolConfig } from "@anuma/sdk/server";
import { myTool } from "./my-tool.js";

export const tools: ToolConfig[] = [listFiles, myTool];
```

The included `list_files` tool demonstrates this pattern — it reads the local filesystem, something only a client-side tool can do.
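For comparison, here is a self-contained sketch of a `list_files`-style tool. The schema shape mirrors the example above, but the field names, the `path` parameter, and the executor's return shape are assumptions for illustration, not taken from the bundled tool:

```ts
import { readdirSync, statSync } from "node:fs";
import { join } from "node:path";

// In the starter project this object would be typed as ToolConfig from
// "@anuma/sdk/server"; the type is omitted so the sketch runs standalone.
export const listFilesSketch = {
  type: "function",
  function: {
    name: "list_files",
    description: "List entries in a local directory",
    parameters: {
      type: "object",
      properties: {
        path: { type: "string", description: "Directory to list (defaults to cwd)" },
      },
      required: [],
    },
  },
  // Runs on the user's machine; the model only sees the returned JSON.
  executor: async ({ path = "." }: { path?: string }) => {
    const entries = readdirSync(path).map((name) => {
      const stats = statSync(join(path, name));
      return {
        name,
        kind: stats.isDirectory() ? "directory" : "file",
        size: stats.size,
      };
    });
    return { entries };
  },
};
```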
## Build
```sh
pnpm build
```

After building, the CLI is available as `anuma-agent` (via the `bin` field in `package.json`).