
useChatStorage

useChatStorage(options: object): UseChatStorageResult

Defined in: src/react/useChatStorage.ts:787 

A React hook that wraps useChat with automatic message persistence using WatermelonDB.

This hook provides all the functionality of useChat plus automatic storage of messages and conversations to a WatermelonDB database. Messages are automatically saved when sent and when responses are received.

Parameters

Parameter | Type | Description

options

object

Configuration options

options.apiType?

ApiType

Which API endpoint to use. Default: "responses"

  • "responses": OpenAI Responses API (supports thinking, reasoning, conversations)
  • "completions": OpenAI Chat Completions API (wider model compatibility)

options.autoCreateConversation?

boolean

Automatically create a new conversation if none is set (default: true)

options.autoEmbedMessages?

boolean

Automatically generate embeddings for messages after saving. Enables semantic search over past conversations via searchMessages().

Default

true

options.autoFlushOnKeyAvailable?

boolean

Automatically flush queued operations when the encryption key becomes available. Requires enableQueue to be true.

Default

true

options.baseUrl?

string

Base URL for the chat API endpoint

options.conversationId?

string

ID of an existing conversation to load and continue

options.database

Database

WatermelonDB database instance for storing conversations and messages

options.defaultConversationTitle?

string

Title for auto-created conversations (default: "New conversation")

options.embeddedWalletSigner?

EmbeddedWalletSignerFn

Function for silent signing with Privy embedded wallets. When provided, enables automatic encryption key derivation without user confirmation modals.

options.embeddingModel?

string

Embedding model to use when autoEmbedMessages is enabled.

Default

DEFAULT_API_EMBEDDING_MODEL

options.enableQueue?

boolean

Enable the in-memory write queue for operations when encryption key isn’t yet available. When enabled, operations are held in memory and flushed to encrypted storage once the key becomes available.

Default

true
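The queue semantics described above can be sketched as follows. This is an illustrative model only, not the SDK's implementation: the real queue holds database write operations and encrypts them on flush, both of which are elided here.

```typescript
// Illustrative model of the write queue: operations are held in memory
// while no encryption key is available, then flushed (FIFO) once it arrives.
class WriteQueue {
  private pending: string[] = [];
  private key: string | null = null;

  // Run the operation immediately if the key is available, otherwise hold it.
  enqueue(op: string, run: (op: string) => void): void {
    if (this.key !== null) {
      run(op);
    } else {
      this.pending.push(op);
    }
  }

  // Once the key arrives, flush everything held in memory, in order.
  setKey(key: string, run: (op: string) => void): void {
    this.key = key;
    for (const op of this.pending.splice(0)) {
      run(op);
    }
  }
}
```

With autoFlushOnKeyAvailable (above), the hook performs the setKey step for you as soon as the wallet-derived key is ready.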

options.fileProcessingOptions?

{ keepOriginalFiles?: boolean; maxFileSizeBytes?: number; onError?: (fileName: string, error: Error) => void; onProgress?: (current: number, total: number, fileName: string) => void; }

Options for file preprocessing behavior

options.fileProcessingOptions.keepOriginalFiles?

boolean

Whether to keep original file attachments (default: true)

options.fileProcessingOptions.maxFileSizeBytes?

number

Max file size to process in bytes (default: 10MB)

options.fileProcessingOptions.onError?

(fileName: string, error: Error) => void

Callback for errors (non-fatal)

options.fileProcessingOptions.onProgress?

(current: number, total: number, fileName: string) => void

Callback for progress updates

options.fileProcessors?

FileProcessor[] | null

File preprocessors to use for automatic text extraction.

  • undefined (default): Use all built-in processors (PDF, Excel, Word)
  • null or []: Disable preprocessing
  • FileProcessor[]: Use specific processors

options.getToken?

() => Promise<string | null>

Function to retrieve the auth token for API requests

options.getWalletAddress?

() => Promise<string | null>

Async function that returns the wallet address when available. Used for polling during Privy embedded wallet initialization. When the wallet isn’t ready yet, should return null.

options.mcpR2Domain?

string

R2 domain for identifying MCP-generated image URLs. When set, enables OPFS caching of generated images. Defaults to the hardcoded MCP_R2_DOMAIN from clientConfig.

options.minContentLength?

number

Minimum content length required to generate embeddings. Messages shorter than this are skipped as they provide limited semantic value.

Default

10
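The skip rule can be sketched as below. Whether the SDK's comparison is inclusive at exactly minContentLength, and whether it trims whitespace first, are assumptions made here for illustration.

```typescript
// Sketch of the embedding skip rule: very short messages (e.g. "ok")
// carry little semantic value and are not embedded.
const minContentLength = 10; // the documented default

function shouldEmbed(content: string): boolean {
  // Assumption: whitespace-trimmed length compared inclusively.
  return content.trim().length >= minContentLength;
}
```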

options.onData?

(chunk: string) => void

Callback invoked with each streamed response chunk

options.onError?

(error: Error) => void

Callback invoked when an error occurs during the request

options.onFinish?

(response: LlmapiResponseResponse) => void

Callback invoked when the response completes successfully

options.onServerToolCall?

(toolCall: ServerToolCallEvent) => void

Callback invoked when a server-side tool (MCP) is called during streaming. Use this to show activity indicators like “Searching…” in the UI.
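One way to turn these events into the activity indicators mentioned above. The event's fields and the tool names used here are hypothetical, chosen for illustration; the real ServerToolCallEvent shape may differ.

```typescript
// Hypothetical event shape; only onServerToolCall itself comes from the
// table above.
type ToolCallLike = { toolName: string };

// Map a tool call to a short status string for the UI.
function activityLabel(event: ToolCallLike): string {
  switch (event.toolName) {
    case "web_search":
      return "Searching…";
    case "image_generation":
      return "Generating image…";
    default:
      return `Running ${event.toolName}…`;
  }
}
```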

options.onThinking?

(chunk: string) => void

Callback invoked when thinking/reasoning content is received (from <think> tags or API reasoning)

options.onToolCallArgumentsDelta?

(event: ToolCallArgumentsDeltaEvent) => void

Called with partial tool call arguments as they stream in. Use for live preview of artifacts (HTML, slides) being generated.

options.serverTools?

{ cacheExpirationMs?: number; }

Configuration for server-side tools fetching and caching. Server tools are fetched from /api/v1/tools and cached in localStorage.

options.serverTools.cacheExpirationMs?

number

Cache expiration time in milliseconds (default: 86400000 = 1 day)
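The freshness check implied by cacheExpirationMs can be sketched as follows. The localStorage key and the stored record shape are internal to the SDK and are not modeled here.

```typescript
// A cached tool list is reused while younger than cacheExpirationMs.
const cacheExpirationMs = 86_400_000; // documented default: 1 day

function isCacheFresh(cachedAtMs: number, nowMs: number): boolean {
  // Assumption: strictly-less-than comparison at the boundary.
  return nowMs - cachedAtMs < cacheExpirationMs;
}
```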

options.signMessage?

SignMessageFn

Function to sign a message for encryption key derivation. Typically from Privy’s useSignMessage hook. Required together with walletAddress for field-level encryption.

options.walletAddress?

string

Wallet address for encrypted file storage and field-level encryption. When provided with signMessage, all sensitive message content, conversation titles, and media metadata are encrypted at rest using AES-GCM with wallet-derived keys.

Requires:

  • OPFS browser support (for file storage)
  • signMessage function (for encryption key derivation)

When not provided, data is stored in plaintext (backwards compatible).
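The enablement rule stated above (encryption only when both pieces are present, plaintext otherwise) can be expressed as a small predicate:

```typescript
// Field-level encryption is active only when BOTH walletAddress and
// signMessage are provided; with either missing, storage is plaintext.
type SignMessageFn = (message: string) => Promise<string>;

function encryptionEnabled(opts: {
  walletAddress?: string;
  signMessage?: SignMessageFn;
}): boolean {
  return Boolean(opts.walletAddress && opts.signMessage);
}
```

In an app, signMessage would typically come from Privy's useSignMessage hook, as noted in the signMessage entry above.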

Returns

UseChatStorageResult

An object containing chat state, methods, and storage operations

Example

```tsx
import { Database } from '@nozbe/watermelondb';
import { useChatStorage } from '@anuma/sdk/react';

function ChatComponent({ database }: { database: Database }) {
  const {
    isLoading,
    sendMessage,
    conversationId,
    getMessages,
    createConversation,
  } = useChatStorage({
    database,
    getToken: async () => getAuthToken(),
    onData: (chunk) => setResponse((prev) => prev + chunk),
  });

  const handleSend = async () => {
    const result = await sendMessage({
      content: 'Hello, how are you?',
      model: 'gpt-4o-mini',
      includeHistory: true, // Include previous messages from this conversation
    });
    if (result.error) {
      console.error('Error:', result.error);
    } else {
      console.log('User message stored:', result.userMessage);
      console.log('Assistant message stored:', result.assistantMessage);
    }
  };

  return (
    <div>
      <button onClick={handleSend} disabled={isLoading}>Send</button>
    </div>
  );
}
```