Start building with one prompt
Copy a prompt, paste it into your AI coding tool, and get a working Cerebe integration in minutes. Each prompt points your AI to our full documentation automatically.
1. Get your API key
2. Copy a prompt below
3. Paste into your AI tool
Claude Code
Paste into your Claude Code terminal session:

I want to add persistent memory to my app using Cerebe. First, read the full API documentation: Fetch https://cerebe.ai/llms-full.txt

Then:
1. Install the Cerebe Python SDK: pip install cerebe
2. Add memory storage that persists user preferences across sessions
3. Add semantic search to retrieve relevant memories
4. Use the async client (AsyncCerebe) for best performance

My API key is in the CEREBE_API_KEY environment variable.
Cursor
Paste into Cursor Composer or chat:

I want to integrate Cerebe (cerebe.ai) for persistent AI memory in my project.

Cerebe docs: https://cerebe.ai/llms-full.txt
Install: npm install @cerebe/sdk (TypeScript) or pip install cerebe (Python)

Please:
1. Read the docs URL above to understand the full API
2. Add Cerebe memory to my app: store user context and retrieve it semantically
3. Use the correct field names from the docs (the SDK matches the API exactly)
4. Set up the client with the API key from the CEREBE_API_KEY env var

Key endpoints: memory.add(), memory.search(), and memory.harvest() for extracting memories from conversations.
Lovable
Paste as your Lovable app prompt:

Build me a personal knowledge assistant app that uses Cerebe (cerebe.ai) as its memory backend.

The app should:
- Let users chat with an AI that remembers everything across sessions
- Store conversation memories using Cerebe's memory API
- Search past memories semantically when answering questions
- Show a "memory timeline" sidebar of what the AI remembers

Tech: Use the @cerebe/sdk npm package. API docs at https://cerebe.ai/llms-full.txt
The Cerebe API key should come from an environment variable, CEREBE_API_KEY.
Key SDK methods: client.memory.add(), client.memory.search(), client.memory.harvest()
Windsurf / Bolt
Paste into Windsurf Cascade or Bolt:

Add Cerebe cognitive memory to my project. Cerebe gives AI apps persistent memory, knowledge graphs, and meta-learning.
Full API docs: https://cerebe.ai/llms-full.txt
Setup:
- TypeScript: npm install @cerebe/sdk
- Python: pip install cerebe
Implement:
1. Initialize client: new Cerebe({ apiKey: process.env.CEREBE_API_KEY })
2. Store memories: client.memory.add({ content, sessionId, type: 'semantic' })
3. Search memories: client.memory.search({ query, sessionId })
4. Harvest from conversations: client.memory.harvest({ sessionId, transcript: [{role, content}] })
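The SDK parameters above (apiKey, sessionId) are camelCase and map to the API's snake_case field names. A small Python sketch of what that mapping looks like; the conversion helper here is illustrative, not the SDK's actual implementation:

```python
import re

def camel_to_snake(name: str) -> str:
    # Insert an underscore before each interior uppercase letter, then lowercase.
    return re.sub(r"(?<!^)(?=[A-Z])", "_", name).lower()

def to_api_params(params: dict) -> dict:
    # Convert SDK-style camelCase keys to the API's snake_case field names.
    return {camel_to_snake(k): v for k, v in params.items()}

# SDK-style call parameters -> wire-format fields
payload = to_api_params({"sessionId": "abc", "memoryType": "semantic"})
# payload == {"session_id": "abc", "memory_type": "semantic"}
```

In practice you never call such a helper yourself; the point is that the field names you see in the REST docs and the params you pass to the SDK are the same names in two casings.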
The SDK uses camelCase params that map to the API's snake_case automatically.

ChatGPT / Custom GPT
Use as instructions for a Custom GPT with Actions:

You are a coding assistant that helps developers integrate Cerebe, a cognitive memory platform for AI applications.

Cerebe API base URL: https://api.cerebe.ai
Authentication: X-API-Key header

Core endpoints:
- POST /api/v1/memory/store: Store a memory (content, session_id, memory_type, importance)
- POST /api/v1/memory/search: Search memories (query, session_id, limit)
- POST /api/v1/memory/harvest: Extract memories from conversation transcript
- POST /api/v1/knowledge/ingest: Add to knowledge graph
- POST /api/v1/knowledge/query: Query knowledge graph

SDKs: pip install cerebe (Python), npm install @cerebe/sdk (TypeScript)
Full docs: https://cerebe.ai/llms-full.txt

When helping users, always use the SDK rather than raw HTTP calls. Prefer async patterns.
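As a concrete illustration of the endpoint shape above, here is a minimal Python sketch that builds (but does not send) the HTTP request for POST /api/v1/memory/store using only the standard library. The URL, header, and field names come from the endpoint list; the importance default is an illustrative assumption. The docs recommend the SDK over raw HTTP, so treat this as a look under the hood rather than a pattern to copy:

```python
import json
import urllib.request

def build_store_request(api_key: str, content: str, session_id: str,
                        memory_type: str = "semantic",
                        importance: float = 0.5):  # default chosen for illustration
    """Build a request for POST /api/v1/memory/store (does not perform I/O)."""
    body = json.dumps({
        "content": content,
        "session_id": session_id,
        "memory_type": memory_type,
        "importance": importance,
    }).encode("utf-8")
    return urllib.request.Request(
        "https://api.cerebe.ai/api/v1/memory/store",
        data=body,
        headers={
            "X-API-Key": api_key,  # auth header from the docs above
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_store_request("sk-test", "User prefers dark mode", "session-42")
# Send with urllib.request.urlopen(req) once CEREBE_API_KEY is set.
```

The SDK's memory.add() call assembles essentially this request for you, with the camelCase-to-snake_case field mapping handled automatically.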
For AI agents directly
Your AI tool can also fetch our documentation programmatically:
https://cerebe.ai/llms.txt
https://cerebe.ai/llms-full.txt
llms.txt = page index · llms-full.txt = complete documentation