Now in public beta

The cognitive fabric for AI agents

Memory, knowledge graphs, prompt management, and meta-learning in one API. Drop Cerebe into your LangGraph agent and give it a brain.

$ pip install cerebe
from cerebe import AsyncCerebe

client = AsyncCerebe(api_key="ck_live_...")

# Store a memory
await client.memory.add(
    content="User prefers visual explanations",
    user_id="user_123",
    session_id="session_abc",
)

# Search memories
results = await client.memory.search(
    query="What does the user prefer?",
    session_id="session_abc",
)

Platform

The cognitive fabric for AI agents

Six services, one API. Everything your AI needs to remember, reason, and learn — ready for LangGraph, CrewAI, or any agent framework.

Core

Memory Fabric

Hybrid vector + graph memory that persists across sessions. Your AI genuinely remembers users, preferences, and context — not just within a conversation, but forever.

Core

Knowledge Graph

Temporal knowledge graphs powered by Graphiti. Entities and relationships that evolve over time with full provenance. Query what was true at any point in history.
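To make "query what was true at any point in history" concrete, here is a toy, self-contained illustration of point-in-time validity. This is not the Cerebe SDK (the real Graphiti-backed graph tracks validity intervals and provenance per edge); the fact tuples and `true_at` helper below are invented for illustration only.

```python
from datetime import date

# Toy temporal fact store: (subject, predicate, object, valid_from, valid_to).
# valid_to of None means the fact is still current.
facts = [
    ("user_123", "studying", "algebra", date(2024, 1, 1), date(2024, 6, 1)),
    ("user_123", "studying", "geometry", date(2024, 6, 1), None),
]

def true_at(facts, when):
    """Return the triples that were valid on the given date."""
    return [
        (s, p, o)
        for s, p, o, start, end in facts
        if start <= when and (end is None or when < end)
    ]

true_at(facts, date(2024, 3, 1))  # → [("user_123", "studying", "algebra")]
true_at(facts, date(2024, 7, 1))  # → [("user_123", "studying", "geometry")]
```

The same shape generalizes to edges in a graph: each relationship carries its own validity interval, so historical queries are just a filter on `when`.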

Intelligence

Memory-Aware LLM Router

OpenAI-compatible chat completions automatically enriched with relevant memories, knowledge, and cognitive context. Every response is informed by everything your AI knows.
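"OpenAI-compatible" means a request body in the standard chat-completions shape should work as-is. The sketch below builds such a body locally; the `user` field as the hook for per-user memory enrichment is an assumption for illustration, not a documented Cerebe parameter.

```python
import json

# A standard OpenAI-style chat-completions request body. Any OpenAI-compatible
# client should be able to send this by pointing its base URL at the router.
request = {
    "model": "gpt-4o",
    "messages": [
        {"role": "user", "content": "How should I explain fractions to this student?"},
    ],
    "user": "user_123",  # assumed: tells the router whose memories to pull in
}

body = json.dumps(request)
```

Because the wire format is unchanged, enrichment happens server-side: the router injects relevant memories and knowledge into context before the model sees the prompt, and the response comes back in the normal completions shape.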

DevEx

Prompt Service

Version-controlled prompt templates with variable substitution, A/B evaluation via Promptfoo, and domain-aware enrichment. Manage prompts as code and ship prompt changes without redeploying your app.
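As a rough illustration of what variable substitution in a versioned template looks like, here is a local stand-in. `render_template` and the `{{variable}}` syntax are assumptions for the sketch, not the Cerebe SDK; in practice the service would store, version, and render templates server-side.

```python
import re

def render_template(template: str, variables: dict) -> str:
    """Replace {{name}} placeholders with values; leave unknown names intact."""
    return re.sub(
        r"\{\{(\w+)\}\}",
        lambda m: str(variables.get(m.group(1), m.group(0))),
        template,
    )

prompt = render_template(
    "Explain {{topic}} using {{style}} examples.",
    {"topic": "fractions", "style": "visual"},
)
# prompt == "Explain fractions using visual examples."
```

Keeping templates out of application code is the point: the app asks for a named, versioned template at runtime, so prompt iteration and A/B tests never require a deploy.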

Intelligence

Meta-Learning (PLRE)

Detects cognitive patterns, engagement levels, and learning velocity. The PLRE framework (Prepare-Learn-Reinforce-Evaluate) adapts your AI to how each user thinks.

Platform

Agentic Services Fabric

Everything your LangGraph, CrewAI, or custom agent needs — memory, knowledge, prompts, and cognitive state — in one API. Drop in Cerebe and your agent gets a brain.
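The "give your agent a brain" pattern boils down to recall-before, store-after around each agent step. The sketch below uses a local `StubMemory` stand-in so it runs anywhere; in a real agent, those calls would be the awaited `client.memory.search` / `client.memory.add` methods shown above, wired into a LangGraph node or CrewAI tool.

```python
class StubMemory:
    """Local stand-in for client.memory, for illustration only."""
    def __init__(self):
        self._items = []

    def add(self, content, **meta):
        self._items.append((content, meta))

    def search(self, query, **meta):
        # Naive keyword match; the real service does hybrid vector + graph search.
        return [c for c, _ in self._items if any(w in c for w in query.split())]

def agent_step(memory, user_msg):
    recalled = memory.search(user_msg)                  # 1. recall relevant context
    reply = f"(context: {len(recalled)} memories) ok"   # 2. your LLM call goes here
    memory.add(user_msg)                                # 3. persist the new turn
    return reply
```

Because memory lives outside the agent framework, the same pattern drops into any orchestration layer without changing the agent's graph or tools.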

Built for developers

Identical API surface in Python and TypeScript. Copy, paste, ship.

from cerebe import AsyncCerebe

client = AsyncCerebe(api_key="ck_live_...")

# Store a memory
await client.memory.add(
    content="User prefers visual explanations",
    user_id="user_123",
    session_id="session_abc",
)

# Search across all memories
results = await client.memory.search(
    query="What does the user prefer?",
    session_id="session_abc",
)

# Query the knowledge graph
entities = await client.knowledge.query(
    query="relationships between user and algebra",
    depth=2,
)

# Analyze learning patterns
patterns = await client.meta_learning.analyze(
    user_id="user_123",
    window="7d",
)

Simple, transparent pricing

Start free. Scale as you grow. No credit card required.

Free

$0/forever

For exploration and prototyping

  • 1,000 memory ops/month
  • 100 knowledge queries/month
  • 10K LLM tokens/month
  • 1 project
  • Community support
Get Started Free

Starter

$49/month

For indie developers and small teams

  • 50,000 memory ops/month
  • 5,000 knowledge queries/month
  • 500K LLM tokens/month
  • 5 projects
  • Email support
  • 99.5% SLA
Start Free Trial
Most Popular

Pro

$249/month

For growing products with real users

  • 500,000 memory ops/month
  • 50,000 knowledge queries/month
  • 5M LLM tokens/month
  • 20 projects
  • Priority support
  • 99.9% SLA
  • Custom rate limits
Start Free Trial

Enterprise

Custom

Dedicated infrastructure and support

  • Unlimited everything
  • Dedicated infrastructure
  • Custom models
  • SSO / SAML
  • Dedicated support engineer
  • 99.99% SLA
  • BAA / DPA available
Contact Sales