Your critical info is scattered across tools that don't talk to each other. Your AI conversation starts with "let me give you some context." Your experiences and learnings are still in your head and your head doesn't scale.
CORE remembers. Not a database. Not a search box. A digital brain that learns what matters, connects what's related, and surfaces what you need.
CORE gives your AI tools persistent memory and the ability to act in the apps you use.
- Context preserved across Claude Code, Cursor and other coding agents
- Take actions in Linear, GitHub, Slack, Gmail, Google Sheets and other apps you use
- Connect once via MCP, works everywhere
- Open-source and self-hostable; your data, your control
CORE becomes your persistent memory layer for coding agents. Ask any AI tool to pull relevant context.
"Search core memory for architecture decisions on the payment service"
"What are my content guidelines from core to create the blog?"
Connect your apps once, take actions from anywhere.
- Create and read GitHub and Linear issues
- Draft, send, and read emails, and store relevant info in CORE
- Manage your calendar, update spreadsheets
Switching back to a feature after a week? Get caught up instantly.
"What did we discuss about the checkout flow? Summarize from memory."
"Refer to past discussions and remind me where we left off on the API refactor."
- Temporal Context Graph: CORE doesn't just store facts; it remembers the story. When things happened, how your thinking evolved, what led to each decision. Your preferences, goals, and past choices, all connected in a graph that understands sequence and context.
- 88.24% Recall Accuracy: Tested on the LoCoMo benchmark. When you ask CORE something, it finds what's relevant: not keyword matching, but true semantic understanding with multi-hop reasoning.
- You Control It: Your memory, your rules. Edit what's wrong. Delete what doesn't belong. Visualize how your knowledge connects. CORE is transparent; you see exactly what it knows.
- Open Source: No black boxes. No vendor lock-in. Your digital brain belongs to you.
Choose your path:
| | CORE Cloud | Self-Host |
|---|---|---|
| Setup time | 5 minutes | 15 minutes |
| Best for | Try quickly, no infra | Full control, your servers |
| Requirements | Just an account | Docker, 4GB RAM |
- Sign up at app.getcore.me
- Connect a source (Claude, Cursor, or any MCP-compatible tool)
- Start using CORE to perform actions or store information about you in memory
Quick Deploy
Or with Docker
- Clone the repository:
git clone https://github.com/RedPlanetHQ/core.git
cd core
- Configure environment variables in `core/.env`:
OPENAI_API_KEY=your_openai_api_key
- Start the service:
docker-compose up -d
Once deployed, you can configure your AI providers (OpenAI, Anthropic) and start building your memory graph.
👉 View complete self-hosting guide
Note: We tried open-source models (such as GPT OSS, run via Ollama), but fact generation quality was not good enough. We are still figuring out how to improve this, and will then also support OSS models.
Install in Claude Code CLI
- Run this command in your terminal to connect CORE with Claude Code:
claude mcp add --transport http --scope user core-memory https://mcp.getcore.me/api/v1/mcp?source=Claude-Code
- Type `/mcp` and open the core-memory MCP for authentication
Install in Cursor
Since Cursor 1.0, you can click the install button below for instant one-click installation.
OR
- Go to Settings → Tools & Integrations → Add Custom MCP
- Enter the below in your `mcp.json` file:
{
"mcpServers": {
"core-memory": {
"url": "https://mcp.getcore.me/api/v1/mcp?source=cursor",
"headers": {}
}
}
}
Install in Claude Desktop
- Copy CORE MCP URL:
https://mcp.getcore.me/api/v1/mcp?source=Claude
- Navigate to Settings → Connectors → Click Add custom connector
- Click on "Connect" and grant Claude permission to access CORE MCP
Install in Codex CLI
Option 1 (Recommended): Add to your ~/.codex/config.toml file:
[features]
rmcp_client=true
[mcp_servers.memory]
url = "https://mcp.getcore.me/api/v1/mcp?source=codex"
Then run: codex mcp memory login
Option 2 (If Option 1 doesn't work): Add API key configuration:
[features]
rmcp_client=true
[mcp_servers.memory]
url = "https://mcp.getcore.me/api/v1/mcp?source=codex"
http_headers = { "Authorization" = "Bearer CORE_API_KEY" }
Get your API key from app.getcore.me → Settings → API Key, then run: codex mcp memory login
Install in Gemini CLI
See Gemini CLI Configuration for details.
- Open the Gemini CLI settings file at `~/.gemini/settings.json` (where `~` is your home directory).
- Add the following to the `mcpServers` object in your `settings.json` file:
{
"mcpServers": {
"corememory": {
"httpUrl": "https://mcp.getcore.me/api/v1/mcp?source=geminicli",
"timeout": 5000
}
}
}
If the `mcpServers` object does not exist, create it.
Install in Copilot CLI
Add the following to your ~/.copilot/mcp-config.json file:
{
"mcpServers": {
"core": {
"type": "http",
"url": "https://mcp.getcore.me/api/v1/mcp?source=Copilot-CLI",
"headers": {
"Authorization": "Bearer YOUR_API_KEY"
}
}
}
}
Install in VS Code
Enter the below in mcp.json file:
{
"servers": {
"core-memory": {
"url": "https://mcp.getcore.me/api/v1/mcp?source=Vscode",
"type": "http",
"headers": {
"Authorization": "Bearer YOUR_API_KEY"
}
}
}
}
Install in VS Code Insiders
Add to your VS Code Insiders MCP config:
{
"mcp": {
"servers": {
"core-memory": {
"type": "http",
"url": "https://mcp.getcore.me/api/v1/mcp?source=VSCode-Insiders",
"headers": {
"Authorization": "Bearer YOUR_API_KEY"
}
}
}
}
}
Install in Windsurf
Enter the below in mcp_config.json file:
{
"mcpServers": {
"core-memory": {
"serverUrl": "https://mcp.getcore.me/api/v1/mcp?source=windsurf",
"headers": {
"Authorization": "Bearer <YOUR_API_KEY>"
}
}
}
}
Install in Zed
- Go to Settings in the Agent Panel → Add Custom Server
- Enter the code below in the configuration file and click the Add server button
{
"core-memory": {
"command": "npx",
"args": ["-y", "mcp-remote", "https://mcp.getcore.me/api/v1/mcp?source=Zed"]
}
}
Install in Amp
Run this command in your terminal:
amp mcp add core-memory https://mcp.getcore.me/api/v1/mcp?source=amp
Install in Augment Code
Add to your ~/.augment/settings.json file:
{
"mcpServers": {
"core-memory": {
"type": "http",
"url": "https://mcp.getcore.me/api/v1/mcp?source=augment-code",
"headers": {
"Authorization": "Bearer YOUR_API_KEY"
}
}
}
}
Install in Cline
- Open Cline and click the hamburger menu icon (☰) to enter the MCP Servers section
- Choose Remote Servers tab and click the Edit Configuration button
- Add the following to your Cline MCP configuration:
{
"mcpServers": {
"core-memory": {
"url": "https://mcp.getcore.me/api/v1/mcp?source=Cline",
"type": "streamableHttp",
"headers": {
"Authorization": "Bearer YOUR_API_KEY"
}
}
}
}
Install in Kilo Code
- Go to Settings → MCP Servers → Installed tab → click Edit Global MCP to edit your configuration.
- Add the following to your MCP config file:
{
"core-memory": {
"type": "streamable-http",
"url": "https://mcp.getcore.me/api/v1/mcp?source=Kilo-Code",
"headers": {
"Authorization": "Bearer YOUR_API_KEY"
}
}
}
Install in Kiro
Add in Kiro → MCP Servers:
{
"mcpServers": {
"core-memory": {
"url": "https://mcp.getcore.me/api/v1/mcp?source=Kiro",
"headers": {
"Authorization": "Bearer YOUR_API_KEY"
}
}
}
}
Install in Qwen Coder
See Qwen Coder MCP Configuration for details.
Add to ~/.qwen/settings.json:
{
"mcpServers": {
"core-memory": {
"httpUrl": "https://mcp.getcore.me/api/v1/mcp?source=Qwen",
"headers": {
"Authorization": "Bearer YOUR_API_KEY",
"Accept": "application/json, text/event-stream"
}
}
}
}
Install in Roo Code
Add to your Roo Code MCP configuration:
{
"mcpServers": {
"core-memory": {
"type": "streamable-http",
"url": "https://mcp.getcore.me/api/v1/mcp?source=Roo-Code",
"headers": {
"Authorization": "Bearer YOUR_API_KEY"
}
}
}
}
Install in Opencode
Add to your Opencode configuration:
{
"mcp": {
"core-memory": {
"type": "remote",
"url": "https://mcp.getcore.me/api/v1/mcp?source=Opencode",
"headers": {
"Authorization": "Bearer YOUR_API_KEY"
},
"enabled": true
}
}
}
Install in Copilot Coding Agent
Add to Repository Settings → Copilot → Coding agent → MCP configuration:
{
"mcpServers": {
"core": {
"type": "http",
"url": "https://mcp.getcore.me/api/v1/mcp?source=Copilot-Agent",
"headers": {
"Authorization": "Bearer YOUR_API_KEY"
}
}
}
}
Install in Qodo Gen
- Open Qodo Gen chat panel in VSCode or IntelliJ
- Click Connect more tools, then click + Add new MCP
- Add the following configuration:
{
"mcpServers": {
"core-memory": {
"url": "https://mcp.getcore.me/api/v1/mcp?source=Qodo-Gen"
}
}
}
Install in Warp
Add in Settings → AI → Manage MCP servers:
{
"core": {
"url": "https://mcp.getcore.me/api/v1/mcp?source=Warp",
"headers": {
"Authorization": "Bearer YOUR_API_KEY"
}
}
}
Install in Crush
Add to your Crush configuration:
{
"$schema": "https://charm.land/crush.json",
"mcp": {
"core": {
"type": "http",
"url": "https://mcp.getcore.me/api/v1/mcp?source=Crush",
"headers": {
"Authorization": "Bearer YOUR_API_KEY"
}
}
}
}
Install in ChatGPT
Connect ChatGPT to CORE's memory system via browser extension:
- Install Core Browser Extension
- Generate API Key: Go to Settings → API Key → Generate new key → Name it "extension"
- Add API Key in Core Extension and click Save
Install in Gemini
Connect Gemini to CORE's memory system via browser extension:
- Install Core Browser Extension
- Generate API Key: Go to Settings → API Key → Generate new key → Name it "extension"
- Add API Key in Core Extension and click Save
Install in Perplexity Desktop
- Add in Perplexity → Settings → Connectors → Add Connector → Advanced:
{
"core-memory": {
"command": "npx",
"args": ["-y", "mcp-remote", "https://mcp.getcore.me/api/v1/mcp?source=perplexity"]
}
}
- Click Save to apply the changes
- Core will be available in your Perplexity sessions
Install in Factory
Run in terminal:
droid mcp add core https://mcp.getcore.me/api/v1/mcp?source=Factory --type http --header "Authorization: Bearer YOUR_API_KEY"
Type `/mcp` within droid to manage servers and view available tools.
Install in Rovo Dev CLI
- Edit the MCP config:
acli rovodev mcp
- Add to your Rovo Dev MCP configuration:
{
"mcpServers": {
"core-memory": {
"url": "https://mcp.getcore.me/api/v1/mcp?source=Rovo-Dev"
}
}
}
Install in Trae
Add to your Trae MCP configuration:
{
"mcpServers": {
"core": {
"url": "https://mcp.getcore.me/api/v1/mcp?source=Trae"
}
}
}
CORE Memory MCP provides the following tools that LLMs can use:
- `memory_search`: Search relevant context from CORE Memory.
- `memory_ingest`: Add an episode to CORE Memory.
- `memory_about_user`: Fetches the user persona from CORE Memory.
- `initialise_conversation_session`: Initialises a conversation and assigns it a session id.
- `get_integrations`: Fetches which of the connected integrations is relevant for the task.
- `get_integrations_actions`: Fetches which of that integration's tools to use for the task.
- `execute_integrations_actions`: Executes the chosen tool for that integration.
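Under the hood these are standard MCP tools, so any MCP client invokes them with a JSON-RPC `tools/call` request. A minimal sketch of such a request for `memory_search` follows; the `query` argument name here is an assumption for illustration, not CORE's documented schema:

```python
import json

# Illustrative JSON-RPC 2.0 body an MCP client would send to invoke a tool.
# "tools/call" is the standard MCP tool-invocation method; the argument
# shape for memory_search is assumed, not taken from CORE's docs.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "memory_search",
        "arguments": {"query": "architecture decisions on the payment service"},
    },
}
print(json.dumps(request, indent=2))
```

In practice the clients configured above construct and send this request for you; you only ever phrase the query in natural language.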
When you save context to CORE, it goes through four phases:
- Normalization: Links new info to recent context, breaks documents into coherent chunks while keeping cross-references
- Extraction: Identifies entities (people, tools, projects), creates statements with context and time, maps relationships
- Resolution: Detects contradictions, tracks how preferences evolve, preserves multiple perspectives with provenance
- Graph Integration: Connects entities, statements, and episodes into a temporal knowledge graph
Example: "We wrote CORE in Next.js" becomes:
- Entities: `CORE`, `Next.js`
- Statement: `CORE was developed using Next.js`
- Relationship: `was developed using`
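The decomposition above can be sketched as data. This is a hypothetical model for illustration only; CORE's actual schema may differ:

```python
from dataclasses import dataclass

# A statement doubles as a typed edge between two entity nodes in the graph.
@dataclass(frozen=True)
class Statement:
    subject: str    # entity the statement is about
    predicate: str  # the relationship
    obj: str        # related entity

entities = {"CORE", "Next.js"}
statement = Statement(subject="CORE", predicate="was developed using", obj="Next.js")

assert statement.subject in entities and statement.obj in entities
print(f"{statement.subject} --[{statement.predicate}]--> {statement.obj}")
```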
When you query CORE:
- Search: Hybrid approach: keyword + semantic + graph traversal
- Re-rank: Surfaces most relevant and diverse results
- Filter: Applies time, reliability, and relationship strength filters
- Output: Returns facts AND the episodes they came from
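As a toy illustration of the keyword half of that hybrid pipeline, the sketch below scores episodes, filters by recency, re-ranks, and returns facts with their source episodes. The episode shape, field names, and scoring are assumptions; CORE additionally does semantic and graph-based retrieval:

```python
from datetime import datetime

# Hypothetical episode store (not CORE's real data model).
episodes = [
    {"fact": "Checkout flow uses Stripe", "text": "we picked stripe for checkout",
     "time": datetime(2024, 5, 1)},
    {"fact": "API refactor paused at auth module", "text": "api refactor paused at auth",
     "time": datetime(2024, 6, 20)},
]

def keyword_score(query: str, text: str) -> float:
    # Fraction of query words appearing in the episode text.
    q, t = set(query.lower().split()), set(text.split())
    return len(q & t) / len(q)

def search(query: str, since: datetime):
    scored = [(keyword_score(query, e["text"]), e) for e in episodes]
    # Filter: drop stale episodes; Re-rank: highest score first.
    recent = [(s, e) for s, e in scored if e["time"] >= since]
    recent.sort(key=lambda pair: pair[0], reverse=True)
    # Output: facts AND the episodes they came from.
    return [(e["fact"], e) for s, e in recent if s > 0]

print(search("api refactor", datetime(2024, 1, 1))[0][0])
# → API refactor paused at auth module
```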
CORE doesn't just recall facts — it recalls them in context, with time and story, so agents respond the way you would remember.
Building AI agents? CORE gives you memory infrastructure + integrations infrastructure so you can focus on your agent's logic.
Memory Infrastructure
- Temporal knowledge graph with 88.24% LoCoMo accuracy
- Hybrid search: semantic + keyword + graph traversal
- Tracks context evolution and contradictions
Integrations Infrastructure
- Connect GitHub, Linear, Slack, Gmail once
- Your agent gets MCP tools for all connected apps
- No OAuth flows to build, no API maintenance
core-cli — A task manager agent that connects to CORE for memory and syncs with Linear, GitHub Issues.
holo — Turn your CORE memory into a personal website with chat.
- API Reference
- SDK Documentation
- Need a specific integration? Open a GitHub issue
CORE memory achieves 88.24% average accuracy on the LoCoMo dataset across all reasoning tasks, significantly outperforming other memory providers.
| Task Type | Description |
|---|---|
| Single-hop | Answers based on a single session |
| Multi-hop | Synthesizing info from multiple sessions |
| Open-domain | Integrating user info with external knowledge |
| Temporal reasoning | Time-related cues and sequence understanding |
View benchmark methodology and results →
CASA Tier 2 Certified — Third-party audited to meet Google's OAuth requirements.
- Encryption: TLS 1.3 (transit) + AES-256 (rest)
- Authentication: OAuth 2.0 and magic link
- Access Control: Workspace-based isolation, role-based permissions
- Zero-trust architecture: Never trust, always verify
Your data, your control:
- Edit and delete anytime
- Never used for AI model training
- Self-hosting option for full isolation
For detailed security information, see our Security Policy.
Vulnerability Reporting: harshith@poozle.dev
Explore our documentation to get the most out of CORE
Have questions or feedback? We're here to help:
- Discord: Join core-support channel
- Documentation: docs.getcore.me
- Email: manik@poozle.dev
Store:
- Conversation history
- User preferences
- Task context
- Reference materials
Don't Store:
- Sensitive data (PII)
- Credentials
- System logs
- Temporary data