MCP Server
Connect AI agents to your knowledge base via the Model Context Protocol
How It Works
The Blogmarks MCP server gives AI agents direct access to your personal knowledge base. Save a bookmark and the ingestion pipeline extracts and indexes its content; you can then ask questions about it from any MCP-compatible AI client.
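Under the hood, an MCP client invokes server tools over JSON-RPC 2.0 using the protocol's `tools/call` method. As a sketch, a question about your bookmarks might translate into a request like the one below; the tool name `search_bookmarks` and its arguments are illustrative assumptions, not the server's actual API:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_bookmarks",
    "arguments": { "query": "Rust async programming", "limit": 5 }
  }
}
```

The AI client composes these calls for you; you only ever type the natural-language prompt.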
Example Prompts
Once the MCP server is connected to your AI client (e.g. Claude Desktop), try asking things like:
Search & Retrieve
- "What have I saved about Rust async programming?"
- "Find my notes on React Server Components"
- "Search for articles about database indexing strategies"
- "What do I know about WebAssembly runtimes?"
Browse Assets
- "List the last 10 articles I bookmarked"
- "Show me all my saved assets with their titles and dates"
- "What bookmarks do I have from last month?"
Entity Discovery
- "Which technologies appear most often across my bookmarks?"
- "Find all mentions of Anthropic in my knowledge base"
- "Which people are frequently mentioned in my saved articles?"
- "Show me concepts that appear in at least 3 articles"
Deep Dives
- "Get the full content of asset <asset_id> and summarise it"
- "Read my bookmark about <topic> and compare it to what I know"
- "What's the connection between <entity A> and <entity B> in my bookmarks?"
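When a prompt like these triggers a tool call, the server answers with a tool result whose outer shape is defined by the MCP specification: a `content` array of typed blocks. The payload below is a hedged sketch of what a deep-dive lookup might return; the actual fields inside the text are up to the server:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "content": [
      {
        "type": "text",
        "text": "Title: ...\nSaved: ...\nFull extracted article text ..."
      }
    ]
  }
}
```

The AI client reads this text block and uses it to answer, summarise, or compare, so you never see the raw JSON.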
Setup Guide
Add the following to your Claude Desktop config file to connect the Blogmarks MCP server:
~/.config/claude/claude_desktop_config.json
{
  "mcpServers": {
    "blogmarks": {
      "command": "/path/to/blogmarks-mcp",
      "env": {
        "QDRANT_URL": "http://localhost:6333",
        "OPENAI_API_KEY": "sk-...",
        "POD_BASE_PATH": "/path/to/pod/storage"
      }
    }
  }
}

QDRANT_URL — URL of your local Qdrant vector store (default: http://localhost:6333)
OPENAI_API_KEY — OpenAI API key used for embedding queries during search
POD_BASE_PATH — Filesystem path to your Solid Pod storage directory
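Because stdio-based MCP servers speak JSON-RPC over stdin/stdout, you can smoke-test the binary from a terminal before wiring it into a client. This sketch assumes the placeholder paths and env values from the config above; the `initialize` request shape follows the MCP specification:

```shell
QDRANT_URL=http://localhost:6333 \
OPENAI_API_KEY=sk-... \
POD_BASE_PATH=/path/to/pod/storage \
/path/to/blogmarks-mcp <<'EOF'
{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.1"}}}
EOF
```

A healthy server should reply on stdout with its own `serverInfo` and capabilities; an immediate exit or an error on stderr usually points to a missing env value or an unreachable Qdrant instance.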
