The AT Protocol MCP Server bridges AI assistants with Bluesky and the decentralized social web. It provides zero-configuration public access for reading, with optional OAuth for authenticated operations.
MCP Tool Explorer

An example request to the server's `get_profile` tool, which fetches a user profile by handle or DID:

```json
{
  "tool": "get_profile",
  "handle": "bsky.app"
}
```

This is a simulated demo; the actual MCP server processes requests from AI assistants like Claude.
Key Features
- Zero Configuration: Immediate access to public AT Protocol data
- Full Protocol Coverage: Posts, profiles, feeds, and social graph
- OAuth Support: Secure authentication for write operations
- Production Ready: Docker, Kubernetes, and enterprise deployment support
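"Zero configuration" is possible because public AT Protocol reads are plain unauthenticated XRPC GET requests against Bluesky's public AppView. A minimal sketch (the helper name `xrpcQueryUrl` is hypothetical, not part of the server's actual API):

```typescript
// Public AT Protocol reads need no credentials: they are XRPC GET
// requests against Bluesky's public AppView endpoint.
const PUBLIC_APPVIEW = "https://public.api.bsky.app";

// Build an XRPC query URL for a lexicon method and its parameters.
function xrpcQueryUrl(method: string, params: Record<string, string>): string {
  const url = new URL(`/xrpc/${method}`, PUBLIC_APPVIEW);
  for (const [key, value] of Object.entries(params)) {
    url.searchParams.set(key, value);
  }
  return url.toString();
}

// The demo request above maps to a single unauthenticated GET:
const profileUrl = xrpcQueryUrl("app.bsky.actor.getProfile", { actor: "bsky.app" });
// https://public.api.bsky.app/xrpc/app.bsky.actor.getProfile?actor=bsky.app
```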
Why this project mattered
The interesting part of AT Protocol is not just that it powers Bluesky. It is that the protocol is decentralized, strongly typed, and split across multiple services with different responsibilities. That makes it powerful for developers, but awkward for LLMs and agent frameworks that need a stable interface.
This project turned that complexity into a predictable MCP surface area. Instead of asking an LLM to understand handles, DIDs, AppView reads, PDS writes, and OAuth on its own, the server exposes those capabilities as discoverable tools with clear inputs and outputs.
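"Discoverable tools with clear inputs and outputs" means each capability is published with a name, description, and input schema that an agent can enumerate. A sketch of what one such definition could look like (the exact shape the server uses may differ; this follows the general MCP convention of JSON Schema inputs):

```typescript
// A tool definition as an MCP client would discover it: the name and
// schema carry the protocol vocabulary so the LLM never has to infer it.
interface ToolDefinition {
  name: string;
  description: string;
  inputSchema: {
    type: "object";
    properties: Record<string, { type: string; description: string }>;
    required: string[];
  };
}

const getProfileTool: ToolDefinition = {
  name: "get_profile",
  description: "Fetch a user profile by handle or DID",
  inputSchema: {
    type: "object",
    properties: {
      handle: {
        type: "string",
        description: "A handle (e.g. bsky.app) or a DID (did:plc:...)",
      },
    },
    required: ["handle"],
  },
};
```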
Architecture decisions
The most important design choice was separating read-only and authenticated operations.
- Public tools route through the public AT Protocol APIs so an assistant can explore profiles, feeds, and search results immediately.
- Authenticated tools are isolated behind OAuth so write access is explicit and scoped.
- The MCP layer keeps protocol vocabulary intact enough for power users, while still normalizing the shape of requests for agent use.
That split makes the project useful in two modes: instant exploration with zero setup, and production workflows where identity and write access matter.
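The split above can be sketched as a dispatch step that classifies each tool before execution. The tool names and `Session` shape here are illustrative assumptions, not the server's actual identifiers:

```typescript
// Hypothetical sketch of the read/write split: public tools dispatch
// immediately, while authenticated tools fail fast without an OAuth session.
type Session = { accessToken: string } | null;

const PUBLIC_TOOLS = new Set(["get_profile", "get_feed", "search_posts"]);
const AUTH_TOOLS = new Set(["create_post", "follow_user", "like_post"]);

function authorize(tool: string, session: Session): "public" | "authenticated" {
  if (PUBLIC_TOOLS.has(tool)) return "public"; // AppView read, no auth needed
  if (AUTH_TOOLS.has(tool)) {
    if (!session) throw new Error(`Tool "${tool}" requires an OAuth session`);
    return "authenticated"; // PDS write, scoped token required
  }
  throw new Error(`Unknown tool: ${tool}`);
}
```

The design payoff is that reading never needs setup, and every write path fails loudly unless identity is explicit.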
Tradeoffs and implementation lessons
One of the main tradeoffs was how much protocol detail to hide. Abstract too much, and the tool becomes misleading. Expose too much, and the tool stops being ergonomic. I aimed for a middle path: tool names and parameters reflect the underlying network model, but the server handles the plumbing around discovery, authentication, and response shaping.
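One concrete instance of that middle path: actor parameters accept either identifier form the protocol itself uses, a handle or a DID, and the server tells them apart instead of hiding the distinction. A hypothetical sketch (`classifyActor` is an illustrative helper, not the server's real function):

```typescript
// Tool inputs keep the protocol's own vocabulary: an "actor" may be a
// DID (did:<method>:<id>) or a handle (a domain name like bsky.app).
function classifyActor(actor: string): "did" | "handle" {
  if (/^did:[a-z]+:/.test(actor)) return "did";
  return "handle";
}

classifyActor("did:plc:z72i7hdynmk6r22z27h6tvur"); // "did"
classifyActor("bsky.app");                         // "handle"
```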
That approach made it easier to support both hobbyist experimentation and more serious deployment scenarios such as hosted MCP servers, containers, and Kubernetes-based setups.
Outcome
The result is a project that demonstrates protocol fluency, API design judgment, and product thinking at the same time. It is not just a wrapper around Bluesky endpoints. It is a translation layer that makes a decentralized protocol practical inside modern AI workflows.