Today we're excited to announce two powerful additions to the Model Context Protocol (MCP) ecosystem: PydanticAI Agents as MCP clients, and the Logfire MCP server. These tools bring enhanced capabilities to AI applications through standardized integrations.
PydanticAI MCP Support
PydanticAI now supports the Model Context Protocol in three key ways:
- Agents as MCP Clients: PydanticAI agents can connect to MCP servers to leverage their tools and capabilities (see the sketch below)
- Agents in MCP Servers: PydanticAI agents can be used within MCP servers
- mcp-run-python: As part of our ecosystem, we're building specialized MCP servers, starting with mcp-run-python, which runs Python code in a sandbox
Learn more in the PydanticAI docs.
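To make the client side concrete, here's a minimal sketch of an agent that uses mcp-run-python as a stdio MCP server. Treat it as illustrative rather than definitive: the class and parameter names (MCPServerStdio, mcp_servers, run_mcp_servers) and the exact Deno invocation may differ between versions, so check the PydanticAI docs for the current API.

```python
import asyncio

from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStdio

# Launch mcp-run-python as a local stdio MCP server.
# The Deno flags here are an assumption; check the mcp-run-python docs if they've changed.
run_python = MCPServerStdio(
    'deno',
    args=[
        'run', '-N', '-R=node_modules', '-W=node_modules',
        '--node-modules-dir=auto',
        'jsr:@pydantic/mcp-run-python', 'stdio',
    ],
)

# The agent discovers the server's tools on startup and can call them while it runs.
agent = Agent('openai:gpt-4o', mcp_servers=[run_python])


async def main() -> None:
    async with agent.run_mcp_servers():  # starts and stops the server around the run
        result = await agent.run('How many days are there between 2000-01-01 and 2025-03-18?')
    print(result.output)  # older PydanticAI versions expose this as result.data


asyncio.run(main())
```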
Logfire MCP Server
Our latest MCP server implementation is for Logfire, our observability platform. It enables AI applications to search and analyze logs, traces, and metrics directly, giving AI assistants the observability context they need when debugging. And because Logfire lets you query your data with SQL, it's particularly well suited to use by AI agents.
Setting Up Logfire MCP Server in Cursor
Here's a video demo showing how you can query your Logfire data in Cursor using the Logfire MCP server.
This lets you use Cursor's coding agent (or any other IDE that supports MCP) for:
- AI-driven fixes based on issues found in your live applications
- Identification of bottlenecks in your code based on live data
- Easy querying and understanding of your errors
And many more!
Getting started with the Logfire MCP server in Cursor is straightforward: create or update your .cursor/mcp.json as per our setup instructions, make sure you select agent mode in the Cursor chat, and then your AI agents will start using Logfire!
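For reference, the config typically looks something like the sketch below. The exact command and the way you supply your read token are defined in the setup instructions; this sketch assumes the server is published as logfire-mcp and reads a LOGFIRE_READ_TOKEN environment variable.

```json
{
  "mcpServers": {
    "logfire": {
      "command": "uvx",
      "args": ["logfire-mcp"],
      "env": {
        "LOGFIRE_READ_TOKEN": "your-read-token"
      }
    }
  }
}
```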
What is MCP?
The Model Context Protocol provides a standardized interface for AI applications to connect with external tools and services. This common interface enables applications like PydanticAI agents, coding assistants like Cursor, and desktop apps like Claude Desktop to seamlessly interact with various services.
The MCP ecosystem is growing rapidly, with many implementations available at github.com/modelcontextprotocol/servers. Samuel and Marcelo from Pydantic are helping to maintain the official Python SDK for MCP servers and clients.
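To give a feel for how small that common interface is, here's a rough sketch that uses the official Python SDK to connect to an MCP server over stdio and list the tools it exposes. The logfire-mcp command and LOGFIRE_READ_TOKEN environment variable are assumptions carried over from the Cursor setup above; any stdio MCP server could be dropped in instead.

```python
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Assumed launch command for the Logfire MCP server; the setup instructions are authoritative.
server = StdioServerParameters(
    command='uvx',
    args=['logfire-mcp'],
    env={'LOGFIRE_READ_TOKEN': os.environ['LOGFIRE_READ_TOKEN']},
)


async def main() -> None:
    # The same three steps work against any MCP server: open a transport,
    # initialize a session, then discover the tools it exposes.
    async with stdio_client(server) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, '-', tool.description)


asyncio.run(main())
```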
Use Cases
- Create deep research agents by connecting PydanticAI to web search MCP servers
- Help Cursor (or Windsurf, Claude Code, etc.) debug complex issues by connecting to Logfire's MCP server for observability context
- Run sandboxed Python code from any MCP client through the mcp-run-python MCP server
If you've got any questions or suggestions, please let us know.