MCP Server Integration
The Hive framework provides native support for the Model Context Protocol (MCP), allowing your agents to instantly access a vast ecosystem of standardized tools, data sources, and resources. By integrating MCP servers, you can extend your agent's capabilities—such as filesystem access, web searching, or database interaction—without writing custom wrapper code for every utility.
Overview
In Hive, MCP servers are treated as external tool providers. The ToolRegistry acts as the bridge, connecting to these servers and surfacing their functions as standard Hive tools that can be utilized by any node in your agent's graph.
Configuring MCP Servers
To connect your agent to MCP servers, you define a configuration file (typically mcp_servers.json). This file specifies how to launch and communicate with each server.
Configuration Format
The configuration follows the standard MCP client schema. You can define multiple servers, each with its own environment variables and arguments.
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/workspace"],
      "env": {
        "NODE_ENV": "production"
      }
    },
    "brave-search": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-brave-search"],
      "env": {
        "BRAVE_API_KEY": "your_api_key_here"
      }
    }
  }
}
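Before handing this file to Hive, it can be sanity-checked with the standard library. The sketch below assumes only the mcpServers shape shown above; load_server_specs is an illustrative helper, not part of Hive:

```python
import json

def load_server_specs(text: str) -> dict:
    """Parse an mcp_servers.json payload and return the server map."""
    config = json.loads(text)
    servers = config.get("mcpServers", {})
    # Each entry must at least name a launch command.
    for name, spec in servers.items():
        if "command" not in spec:
            raise ValueError(f"server '{name}' is missing a 'command'")
    return servers

sample = """
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    }
  }
}
"""
specs = load_server_specs(sample)
```

A check like this catches a malformed entry at startup rather than at the first tool call.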
Integrating with the Tool Registry
The ToolRegistry is responsible for loading and managing MCP-based tools. You can register these tools during the agent's initialization phase.
Loading from Config
Use the load_mcp_config() method to point the registry to your configuration file. This automatically initializes the servers and registers their exported tools.
from pathlib import Path
from framework.runner.tool_registry import ToolRegistry

# Initialize the registry
tool_registry = ToolRegistry()

# Load MCP servers from a JSON configuration
mcp_config_path = Path("mcp_servers.json")
if mcp_config_path.exists():
    tool_registry.load_mcp_config(mcp_config_path)

# Retrieve all registered tools (including MCP tools)
tools = list(tool_registry.get_tools().values())
Developing Custom MCP Tools
Hive is built to work seamlessly with FastMCP (a high-level Python framework for MCP). This allows you to write your own specialized servers and integrate them into Hive agents.
Example: Custom File Viewer
If you are building a custom toolkit within the Hive ecosystem, you can register tools directly to an MCP server instance:
from mcp.server.fastmcp import FastMCP

def register_tools(mcp: FastMCP) -> None:
    @mcp.tool()
    def view_file(path: str, session_id: str) -> dict:
        """Read the content of a file within the session sandbox."""
        try:
            # Logic for secure path resolution and file reading
            with open(path, "r") as f:
                return {"content": f.read(), "success": True}
        except Exception as e:
            return {"error": str(e), "success": False}
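The "secure path resolution" placeholder above can be fleshed out with the standard library. This is a minimal sketch assuming each session gets its own sandbox directory; the directory layout and the resolve_in_sandbox helper are illustrative, not Hive APIs:

```python
from pathlib import Path

def resolve_in_sandbox(sandbox_root: str, requested: str) -> Path:
    """Resolve a requested path, refusing anything that escapes the sandbox."""
    root = Path(sandbox_root).resolve()
    candidate = (root / requested).resolve()
    # Path.relative_to raises ValueError when candidate lies outside root,
    # which also catches "../" traversal after resolution.
    candidate.relative_to(root)
    return candidate

# A traversal attempt is rejected:
try:
    resolve_in_sandbox("/srv/sessions/abc", "../etc/passwd")
    escaped = True
except ValueError:
    escaped = False
```

Resolving before comparing is the important step: a naive string-prefix check would accept "../" paths that only escape the sandbox after normalization.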
Runtime Usage in Agents
Once registered, MCP tools are indistinguishable from native Hive tools. They are passed to the AgentRuntime and can be invoked by LLMNode or EventLoopNode based on the tool's docstring and arguments.
from framework.runtime.agent_runtime import create_agent_runtime

runtime = create_agent_runtime(
    graph=my_graph,
    tools=tools,  # Contains both native and MCP tools
    tool_executor=tool_registry.get_executor(),
    # ... other configurations
)
Tool Execution Flow
- Discovery: The agent's LLM identifies an MCP tool as the best fit for the current task.
- Invocation: Hive's tool_executor routes the request to the specific MCP server.
- Execution: The MCP server runs the tool (e.g., searches the web) and returns the JSON result.
- Processing: The result is fed back into the agent's context for the next step in the graph.
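The flow above can be sketched as a plain-Python dispatcher. The names here (route, handlers, the two stand-in tools) are illustrative stand-ins for Hive's tool_executor, not its actual API:

```python
import json

# Illustrative stand-ins for tools exported by two MCP servers.
def web_search(query: str) -> dict:
    return {"results": [f"hit for {query}"]}

def view_file(path: str) -> dict:
    return {"content": f"<contents of {path}>"}

# The executor maps tool names (as the LLM sees them) to server handlers.
handlers = {"web_search": web_search, "view_file": view_file}

def route(tool_name: str, arguments: dict) -> str:
    """Invoke the named tool and return its JSON result for the agent's context."""
    if tool_name not in handlers:
        return json.dumps({"error": f"unknown tool: {tool_name}"})
    result = handlers[tool_name](**arguments)
    return json.dumps(result)

reply = route("web_search", {"query": "MCP spec"})
```

Returning an error payload instead of raising keeps a bad tool name inside the agent loop, so the LLM can see the failure and pick a different tool.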
Security and Sandboxing
When using MCP tools that interact with the local environment (like view_file or shell_execute), Hive recommends:
- Workspace Isolation: Pass specific workspace IDs and session IDs to the MCP tools to restrict access to authorized directories.
- Credential Management: Use the built-in CredentialStore to securely inject API keys (like BRAVE_API_KEY) into the MCP server's environment configuration rather than hardcoding them in JSON.
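The injection step can be pictured as resolving placeholder values in a server spec just before launch. The CredentialStore API itself is not shown in this section, so a plain mapping stands in for it, and the ${NAME} placeholder convention is an assumption for illustration:

```python
import copy

# Stand-in for Hive's CredentialStore; a plain mapping illustrates the idea.
credential_store = {"BRAVE_API_KEY": "sk-live-example"}

def inject_credentials(server_spec: dict, store: dict) -> dict:
    """Fill env values written as ${NAME} placeholders from the store."""
    spec = copy.deepcopy(server_spec)  # leave the loaded config untouched
    env = spec.get("env", {})
    for key, value in env.items():
        if value.startswith("${") and value.endswith("}"):
            env[key] = store[value[2:-1]]
    return spec

spec = {
    "command": "npx",
    "args": ["-y", "@modelcontextprotocol/server-brave-search"],
    "env": {"BRAVE_API_KEY": "${BRAVE_API_KEY}"},
}
resolved = inject_credentials(spec, credential_store)
```

With this pattern the JSON file on disk never contains a real key; the secret exists only in the resolved spec handed to the server process.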