MCP Tools
Model Context Protocol for extending LLM capabilities with function calling and tool execution.
Overview
MCP (Model Context Protocol) enables LLMs to use external tools and functions. ZSE includes built-in tools and supports custom tool definitions compatible with OpenAI's function calling format.
- **Function Calling**: parse and execute tool calls from LLM output
- **Built-in Tools**: calculator, datetime, JSON parser, string operations
- **OpenAI Compatible**: same format as OpenAI function calling
- JSON schema tool definitions
- Automatic tool call parsing from LLM output
- Built-in utility tools
- Custom tool registration
- OpenAI-compatible function calling format
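The features above fit a simple decorator-based registry pattern. As a rough sketch of the idea (class and method names here are hypothetical, not ZSE's actual internals):

```python
# Minimal sketch of a decorator-based tool registry.
# Hypothetical illustration only -- not ZSE's implementation.

class ToolRegistry:
    def __init__(self):
        self._tools = {}

    def tool(self, name, description="", parameters=None):
        """Register a function under `name` with an optional JSON-schema dict."""
        def decorator(fn):
            self._tools[name] = {
                "fn": fn,
                "description": description,
                "parameters": parameters or {},
            }
            return fn
        return decorator

    def execute(self, name, **kwargs):
        """Look up a tool by name and run it, returning a result envelope."""
        if name not in self._tools:
            return {"success": False, "error": f"unknown tool: {name}"}
        return {"success": True, "result": self._tools[name]["fn"](**kwargs)}


registry = ToolRegistry()

@registry.tool(name="echo", description="Return the input unchanged")
def echo(text: str):
    return text

print(registry.execute("echo", text="hi"))
# {'success': True, 'result': 'hi'}
```

The decorator keeps registration next to the function definition, which is the same shape the `MCPRegistry` example later in this page uses.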
Built-in Tools
ZSE includes several utility tools out of the box:
| Tool | Description | Example Input |
|---|---|---|
| calculator | Math expressions (sqrt, sin, cos, log, etc.) | sqrt(16) + 2**3 |
| datetime | Current date/time with timezone support | America/New_York |
| parse_json | Parse JSON and extract data | {"key": "value"} |
| string_ops | String operations (upper, lower, split, etc.) | upper: hello world |
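The `string_ops` row above uses an `op: text` command format. A guess at how such a tool might interpret that input locally (illustrative only, not ZSE's implementation):

```python
def string_ops(command: str):
    """Interpret "op: text" commands like the table's `upper: hello world`
    example. Hypothetical sketch -- not ZSE's actual string_ops tool."""
    op, _, text = command.partition(":")
    op, text = op.strip(), text.strip()
    ops = {
        "upper": str.upper,
        "lower": str.lower,
        "split": lambda s: s.split(),
        "reverse": lambda s: s[::-1],
    }
    if op not in ops:
        raise ValueError(f"unsupported operation: {op}")
    return ops[op](text)

print(string_ops("upper: hello world"))  # HELLO WORLD
```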
Calculator
```bash
# Execute the calculator tool
curl -X POST http://localhost:8000/api/tools/execute \
  -H "Content-Type: application/json" \
  -d '{"tool": "calculator", "input": "sqrt(144) + sin(3.14159/2)"}'

# Response:
# {"result": 13.0, "success": true}
```

Supported functions: sqrt, sin, cos, tan, log, log10, exp, abs, round, floor, ceil
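For a sense of how a calculator tool can evaluate expressions like these without passing raw input to `eval`, here is one possible approach using Python's `ast` module. This is an illustration of the technique, not ZSE's actual implementation:

```python
# Sketch of a whitelist-based math evaluator (illustrative only).
import ast
import math
import operator

# Allowed operators and math functions -- anything else is rejected.
_OPS = {
    ast.Add: operator.add, ast.Sub: operator.sub,
    ast.Mult: operator.mul, ast.Div: operator.truediv,
    ast.Pow: operator.pow, ast.USub: operator.neg,
}
_FUNCS = {name: getattr(math, name)
          for name in ("sqrt", "sin", "cos", "tan", "log", "log10",
                       "exp", "floor", "ceil")}
_FUNCS.update(abs=abs, round=round)

def safe_eval(expr: str):
    """Evaluate a math expression, allowing only whitelisted AST nodes."""
    def walk(node):
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.operand))
        if (isinstance(node, ast.Call) and isinstance(node.func, ast.Name)
                and node.func.id in _FUNCS):
            return _FUNCS[node.func.id](*[walk(a) for a in node.args])
        raise ValueError("disallowed expression")
    return walk(ast.parse(expr, mode="eval"))

print(safe_eval("sqrt(144) + 2 ** 3"))  # 20.0
```

Walking the AST and refusing any node type outside the whitelist is what keeps expressions like `__import__('os')` from ever executing.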
Datetime
```bash
# Get the current time
curl -X POST http://localhost:8000/api/tools/execute \
  -H "Content-Type: application/json" \
  -d '{"tool": "datetime", "input": "UTC"}'

# Response:
# {
#   "result": {
#     "datetime": "2026-02-26T10:30:00+00:00",
#     "timezone": "UTC",
#     "unix_timestamp": 1772092200
#   },
#   "success": true
# }
```

Using Tools
List Available Tools
```bash
# List all registered tools
curl http://localhost:8000/api/tools/

# Response:
# {
#   "tools": [
#     {
#       "name": "calculator",
#       "description": "Evaluate mathematical expressions",
#       "parameters": {...}
#     },
#     ...
#   ]
# }
```

Execute Tool
```python
import requests

# Execute a tool directly
response = requests.post(
    "http://localhost:8000/api/tools/execute",
    json={"tool": "calculator", "input": "2 ** 10"},
)
print(response.json())
# {'result': 1024, 'success': True}
```

Parse Tool Calls from Text
```bash
# Parse tool calls from LLM output
curl -X POST http://localhost:8000/api/tools/parse \
  -H "Content-Type: application/json" \
  -d '{"text": "I need to calculate sqrt(256). Let me use the calculator tool."}'

# Response:
# {
#   "tool_calls": [
#     {"tool": "calculator", "input": "sqrt(256)"}
#   ]
# }
```

Parse and Execute
```bash
# Parse and execute tool calls in one step
curl -X POST http://localhost:8000/api/tools/process \
  -H "Content-Type: application/json" \
  -d '{"text": "What time is it in Tokyo? Use datetime tool with Asia/Tokyo timezone."}'

# Response:
# {
#   "results": [
#     {
#       "tool": "datetime",
#       "input": "Asia/Tokyo",
#       "result": {"datetime": "2026-02-26T19:30:00+09:00", ...},
#       "success": true
#     }
#   ]
# }
```

Custom Tools
Register custom tools with JSON schema definitions:
```python
from zse.api.server.mcp import MCPRegistry

# Get the registry
registry = MCPRegistry()

# Define a custom tool
@registry.tool(
    name="weather",
    description="Get current weather for a location",
    parameters={
        "type": "object",
        "properties": {
            "location": {
                "type": "string",
                "description": "City name or coordinates"
            },
            "units": {
                "type": "string",
                "enum": ["celsius", "fahrenheit"],
                "default": "celsius"
            }
        },
        "required": ["location"]
    }
)
def get_weather(location: str, units: str = "celsius"):
    # Your implementation here
    return {"temperature": 22, "condition": "sunny", "units": units}

# The tool is now available via the API:
# POST /api/tools/execute {"tool": "weather", "input": {"location": "Tokyo"}}
```

Custom tools are persisted in the server session. For permanent tools, add them to your server startup script.
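Before dispatching to a tool like this, its input can be checked against the declared `parameters` schema, applying defaults and rejecting missing keys. A minimal sketch of such a check (a hypothetical helper, not ZSE's validator, and far less complete than a real JSON Schema library such as `jsonschema`):

```python
def check_input(schema: dict, payload: dict) -> dict:
    """Apply declared defaults and verify required keys against a
    JSON-schema-like tool definition. Illustrative sketch only."""
    props = schema.get("properties", {})
    out = dict(payload)
    # Fill in defaults for omitted optional parameters.
    for key, spec in props.items():
        if key not in out and "default" in spec:
            out[key] = spec["default"]
    # Reject calls missing required parameters or passing unknown ones.
    missing = [k for k in schema.get("required", []) if k not in out]
    if missing:
        raise ValueError(f"missing required parameters: {missing}")
    unknown = [k for k in out if k not in props]
    if unknown:
        raise ValueError(f"unknown parameters: {unknown}")
    return out

weather_schema = {
    "type": "object",
    "properties": {
        "location": {"type": "string"},
        "units": {"type": "string", "enum": ["celsius", "fahrenheit"],
                  "default": "celsius"},
    },
    "required": ["location"],
}

print(check_input(weather_schema, {"location": "Tokyo"}))
# {'location': 'Tokyo', 'units': 'celsius'}
```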
OpenAI Format
Get tools in OpenAI-compatible function calling format:
Get Functions
```bash
# Get tools in OpenAI format
curl http://localhost:8000/api/tools/openai/functions

# Response:
# {
#   "functions": [
#     {
#       "name": "calculator",
#       "description": "Evaluate mathematical expressions",
#       "parameters": {
#         "type": "object",
#         "properties": {
#           "expression": {
#             "type": "string",
#             "description": "Math expression to evaluate"
#           }
#         },
#         "required": ["expression"]
#       }
#     }
#   ]
# }
```

Chat with Tools
```python
import requests

# Get the available tools
tools_response = requests.get(
    "http://localhost:8000/api/tools/openai/functions"
)
tools = tools_response.json()["functions"]

# Chat with tool support
response = requests.post(
    "http://localhost:8000/v1/chat/completions",
    json={
        "model": "qwen-7b",
        "messages": [
            {"role": "user", "content": "What is 15% of 280?"}
        ],
        "tools": tools,
        "tool_choice": "auto"
    }
)

result = response.json()
# The model may return a tool call, which you can then execute
if result["choices"][0].get("tool_calls"):
    tool_call = result["choices"][0]["tool_calls"][0]
    # Execute the tool call...
```

API Reference
| Endpoint | Method | Description |
|---|---|---|
| /api/tools/ | GET | List all available tools |
| /api/tools/execute | POST | Execute a specific tool |
| /api/tools/parse | POST | Parse tool calls from text |
| /api/tools/process | POST | Parse and execute tool calls |
| /api/tools/openai/functions | GET | Get OpenAI-compatible format |
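The endpoints above can be wrapped in a small client for convenience. A sketch of one possible wrapper (the class and method names are our own, not part of ZSE):

```python
# Thin convenience client over the tool endpoints listed above.
# Hypothetical sketch -- adjust base_url and error handling for real use.
import requests

class ZSEToolsClient:
    def __init__(self, base_url: str = "http://localhost:8000"):
        self.base_url = base_url.rstrip("/")

    def _url(self, path: str) -> str:
        return f"{self.base_url}{path}"

    def list_tools(self) -> dict:
        """GET /api/tools/ -- list all registered tools."""
        return requests.get(self._url("/api/tools/")).json()

    def execute(self, tool: str, tool_input) -> dict:
        """POST /api/tools/execute -- run one tool directly."""
        return requests.post(
            self._url("/api/tools/execute"),
            json={"tool": tool, "input": tool_input},
        ).json()

    def parse(self, text: str) -> dict:
        """POST /api/tools/parse -- extract tool calls from text."""
        return requests.post(self._url("/api/tools/parse"),
                             json={"text": text}).json()

    def process(self, text: str) -> dict:
        """POST /api/tools/process -- parse and execute in one step."""
        return requests.post(self._url("/api/tools/process"),
                             json={"text": text}).json()

    def openai_functions(self) -> list:
        """GET /api/tools/openai/functions -- OpenAI-format definitions."""
        return requests.get(
            self._url("/api/tools/openai/functions")
        ).json()["functions"]
```

Usage would look like `ZSEToolsClient().execute("calculator", "2 ** 10")` against a running server.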