MCP Implementation

Model Context Protocol (MCP) is an open standard for connecting AI agents with external tools and services. It structures context and commands in a consistent format so agents and tools can interoperate. On aiXplain, MCP enables seamless integration of external tools into your agent, so agents can reason, plan, and act on MCP tool outputs.

With MCP, you can:

  • Integrate tools written in different languages
  • Securely access local or remote infrastructure
  • Scale multi-step AI workflows across agents

How it works

  1. Onboard your MCP server to aiXplain: Register the MCP server as a tool so it becomes discoverable and callable by agents.
  2. Scope the internal actions: Clearly define the set of actions your MCP tool exposes. For optimal performance and reasoning, we recommend limiting each MCP server to around 10 actions per agent.
  3. Add the MCP tool to your agent: Once onboarded, the MCP tool can be added to your agent's toolset in the aiXplain platform.
  4. Runtime execution: During execution, the agent evaluates the tool’s name, description, and input/output schema to decide which tool to use. It then calls the tool as part of its plan. The MCP server runs the function and returns structured results to the agent for continued reasoning and response generation.
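During this process, each action an MCP server exposes is advertised to the agent as a tool descriptor carrying a name, description, and input schema. The entry below is only an illustration of what such a descriptor looks like; the add action is a hypothetical example, not part of the walkthrough that follows.

{
  "name": "add",
  "description": "Add two numbers.",
  "inputSchema": {
    "type": "object",
    "properties": {
      "a": { "type": "number" },
      "b": { "type": "number" }
    },
    "required": ["a", "b"]
  }
}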

Supported transports and languages

STDIO - Low-latency communication between agents and MCP servers running on the same machine.

Languages - Python, Node.js, TypeScript, C#, Swift

Deploying an MCP server as a tool

You can integrate any MCP-compliant server as a callable tool in your aiXplain agent.

Step 1: Find or build your MCP server

You can either create your own MCP server or use an existing one. For example, this repository contains a basic MCP server for Slack.

tip

Each MCP server can expose one or more actions as tools.
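If you build your own server, the sketch below shows a minimal math server written with the official MCP Python SDK (FastMCP) that exposes two actions over STDIO. It is only an illustration of what the aixplain-mcp_math image used in the following steps might package; the file name server.py and the action names are assumptions.

# server.py - a minimal MCP server exposing two math actions over STDIO
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("math")

@mcp.tool()
def add(a: float, b: float) -> float:
    """Add two numbers."""
    return a + b

@mcp.tool()
def multiply(a: float, b: float) -> float:
    """Multiply two numbers."""
    return a * b

if __name__ == "__main__":
    # Serve the tools over STDIO so the agent can launch the container with `docker run -i`
    mcp.run(transport="stdio")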

Step 2: Dockerize the MCP server

Create a Dockerfile and a requirements.txt file for your MCP server.
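For illustration, a minimal pair of files for the Python server sketched in Step 1 could look like the following; the server.py entrypoint and the single mcp dependency are assumptions, so adapt them to your own server.

# Dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY server.py .
# STDIO transport needs no exposed port; expose one only if your server also serves HTTP
CMD ["python", "server.py"]

# requirements.txt
mcp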

Build image:

docker build -t aixplain-mcp_math .

Run locally:

docker run -p 8000:8000 aixplain-mcp_math

Tag + Export:

docker tag aixplain-mcp_math:latest aixplain-mcp_math:1.0.1
docker save -o aixplain-mcp_math.1.0.1.tar aixplain-mcp_math:1.0.1

Step 3: Deploy the MCP server to aiXplain (OnPrem only)

Transfer & Load Docker Image

Once you have transferred the image to the server, load it into Docker:

docker load -i images/aixplain-mcp_math.1.0.1.tar

Register the tool by adding it to mcpservers.json

Navigate to the mcpservers.json configuration file on the aiXplain server and add an entry for the new MCP server:

{
  "mcpServers": {
    "tools": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "aixplain-mcp_math:1.0.1"
      ]
    }
  }
}

note

aiXplain OnPrem automatically reloads configuration changes and connects to the MCP server within 30 seconds.

Once deployed, the MCP server's actions are available as utilities that you can add to your agent's architecture.