Quick Start: State Transfer with MCP

In this quick start, we'll demonstrate how fast-agent can transfer state between two agents using MCP Prompts.

Welcome Image

First, we'll start agent_one as an MCP Server, and send it some messages with the MCP Inspector tool.

Next, we'll run agent_two and transfer the conversation from agent_one using an MCP Prompt.

Finally, we'll take a look at fast-agent's prompt-server and how it can assist in building agent applications.

You'll need API Keys to connect to a supported model, or use Ollama's OpenAI compatibility mode to use local models.

The quick start also uses the MCP Inspector - check its documentation for installation instructions.

Step 1: Setup fast-agent

Linux / macOS:

# create, and change to a new directory
mkdir fast-agent && cd fast-agent

# create and activate a python environment
uv venv
source .venv/bin/activate

# setup fast-agent
uv pip install fast-agent-mcp

# create the state transfer example
fast-agent quickstart state-transfer

Windows:

# create, and change to a new directory
md fast-agent
cd fast-agent

# create and activate a python environment
uv venv
.venv\Scripts\activate

# setup fast-agent
uv pip install fast-agent-mcp

# create the state transfer example
fast-agent quickstart state-transfer

Change to the state-transfer directory (cd state-transfer), rename fastagent.secrets.yaml.example to fastagent.secrets.yaml and enter the API Keys for the providers you wish to use.

The supplied fastagent.config.yaml file contains a default of gpt-4o - edit this if you wish.

Finally, run uv run agent_one.py and send a test message to make sure that everything is working. Enter stop to return to the command line.

Testing the Agent
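
For reference, the generated agent_one.py follows the same pattern used throughout fast-agent: a FastAgent application, an @fast.agent decorator, and an interactive main loop. A rough sketch of its shape, assuming the default template (the generated file may differ in details such as the application name):

import asyncio

from mcp_agent.core.fastagent import FastAgent

# illustrative application name - the generated file may use a different one
fast = FastAgent("fast-agent agent_one")

# define a single agent with a simple instruction
@fast.agent(name="agent_one", instruction="You are a helpful AI Agent")
async def main():
    async with fast.run() as agent:
        # start an interactive chat session on the command line
        await agent.interactive()

if __name__ == "__main__":
    asyncio.run(main())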

Step 2: Run agent one as an MCP Server

To start "agent_one" as an MCP Server, run the following command:

# start agent_one as an MCP Server:
uv run agent_one.py --server --transport sse --port 8001

The agent is now available as an MCP Server.

Note

This example starts the server on port 8001. To use a different port, update the URLs in fastagent.config.yaml and the MCP Inspector.
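
If you want to confirm the server is listening before connecting a client, a quick standard-library check like the following works (the host and port match the command above):

# quick connectivity check for the agent_one MCP Server
import socket

# attempt a TCP connection to the host/port used in Step 2
with socket.create_connection(("localhost", 8001), timeout=2):
    print("agent_one is accepting connections on port 8001")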

Step 3: Connect and chat with agent one

From another command line, run the Model Context Protocol inspector to connect to the agent:

# run the MCP inspector
npx @modelcontextprotocol/inspector

Choose the SSE transport type, and the URL http://localhost:8001/sse. After clicking the connect button, you can interact with the agent from the Tools tab. Use the agent_one_send tool to send the agent a chat message and see its response.

Using the Inspector to Chat

The conversation history can be viewed from the prompts tab. Use the agent_one_history prompt to view it.

Disconnect the Inspector, then press ctrl+c in the command window to stop the process.
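
If you prefer a scripted client to the Inspector, the MCP Python SDK (the mcp package on PyPI) can call the same tool and prompt. A minimal sketch, assuming the agent_one_send tool takes a single message argument - check the Inspector's Tools tab for the exact schema:

# scripted equivalent of the Inspector session above, using the MCP Python SDK
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main():
    # connect to agent_one over SSE (same URL as the Inspector)
    async with sse_client("http://localhost:8001/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # send a chat message via the agent_one_send tool
            result = await session.call_tool(
                "agent_one_send", {"message": "Hello, agent one!"}
            )
            print(result.content)

            # retrieve the conversation history exposed as an MCP Prompt
            history = await session.get_prompt("agent_one_history")
            for message in history.messages:
                print(message.role, message.content)

if __name__ == "__main__":
    asyncio.run(main())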

Step 4: Transfer the conversation to agent two

We can now transfer and continue the conversation with agent_two.

Run agent_two with the following command:

# start agent_two:
uv run agent_two.py

Once started, type '/prompts' to see the available prompts. Select 1 to apply the Prompt from agent_one to agent_two, transferring the conversation context.

You can now continue the chat with agent_two (potentially using different Models, MCP Tools or Workflow components).

Transferred Chat

Configuration Overview

fast-agent uses the following configuration file to connect to the agent_one MCP Server:

fastagent.config.yaml
# MCP Servers
mcp:
    servers:
        agent_one:
            transport: sse
            url: http://localhost:8001

agent_two then references the server in its definition:

agent_two.py
# Define the agent
@fast.agent(name="agent_two",
            instruction="You are a helpful AI Agent",
            servers=["agent_one"])
async def main():
    # use the --model command line switch or agent arguments to change model
    async with fast.run() as agent:
        await agent.interactive()

Step 5: Save/Reload the conversation

fast-agent gives you the ability to save and reload conversations.

Enter ***SAVE_HISTORY history.json in the agent_two chat to save the conversation history in MCP GetPromptResult format.

You can also save it in a text format for easier editing.

Prompt Picker

By using the supplied MCP prompt-server, we can reload the saved prompt and apply it to our agent. Add the following to your fastagent.config.yaml file:

fastagent.config.yaml
# MCP Servers
mcp:
    servers:
        prompts:
            command: prompt-server
            args: ["history.json"]
        agent_one:
            transport: sse
            url: http://localhost:8001

And then update agent_two.py to use the new server:

agent_two.py
# Define the agent
@fast.agent(name="agent_two",
            instruction="You are a helpful AI Agent",
            servers=["prompts"])

Run uv run agent_two.py, and you can then use the /prompts command to load the earlier conversation history, and continue where you left off.

Note that Prompts can contain any of the MCP Content types, so Images, Audio and other Embedded Resources can be included.

You can also use the Playback LLM to replay an earlier chat (useful for testing!).
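
For example, assuming your fast-agent version includes the built-in playback model (the --model switch is the same one mentioned in the agent_two.py comment above), something like this replays the saved assistant responses instead of calling a live model:

# start agent_two with the playback model, then load the saved history with /prompts
uv run agent_two.py --model playback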