Make AIShell an MCP client to expose MCP tools to its agents #392


PR Summary
This PR enables MCP client support in AIShell and also updates the `openai-gpt` agent to consume tools. The detailed changes are:

- Updated the `AIShell.Abstraction.IShell` interface to include the following 2 new methods:
  - `Task<List<AIFunction>> GetAIFunctions()` -- get the available tools for the AI model to use.
  - `Task<FunctionResultContent> CallAIFunction(...)` -- allow an agent to call a tool.

  We use types like `AIFunction`, `FunctionCallContent`, and `FunctionResultContent` from the NuGet package `Microsoft.Extensions.AI.Abstractions` as the general representation of a "tool", a "tool call request", and a "tool call result". This allows an agent to easily leverage the framework library `Microsoft.Extensions.AI`. Agents that don't use that library can still use the exposed tools by calling `CallAIFunction(...)` directly.

- Supported the `$HOME/mcp.json` file for configuring local or remote MCP servers. Here is an example of the configuration format:

  ```json
  {
    "servers": {
      "Everything": {
        "type": "stdio",
        "command": "npx",
        "args": ["-y", "@modelcontextprotocol/server-everything"]
      },
      "GitHub": {
        "type": "http",
        "url": "https://api.githubcopilot.com/mcp/",
        "headers": { "Authorization": "Bearer <github-pat>" }
      }
    }
  }
  ```

  The configuration format is based on that of the VS Code MCP, but is not 100% the same -- there is no `envFile` key and no `inputs` support.

  Server configuration for `stdio` type connections:

  | Key       | Example values |
  |-----------|----------------|
  | `type`    | `"stdio"` |
  | `command` | `"npx"`, `"node"`, `"python"`, `"docker"` |
  | `args`    | `["server.py", "--port", "3000"]` |
  | `env`     | `{"API_KEY": "<key>"}` |

  Server configuration for `sse` or `http` type connections:

  | Key       | Example values |
  |-----------|----------------|
  | `type`    | `"sse"`, `"http"` |
  | `url`     | `"http://localhost:3000"` |
  | `headers` | `{"API_KEY": "${input:api-key}"}` |

- The new MCP components are:

  - `McpConfig`: deserializes the `mcp.json` configuration utilizing source generation.
  - `McpManager`: manages all configured MCP servers and exposes methods to get all available tools, make a tool call, and stop/start a given server.
  - `McpServer`: represents an MCP server. It connects to the server asynchronously, reports the server status, and returns the available tools from the server.
  - `McpTool`: represents an MCP tool. It implements `AIFunction` to hook into the `Microsoft.Extensions.AI` framework, and also makes sure that calls to MCP tools all go through the "tool call request" prompt for the user to approve.
  - The `/mcp` command: shows a table of all servers and their available tools, including the status of each server and an error message if a server fails.

  The MCP components depend on the official .NET MCP SDK `ModelContextProtocol.Core` to handle MCP servers under the hood. Internally, our `McpServer` and `McpTool` wrap `IMcpClient` and `McpClientTool` respectively.

- The `openai-gpt` agent was refactored to use the framework library `Microsoft.Extensions.AI.OpenAI`. This makes it extremely intuitive to consume the `AIFunction` instances exposed by AIShell. The estimated token count tracking code was removed from the agent, because after switching to `Microsoft.Extensions.AI.OpenAI`, the framework library automatically makes the subsequent round of chat requests after a tool call result comes back, and those requests will not go through our existing token count tracking code. It was an estimate anyway, so if needed, we can use the token usage information from the response to help shrink the chat history.

- Two new methods were added to `Host`:
  - `RenderToolCallRequest` -- used by `McpTool` to display a tool call request.
  - `RenderMcpServersAndTools` -- used by the `/mcp` command to display all servers and tools.

- Updated `FancyStreamRender` to work properly when the host writes output while a `FancyStreamRender` instance is still active. It's quite possible for an AI model to return some text before making a tool call; in that case, the tool call request will be rendered while a `FancyStreamRender` instance is still active. A `FancyStreamRender` instance needs to track the initial cursor position in order to refresh the whole output when new text chunks come in, so we need to make it aware of the tool call request output, so that it can adjust its cursor position and refresh its state correctly.

- Updated `LoadContext.cs` to make sure that, for `System.CommandLine.dll` and `Microsoft.Extensions.AI.Abstractions.dll`, all agents share the same assemblies loaded in the default load context. This is because the interfaces exposed in `AIShell.Abstraction` depend on types from those 2 assemblies, so if an agent loads its own version of those assemblies into its custom load context, the contracts will break.

- Simplified the package references.
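The shared-assembly rule for the custom load context can be sketched as follows. This is a minimal sketch, not AIShell's actual `LoadContext.cs`: the class name `AgentLoadContext` is hypothetical, and only the "defer shared assemblies to the default load context" behavior described above is illustrated.

```csharp
using System;
using System.Reflection;
using System.Runtime.Loader;

// Hypothetical sketch: a custom load context for agent plugins that defers
// the two contract assemblies to the default load context, so every agent
// sees the exact same types that AIShell.Abstraction was compiled against.
public sealed class AgentLoadContext : AssemblyLoadContext
{
    // These must resolve to the copies already loaded in the default context.
    private static readonly string[] s_sharedAssemblies =
    [
        "System.CommandLine",
        "Microsoft.Extensions.AI.Abstractions",
    ];

    private readonly AssemblyDependencyResolver _resolver;

    public AgentLoadContext(string agentAssemblyPath)
        : base(name: agentAssemblyPath)
    {
        _resolver = new AssemblyDependencyResolver(agentAssemblyPath);
    }

    protected override Assembly Load(AssemblyName assemblyName)
    {
        // Returning null defers resolution to the default load context,
        // which is what keeps the contract types identical across agents.
        if (Array.Exists(s_sharedAssemblies, n => n == assemblyName.Name))
        {
            return null;
        }

        // Everything else loads into this agent's own isolated context.
        string path = _resolver.ResolveAssemblyToPath(assemblyName);
        return path is null ? null : LoadFromAssemblyPath(path);
    }
}
```

Returning `null` from `AssemblyLoadContext.Load` is the standard way to fall back to the default context; without it, an agent that ships its own copy of `Microsoft.Extensions.AI.Abstractions` would load a second, type-incompatible `AIFunction`.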
Follow-up work

- `GetAIFunctions()` should only update the returned list when tools are changed.
- The `/mcp` command should support the following sub-commands:
  - `/mcp stop <server>` -- stop a server.
  - `/mcp start <server>` -- start a server.
  - `/mcp refresh` -- refresh all servers if the MCP configuration changed.
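To illustrate the agent-side flow described in the summary, here is a hedged sketch of consuming the exposed tools via `Microsoft.Extensions.AI`. The `IShell` method name comes from this PR; the chat-client wiring is an assumption (exact `Microsoft.Extensions.AI` method names have shifted across releases), and `innerClient` is assumed to be supplied by the agent, e.g. an OpenAI-backed `IChatClient` from `Microsoft.Extensions.AI.OpenAI`.

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Extensions.AI;

public static class AgentChatSketch
{
    // Hypothetical helper: send a prompt, letting the framework invoke any
    // tool the model requests and automatically issue the follow-up request.
    public static async Task<string> ChatWithToolsAsync(
        IShell shell, IChatClient innerClient, string prompt)
    {
        // Tools exposed by AIShell's configured MCP servers.
        List<AIFunction> tools = await shell.GetAIFunctions();

        // UseFunctionInvocation() adds the middleware that executes a
        // requested AIFunction and feeds the result back to the model.
        IChatClient client = innerClient
            .AsBuilder()
            .UseFunctionInvocation()
            .Build();

        ChatOptions options = new() { Tools = [.. tools] };
        ChatResponse response = await client.GetResponseAsync(prompt, options);
        return response.Text;
    }
}
```

An agent that does not take the `Microsoft.Extensions.AI` dependency would instead inspect the model's tool call request itself and invoke the tool through `shell.CallAIFunction(...)`, as noted in the summary.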