
As AI agents become central to modern orchestration, the need for standardized, tool-exposing servers grows. Enter the Model Context Protocol (MCP), a JSON-RPC-based protocol that lets you define reusable tools and resources for LLMs to invoke. In this guide, we’ll walk through building a minimal MCP server using C# and .NET 8.
⚙️ What Is MCP?
MCP lets you expose structured tools (like getCustomer, listProducts, or runDiagnostics) to AI clients via a unified protocol. It addresses the N×M integration problem: instead of writing a custom integration for every client-server pair, any MCP-compliant client can talk to any MCP-compliant server.
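Under the hood, every tool invocation travels as a JSON-RPC message. As a rough sketch (the field values are illustrative), a client calling the getCustomer tool we build below would send something like:
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "getCustomer",
    "arguments": { "id": "1" }
  }
}
The server runs the mapped method and replies with a JSON-RPC result carrying the same id, which is what lets any client and server pair interoperate without bespoke glue code.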
🧰 Prerequisites
- .NET 8 SDK
- Visual Studio or VS Code
- NuGet access
- Optional: GitHub Copilot extension (for testing AI integration)
🏗️ Step-by-Step Setup
1. Create the Project
dotnet new webapi -n McpDemoServer
cd McpDemoServer
2. Add the MCP SDK
dotnet add package Anthropic.Mcp.Server
🧩 Define Your Data Model
Create a simple model in Models/Customer.cs:
namespace McpDemoServer.Models;

public record Customer(string Id, string Name, string Email);
🛠️ Define the Service Interface
In Services/ICustomerService.cs:
using McpDemoServer.Models;

namespace McpDemoServer.Services;

public interface ICustomerService
{
    Task<Customer?> GetCustomerByIdAsync(string id);
    Task<IEnumerable<Customer>> ListCustomersAsync();
}
🧪 Implement the Service
In Services/InMemoryCustomerService.cs:
using McpDemoServer.Models;

namespace McpDemoServer.Services;

public class InMemoryCustomerService : ICustomerService
{
    // Seed data standing in for a real data store.
    private static readonly List<Customer> _data = new()
    {
        new Customer("1", "Alice", "alice@example.com"),
        new Customer("2", "Bob", "bob@example.com")
    };

    public Task<Customer?> GetCustomerByIdAsync(string id) =>
        Task.FromResult(_data.FirstOrDefault(c => c.Id == id));

    public Task<IEnumerable<Customer>> ListCustomersAsync() =>
        Task.FromResult<IEnumerable<Customer>>(_data);
}
🔌 Configure MCP in Program.cs
using Anthropic.Mcp.Server;
using McpDemoServer.Models;
using McpDemoServer.Services;

var builder = WebApplication.CreateBuilder(args);

// Register the backing service the MCP tools will call into.
builder.Services.AddSingleton<ICustomerService, InMemoryCustomerService>();

builder.Services.AddMcpServer(options =>
{
    // Expose the Customer record as an MCP resource type.
    options.AddResource<Customer>("customer");

    // Map each tool name to the service method that implements it.
    options.AddTool("getCustomer", typeof(ICustomerService)
        .GetMethod(nameof(ICustomerService.GetCustomerByIdAsync))!);
    options.AddTool("listCustomers", typeof(ICustomerService)
        .GetMethod(nameof(ICustomerService.ListCustomersAsync))!);
});

var app = builder.Build();

// Serve the MCP endpoint over HTTP at /mcp.
app.MapMcpServer("/mcp");
app.Run();
🚀 Run the Server
dotnet run
Your MCP server is now live at http://localhost:5000/mcp (the exact port comes from Properties/launchSettings.json, so check the console output), exposing two tools: getCustomer and listCustomers.
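Before wiring up an AI client, you can sanity-check the endpoint with a plain JSON-RPC request, for example listing the registered tools. The snippet below is a sketch that assumes the default HTTP transport; depending on how the SDK negotiates the connection, you may also need an Accept header for text/event-stream:
curl -X POST http://localhost:5000/mcp \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/list"}'
If everything is wired up, the response should list getCustomer and listCustomers under a tools array.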
🧠 Bonus: Testing with GitHub Copilot
If you're using GitHub Copilot in VS Code, you can point it at your running MCP server. Since this server speaks MCP over HTTP rather than stdio, register it as an HTTP server (start it with dotnet run first, and adjust the URL to match your port):
// .vscode/mcp.json
{
  "servers": {
    "McpDemoServer": {
      "type": "http",
      "url": "http://localhost:5000/mcp"
    }
  }
}
Then prompt Copilot with:
“Get customer with ID 1”
It will invoke your tool and return structured output.
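For reference, MCP tool results wrap their payload in a content array. The exact serialization depends on the SDK, but the response to the prompt above would look roughly like this:
{
  "content": [
    {
      "type": "text",
      "text": "{ \"id\": \"1\", \"name\": \"Alice\", \"email\": \"alice@example.com\" }"
    }
  ]
}
Copilot then surfaces that structured payload in its answer instead of guessing at the data.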
🔮 Final Thoughts
MCP servers make AI orchestration modular, scalable, and LLM-friendly. Whether you're building HR tools, medical directories, or embedded diagnostics, MCP lets you expose your logic in a way that AI agents can understand and invoke.
Let’s make your AI orchestration truly plug-and-play.