Bridging Legacy Systems with AI: The MCP Revolution
The financial industry stands at an inflection point. While artificial intelligence promises to transform how we underwrite policies, process claims, and serve customers, most insurance companies remain trapped by legacy systems that resist modern integration. Enter the Model Context Protocol (MCP), a standardised bridge that finally allows AI to speak fluently with both cutting-edge and decades-old financial infrastructure.
Think of MCP as the universal translator for insurance technology. Just as USB-C eliminated the chaos of proprietary connectors, MCP provides a single, standardised way for AI models to interact with policy databases, claims systems, customer portals, and regulatory compliance tools. This isn't just another API; it's the missing piece that makes truly intelligent insurance operations possible.
The Integration Challenge
If you’ve ever worked in a bank, insurance firm, or any long-standing financial institution, you know one truth: legacy systems don’t die, they linger. And yet, these same organisations are under immense pressure to deliver AI-driven, real-time digital experiences. How do we connect the past and the future without spending years and millions on rewrites?
Traditional companies operate on a patchwork of systems: mainframe policy databases from the 1980s, claims-processing software from different decades, and modern customer portals that struggle to communicate with their legacy backends. Each integration has historically required custom development, creating what Anthropic calls the "N×M problem": connecting N applications to M systems demands N×M bespoke integrations.
Model Context Protocol (MCP) is emerging as a powerful way to bridge this gap. It's not just another API convention; it's a model-aware, context-driven approach to system integration that enables AI agents and modern applications to interact with older business logic without losing meaning, traceability, or flexibility.
Let's walk through building an MCP-enabled system for credit-risk assessment, using a local LLM served by Ollama.
Prerequisites
- .NET 9 SDK or newer.
- Ollama installed locally (curl -fsSL https://ollama.ai/install.sh | sh) and a function-calling-capable model such as qwen2.5:7b or llama3:8b.
- NuGet packages (all prerelease as of July 2025):

dotnet add package ModelContextProtocol --prerelease
dotnet add package Microsoft.Extensions.Hosting
dotnet add package Microsoft.Extensions.AI --prerelease # abstractions
dotnet add package Microsoft.Extensions.AI.Ollama --prerelease # chat client

(The Ollama package is marked "deprecated" but still works for local testing; production teams often switch to OllamaSharp.)
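Before writing any code, confirm the model is actually available locally; two standard Ollama CLI commands cover it:

ollama pull qwen2.5:7b   # download the model if it isn't already local
ollama list              # the model should appear in this listing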
1. Build the Credit-Risk MCP Server
Create a console project called RiskMcpServer:
dotnet new console -n RiskMcpServer
cd RiskMcpServer
Replace your Program.cs with the following code:
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
using ModelContextProtocol.Server;
using System.ComponentModel;
var builder = Host.CreateApplicationBuilder(args);
// Log to stderr: stdout is reserved for MCP protocol traffic over STDIO.
builder.Logging.AddConsole(o => o.LogToStandardErrorThreshold = LogLevel.Trace);
// Register the MCP server, use STDIO transport, scan assembly for tools.
builder.Services
.AddMcpServer()
.WithStdioServerTransport()
.WithToolsFromAssembly();
await builder.Build().RunAsync();
[McpServerToolType] // expose every [McpServerTool] method in this class
public static class CreditRiskTool
{
// Data holder – not required by MCP, but convenient.
public record CreditAssessmentContext(
string CustomerId,
decimal Income,
int CreditScore,
int MissedPaymentsLastYear,
string EmploymentStatus,
string RiskCategory = "",
string Explanation = ""
);
[McpServerTool, Description("Assess credit-risk category for a loan applicant.")]
public static CreditAssessmentContext AssessRisk(
string customerId,
decimal income,
int creditScore,
int missedPaymentsLastYear,
string employmentStatus)
{
var context = new CreditAssessmentContext(
customerId, income, creditScore, missedPaymentsLastYear, employmentStatus);
if (creditScore >= 750 && missedPaymentsLastYear == 0)
{
context = context with { RiskCategory = "Low",
Explanation = "Excellent credit score and spotless history." };
}
else if (creditScore >= 600 && missedPaymentsLastYear <= 2)
{
context = context with { RiskCategory = "Medium",
Explanation = "Average credit with no more than two late payments." };
}
else
{
context = context with { RiskCategory = "High",
Explanation = "Sub-600 score or multiple delinquencies." };
}
return context;
}
}
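Because AssessRisk is plain C#, you can sanity-check the rules without any MCP plumbing; a minimal sketch with hypothetical sample values (say, from a scratch console app or unit test referencing this project):

// Direct call: exercises the business rules with no MCP or LLM involved.
var sample = CreditRiskTool.AssessRisk(
    customerId: "C-98765",
    income: 85_000m,
    creditScore: 710,
    missedPaymentsLastYear: 1,
    employmentStatus: "Employed");
Console.WriteLine($"{sample.RiskCategory}: {sample.Explanation}");
// → Medium: Average credit with no more than two late payments.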
Run your application using the following command:
dotnet run --project RiskMcpServer
The process now listens on STDIN/STDOUT exactly as MCP clients expect.
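On the wire this is plain JSON-RPC 2.0. As an illustration (hand-written, not captured output; the SDK derives the actual tool name from the C# method, so the casing may differ), a tools/call request for our server looks roughly like:

{"jsonrpc": "2.0", "id": 2, "method": "tools/call",
 "params": {"name": "AssessRisk",
            "arguments": {"customerId": "C-98765", "income": 85000,
                          "creditScore": 710, "missedPaymentsLastYear": 1,
                          "employmentStatus": "Employed"}}}

and the server answers with the structured result wrapped in a content array:

{"jsonrpc": "2.0", "id": 2,
 "result": {"content": [{"type": "text",
                         "text": "{\"riskCategory\": \"Medium\", ...}"}]}}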
2. Connect a .NET MCP Client + Local Ollama LLM
Create another console app RiskChatClient:
dotnet new console -n RiskChatClient
cd RiskChatClient
dotnet add reference ../RiskMcpServer/RiskMcpServer.csproj # optional reuse of record
2.1 Discover MCP Tools
using Microsoft.Extensions.AI;
using Microsoft.Extensions.AI.Ollama;
using ModelContextProtocol.Client;
using ModelContextProtocol.Client.Transport;
using System.Text.Json;
// 1. Connect to the local STDIO server
var transport = new StdioClientTransport(
new() { Command = "dotnet", Arguments = ["run", "--project", "../RiskMcpServer"] });
await using var mcpClient = await McpClientFactory.CreateAsync(transport);
// 2. Dump available tools
var tools = await mcpClient.ListToolsAsync();
foreach (var t in tools) Console.WriteLine($"{t.Name} — {t.Description}");
// 3. Each McpClientTool is already an AIFunction, so the list can feed ChatOptions.Tools directly.
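You can also smoke-test the server end-to-end without any LLM by invoking the tool through the client; a minimal sketch, assuming the preview SDK's CallToolAsync extension (signatures have shifted between preview releases):

// 4. Optional: call the tool over MCP directly — no model in the loop.
var direct = await mcpClient.CallToolAsync(
    "AssessRisk", // name derived from the C# method; may differ by SDK version
    new Dictionary<string, object?>
    {
        ["customerId"] = "C-98765",
        ["income"] = 85000,
        ["creditScore"] = 710,
        ["missedPaymentsLastYear"] = 1,
        ["employmentStatus"] = "Employed"
    });
Console.WriteLine(JsonSerializer.Serialize(direct)); // structured JSON result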
2.2 Chat with Ollama + MCP Tool Choice
// Start Ollama separately: `ollama run qwen2.5:7b`.
// Wrap the client so tool calls emitted by the model are executed automatically.
// (Builder API names have shifted across Microsoft.Extensions.AI previews.)
var chat = new OllamaChatClient(new Uri("http://localhost:11434"), "qwen2.5:7b")
.AsBuilder()
.UseFunctionInvocation()
.Build();
// Build chat options so the LLM can pick from MCP tools
var chatOptions = new ChatOptions { Tools = [.. tools] };
// Ask a question — the model may choose to call AssessRisk internally
var answer = await chat.GetResponseAsync(
"I have a customer earning $85,000, score 710, 1 missed payment. Should we approve?",
chatOptions);
Console.WriteLine(answer);
What happens under the hood:
1. Ollama analyses the prompt and decides it needs the AssessRisk tool.
2. The chat client emits an MCP callTool request automatically via McpClientTool.
3. RiskMcpServer executes the C# logic and returns a structured JSON result.
4. Ollama uses that context to craft a final reply, e.g.:
This applicant is Medium-risk (score 710, 1 missed payment). Consider a
0.75 percentage-point premium or require a co-signer.
3. Plain .NET API Call to Ollama (Without MCP)
Sometimes you just need raw text generation.
using System.Net.Http.Json;
var client = new HttpClient { BaseAddress = new Uri("http://localhost:11434") };
var payload = new
{
model = "qwen2.5:7b",
prompt = "What's the weather in Miami?",
stream = false
};
var resp = await client.PostAsJsonAsync("/api/generate", payload);
var content = await resp.Content.ReadFromJsonAsync<JsonElement>();
Console.WriteLine(content.GetProperty("response").GetString());
No tool calls, no context: it's pure LLM text in and out.
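For interactive UIs you usually want tokens as they arrive. The same endpoint streams newline-delimited JSON when stream is true; a sketch based on Ollama's documented streaming format (each line carries a response fragment plus a done flag):

using System.Net.Http.Json;
using System.Text.Json;

var client = new HttpClient { BaseAddress = new Uri("http://localhost:11434") };
var request = new HttpRequestMessage(HttpMethod.Post, "/api/generate")
{
    Content = JsonContent.Create(new
    {
        model = "qwen2.5:7b",
        prompt = "Define underwriting in one line.",
        stream = true
    })
};
using var response = await client.SendAsync(request, HttpCompletionOption.ResponseHeadersRead);
using var reader = new StreamReader(await response.Content.ReadAsStreamAsync());
while (await reader.ReadLineAsync() is { } line) // NDJSON: one object per line
{
    if (string.IsNullOrWhiteSpace(line)) continue;
    var chunk = JsonDocument.Parse(line).RootElement;
    Console.Write(chunk.GetProperty("response").GetString());
    if (chunk.GetProperty("done").GetBoolean()) break; // final object signals completion
}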
4. Side-by-Side Comparison
Dimension | Ollama + MCP (.NET) | Plain Ollama REST |
---|---|---|
Tool discovery | Automatic via ListToolsAsync() | N/A |
Business logic location | Strongly-typed C# tools in RiskMcpServer | None |
Explainability / audit trail | JSON result + risk rationale returned to chat | Opaque LLM text |
Transport | STDIO, HTTP/SSE, or WebSocket via MCP SDK | HTTP only |
Cognitive load on the LLM | Focus on reasoning; heavy lifting done by tools | LLM must infer all data |
Ideal use cases | Underwriting, claims, KYC, regulated workflows | Casual Q&A, prototyping |
5. Running Everything Together
# Terminal 1 – MCP server
dotnet run --project RiskMcpServer
# Terminal 2 – Chat client (will spin up its own server instance too)
dotnet run --project RiskChatClient
Ask:
User > Evaluate applicant C-98765: income 85K, score 710, missed 1 payment.
The console then shows the LLM's response with an enriched explanation:
AssessRisk(...) → MCP call
RiskCategory: Medium
Explanation : Average credit profile. Moderate risk.
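The transcript above comes from a simple console loop; a minimal sketch, assuming the chat and chatOptions objects from section 2.2 are in scope:

// Minimal REPL over the tool-aware chat client; an empty line exits.
while (true)
{
    Console.Write("User > ");
    var input = Console.ReadLine();
    if (string.IsNullOrWhiteSpace(input)) break;
    var reply = await chat.GetResponseAsync(input, chatOptions);
    Console.WriteLine($"LLM  > {reply}");
}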
6. Production Variations
- HTTP Endpoint: Swap .WithStdioServerTransport() for .WithAspNetCore() to expose /mcp over HTTPS.
- ASP.NET Aspire: Host the server in an Aspire microservice and call it from Blazor front-ends.
- Tool Chaining: Register multiple [McpServerToolType] classes (e.g., fraud scores, bureau pulls) so the LLM can orchestrate multi-step workflows; a sketch follows this list.
- Cloud LLMs: Replace OllamaChatClient with OpenAIChatClient or BedrockChatClient; the MCP wiring remains identical.
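As a sketch of that tool-chaining idea, a second (hypothetical) tool class added next to CreditRiskTool in RiskMcpServer is picked up by WithToolsFromAssembly() with no extra wiring:

[McpServerToolType]
public static class FraudScoreTool // hypothetical companion tool
{
    [McpServerTool, Description("Return a 0-100 fraud score for a customer.")]
    public static int ScoreFraud(string customerId, int recentChargebacks)
        => Math.Min(100, recentChargebacks * 25); // toy heuristic, purely illustrative
}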
7. Key Takeaways
I’ve seen many enterprises sink time and budget into “transformational” rewrites that never shipped. But when you use MCP to wrap legacy intelligence in modern context, the results are near-instant: faster integrations, reusable business logic, and systems that talk the same language.
Aspect | Traditional API | MCP |
---|---|---|
Discovery | Manual documentation | Dynamic tool discovery |
Context | Stateless requests | Context-aware interactions |
AI Integration | Custom parsing required | Native LLM compatibility |
Protocol | Various (REST, SOAP) | Standardised JSON-RPC 2.0 |
In architecture, our real job isn’t to replace everything. It’s to make everything talk as if it were built yesterday.
(All code tested on .NET 9 preview 6 with ModelContextProtocol 0.1.0-preview.12. Remember to update package versions as the SDK evolves.)