AI Over 40 Series - Week 21: What MCP Servers Are (and Why I’m Building My Own)

The breakthrough this week wasn’t technical. It was mental.

After 20 weeks of exploring AI as a business leader—not a developer—I finally understood how to think about MCP servers. More importantly, I realized I may need to build one myself.

If you’ve heard the term “MCP server” and quietly nodded without fully understanding it, this article is for you.

What is an MCP server?

Earlier in this series, I described AI as a master improv actor—able to play any role you need.

But there’s a limitation. Your actor only knows what you tell them. Every conversation starts with you providing context, uploading files, or pasting data.

An MCP server changes that.

MCP stands for Model Context Protocol. In practice, it’s a bridge that lets your AI assistant access external systems directly.

Without an MCP server, you export a report, upload it into Claude or ChatGPT, explain the columns, and then ask your question.

With an MCP server, you ask the question, and the AI retrieves the data directly from the source system.

That’s not just convenience. It’s the difference between working from static snapshots and accessing live business data.
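For readers curious what that bridge looks like on the wire: MCP is built on JSON-RPC 2.0, and when an AI assistant invokes a tool on an MCP server, it sends a small structured request. Here's a minimal sketch of that envelope — the tool name `get_sales_report` and its argument are hypothetical, but the `jsonrpc`/`method`/`params` shape follows the Model Context Protocol specification.

```python
import json

# Sketch of the JSON-RPC 2.0 message an AI client sends to call a tool
# on an MCP server. The "tools/call" method is part of the MCP spec;
# the tool name and arguments below are hypothetical examples.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_sales_report",          # hypothetical tool
        "arguments": {"month": "2025-01"},   # hypothetical argument
    },
}

# Serialize to the JSON string that actually travels between client and server
wire_message = json.dumps(request)
print(wire_message)
```

The point isn't the syntax — it's that the AI no longer needs you to paste data in. It asks the server, and the server answers from the live system.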

My first experience with Microsoft Learn

My first meaningful exposure to MCP servers was through the Microsoft Learn MCP server.

For our developers, consultants, business analysts, project managers, and support team, it has been transformative. Instead of manually searching documentation and piecing together answers, we can bring Microsoft’s published guidance directly into our AI conversations.

We can validate customization strategies. We can confirm whether functionality already exists before building something new. We can align our designs with Microsoft’s principles.

The AI doesn’t just search documents. It synthesizes guidance in the context of our specific client situation.

It became essential to how we work.

But it also shaped my thinking in a limiting way. I began to assume MCP servers were primarily for knowledge bases.

The abstraction gap

My next experiment was the Zapier MCP server. That’s where I ran into friction.

I assumed the MCP server would expose everything available through Zapier’s native connectors. It didn’t.

When I built my Outlook-to-Todoist automation in Week 18, I couldn’t use the MCP server. I had to build the automation directly in Zapier because the MCP layer exposed only a subset of functionality.

That’s when I understood something important: an MCP server does not have full API access. It’s an abstraction layer built on top of an API.

Whoever builds the MCP server decides what to expose. Their priorities may not match yours.

I call this the abstraction gap—the distance between what a system can do and what an MCP server allows AI to access.

This means an MCP server doesn’t guarantee full system access. Native integrations may offer more capabilities. And if a published MCP server doesn’t solve your problem, you’re back to finding alternatives.

It also means you should only use MCP servers from trusted sources. Given the right permissions, they can create, read, update, and delete records.
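The abstraction gap is easy to picture as a set difference: everything the underlying API can do, minus what the MCP server's author chose to register as tools. All the operation names below are illustrative, not taken from any real product.

```python
# Sketch of the abstraction gap: the underlying API supports many
# operations, but the MCP server exposes only a chosen subset as tools.
# All names here are hypothetical, for illustration only.

UNDERLYING_API = {
    "create_task", "read_task", "update_task", "delete_task",
    "list_projects", "move_task", "set_reminder",
}

# What the (hypothetical) published MCP server chose to expose:
EXPOSED_TOOLS = {"create_task", "read_task", "list_projects"}

# Everything the system can do that the AI cannot reach through this server
abstraction_gap = UNDERLYING_API - EXPOSED_TOOLS
print(sorted(abstraction_gap))
```

If the operation you need falls in that gap, no amount of prompting will surface it — you either use the native integration or build your own server.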

Where I got stuck

After the success with Microsoft Learn and the disappointment with Zapier, my thinking narrowed.

I viewed MCP servers as advanced connections to specialized knowledge bases. If I couldn’t identify another “knowledge base” to connect, I stopped exploring.

Meanwhile, I was spending hours each month on manual work.

The report problem

My budgeting and forecasting analysis lives in Business Central and other systems. The data exists. But accessing answers required waiting for someone else to build reports—or exporting data and manipulating it myself.

When waiting wasn’t an option, I would export data, open Excel, copy and paste, write formulas, restructure accounts, and rebuild analysis from scratch. Every month, something changed. New accounts. Updated requirements. Different structures.

I had normalized that inefficiency.

The breakthrough

One afternoon, while manipulating yet another spreadsheet, it clicked.

The data in Business Central is already available through APIs and OData feeds. I have been manually extracting and transforming information that is already programmatically accessible.

I don’t need to wait for someone else to publish an MCP server for my specific needs. I could potentially build one.

A custom MCP server that connects directly to the data sources I use would allow me to query live data from within an AI conversation, provide business context, iterate on analysis in real time, and maintain month-over-month continuity rather than starting from scratch.
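What would one of those tools actually do? Mostly, assemble an OData query against the source system. Here's a sketch of that step, assuming a Business Central-style API base URL (the tenant placeholder and entity name are illustrative; `$select` and `$filter` are standard OData query options).

```python
from urllib.parse import urlencode

def build_odata_query(base_url: str, entity: str,
                      select=None, filter_expr=None) -> str:
    """Build an OData v4 query URL.

    The base URL is illustrative of Business Central's API shape;
    substitute your own tenant and environment. $select and $filter
    are standard OData query options.
    """
    params = {}
    if select:
        params["$select"] = ",".join(select)
    if filter_expr:
        params["$filter"] = filter_expr
    # safe="$," keeps the OData option names readable in the URL
    query = urlencode(params, safe="$,") if params else ""
    return f"{base_url}/{entity}" + (f"?{query}" if query else "")

url = build_odata_query(
    "https://api.businesscentral.dynamics.com/v2.0/{tenant}/production/api/v2.0",
    "generalLedgerEntries",  # hypothetical entity name
    select=["accountNumber", "amount", "postingDate"],
    filter_expr="postingDate ge 2025-01-01",
)
print(url)
```

A custom MCP server would wrap a handful of queries like this as tools, so the AI can fetch exactly the ledger slices my analysis needs instead of me exporting and reshaping them by hand.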

This is not about novelty. It’s about leverage.

The honest caveat

I haven’t built this yet.

Knowing something is theoretically possible is different from proving a non-technical business leader can accomplish it with AI guidance. This is more ambitious than previous automations I’ve tackled.

But I’m committing to trying. I’ll use AI as my advisory board to help me design and build a custom MCP server in Azure that connects to the data I need.

If I succeed, I’ll document how. If I fail, I’ll share what I learned. Either way, it moves the conversation forward.

The consumer-to-creator shift

The real insight this week isn’t technical. It’s mindset.

For months, I was waiting for someone else to build reports. Waiting for someone else to publish the right MCP server. Filling the gap with manual Excel work because that was familiar.

Now I’m asking a different question: if the data has an API, why can’t I connect AI to it directly?

That shift—from consumer to creator—is where real leverage begins.

A note on platforms

I’ve referenced Claude specifically because it currently makes MCP connections accessible directly within the chat interface.

Other platforms allow similar capabilities but with more friction. Gemini typically requires an integrated development environment or a custom application. ChatGPT requires custom connectors in development mode, which disables memory. Microsoft Copilot requires extension through Copilot Studio.

All can support advanced integrations, but Claude currently lowers the barrier for business leaders who are not developers.

Your week 21 challenge: map your data sources

This week, take a practical step.

Identify the manual reports you rebuild regularly. Trace where that data actually lives. Consider whether an MCP server already exists—and whether it exposes what you truly need. Then ask a different question: could I build one?

Start with a single recurring business question that consumes time each month. That’s your entry point.

The bottom line

MCP servers aren’t magic. They are bridges between AI and your business systems.

The bridges others build may not go where you need to go.

But you don’t have to wait for someone else to build yours.

I don’t yet know if I can pull this off. But I’m done waiting to find out.

This post is part of my “AI Over 40” series. It first appeared on LinkedIn: AI for the Over 40 [Week 21]: What MCP Servers Actually Are and Why I’m Building My Own.

Next Up: When everything is an agent, nothing is an agent.
