What We’ve All Been Waiting For: MCPs

Back in 2021 or so, I started playing around with AI—not GenAI, but traditional ML models in AWS. Building, experimenting, figuring things out. Then 2023 hit, and we saw the first real wave of LLM-driven GenAI. It was good—sort of.

But have you ever had that feeling that something’s still missing?

We’ve all been moving forward—building, integrating, optimizing—but still waiting for that missing piece to truly elevate the GenAI experience. So the industry gave us agents, agentic workflows, whatever buzzword you prefer—but at the end of the day, it’s just smarter automation.

Well, now there’s MCPs.

For those of us deep in data interoperability, compliance, and system integration, Model Context Protocol (MCP) servers—MCPs, for short—are finally stepping up in a way that actually matters. They bring structure to multi-agent AI workflows: a standard way to give models context, memory, and tool access, with real interoperability across models and systems.
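Under the hood, MCP is a JSON-RPC 2.0 protocol: a client asks a server what tools it exposes (`tools/list`) and then invokes one (`tools/call`). Here is a rough sketch of what a tool-call exchange looks like on the wire—the method names follow the spec, but the `provision_server` tool and its arguments are invented for illustration:

```python
import json

# Hypothetical tools/call request. The envelope (jsonrpc, id, method,
# params.name, params.arguments) follows MCP's JSON-RPC shape; the tool
# name and argument values are made up for this example.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "provision_server",  # hypothetical tool
        "arguments": {"count": 3, "region": "us-east-1"},
    },
}

# The matching result carries the tool's output as content blocks.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "content": [{"type": "text", "text": "3 servers provisioned"}],
    },
}

print(json.dumps(request, indent=2))
```

The point is that the contract is plain, inspectable JSON—any model on one side, any system on the other.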

Use Cases: Why This Matters

AI-Driven Infrastructure – Imagine telling an AI:
“Set up three servers, a load balancer, and a database in AWS.”
MCPs make automated cloud provisioning a reality.

Enterprise Automation – O365 admins could say:
“Onboard a new employee: create their email, Teams account, and assign permissions.”
MCPs streamline IT workflows, cutting manual work.

AI-Powered Personal Assistants – With an MCP layer on a mobile device, users could say:
“Book me a flight, a hotel near the beach, and a dinner reservation for two.”
MCPs orchestrate real-world tasks, integrating flight, hotel, and restaurant bookings behind a single request.
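Stripped to its core, the pattern behind all three use cases is the same: a server registers tools with descriptions, the model lists them, picks one, and calls it. A stdlib-only Python sketch of that registry-and-dispatch loop—this mirrors MCP's tools/list and tools/call pattern but is not the official SDK, and tool names like `book_flight` are invented for illustration:

```python
from typing import Any, Callable

# Toy tool registry illustrating the MCP pattern: register tools with
# metadata, let a client discover them, then dispatch calls by name.
TOOLS: dict[str, dict[str, Any]] = {}

def tool(name: str, description: str) -> Callable:
    """Decorator that registers a function as a callable tool."""
    def wrap(fn: Callable) -> Callable:
        TOOLS[name] = {"description": description, "fn": fn}
        return fn
    return wrap

@tool("book_flight", "Book a flight to a destination")  # hypothetical tool
def book_flight(destination: str) -> str:
    return f"Flight booked to {destination}"

@tool("book_hotel", "Reserve a hotel room")  # hypothetical tool
def book_hotel(city: str, near: str) -> str:
    return f"Hotel booked in {city} near the {near}"

def list_tools() -> list[str]:
    """Roughly what a client sees from tools/list."""
    return sorted(TOOLS)

def call_tool(name: str, arguments: dict[str, Any]) -> str:
    """Roughly what happens on tools/call: dispatch by name."""
    return TOOLS[name]["fn"](**arguments)

print(list_tools())
print(call_tool("book_flight", {"destination": "Lisbon"}))
```

Swap the toy registry for a real MCP server and the dispatch target for AWS, Microsoft 365, or a travel API, and you have the use cases above.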

This isn’t just a shift—it’s a correction.

And for those of us who’ve been watching, waiting, and building, this is exactly what we’ve been gearing up for.

MCPs, it’s go time.