If you’ve not been following the rise of the Model Context Protocol (MCP) over the last year, you’ve missed a key development in the adoption of generative AI. MCP servers are becoming a standard way to expose applications and data to LLMs, AI agents, and chat users. The MCP project gives engineers a standard way to build an interface on top of applications and data that users can then access through an LLM.
As excited as I am about MCP’s potential to transform how we use AI, it is fair to say we are still early in its enterprise adoption. Yes, some companies are already sharing how they are adopting it broadly (Bloomberg gave an excellent talk at the last MCP Dev Summit), but for many IT organizations it is only just arriving on their desks. I think this is going to change quickly: within the next year, most CIOs will be rushing to implement an enterprise MCP strategy and pushing their engineering orgs to deliver MCP servers to empower the business.
For enterprises, the implications are enormous. With the emergence of MCP servers, it’s no longer just technical teams who can build AI-powered tools. Non-technical users can now create powerful agents and workflows that interact with existing applications and data, all while using their own identity and access privileges. And because MCP servers are reusable building blocks, organizations can assemble them in novel ways to create custom tools that align with unique business needs.
This is why now is the time for every company to begin developing an enterprise MCP strategy—a plan for how to accelerate the impact of LLMs across the business while keeping governance and security in place.
1. Expanding Access to LLMs and MCP Servers
The first step in an enterprise MCP strategy is widening access. More employees should be able to use LLMs in combination with MCPs, not just developers or AI specialists.
The good news is that this is getting easier every day. Popular enterprise chat platforms and AI tools are adding support for MCP integration, including Microsoft Copilot, Anthropic’s Claude, Obot Chat, and Block’s Goose. This means end users can start experimenting with MCPs directly from the interfaces they already use. While ChatGPT hasn’t yet allowed users to bring their own MCP connections, it has adopted the standard and is incorporating it into a number of projects.
When employees have direct, safe access to MCP-powered capabilities, they’ll find creative new ways to solve problems and automate workflows—without needing to wait on IT or development teams. For IT teams, the time is now to ensure that users have AI chat tools with full support for MCP servers, including advanced capabilities such as elicitation, sampling, and MCP-UI.
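To make this concrete: in clients that support local MCP servers, connecting one is usually just a small configuration entry. A sketch of what this looks like in Claude Desktop’s `claude_desktop_config.json` (the server name and package here are purely illustrative, not a real published server):

```json
{
  "mcpServers": {
    "hr-tools": {
      "command": "npx",
      "args": ["-y", "@example/hr-mcp-server"]
    }
  }
}
```

Once the client restarts, the server’s tools appear directly in the chat interface, which is exactly the low-friction experience that lets non-developers start experimenting.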
2. Scaling the Creation of Business-Aligned MCPs
The next step is to think about MCP creation. Many teams’ initial approach was to build MCP servers on top of standard APIs in a 1:1 mapping: one MCP server per app. More and more, organizations are rethinking this and leaning toward MCP servers that align with key business functions, not apps.
Take HR as an example. Should an “HR MCP” just be a simple set of standard tools (like pulling employee data or automating onboarding tasks)? Or should it be a more advanced, layered server that connects to multiple systems—benefits, payroll, learning platforms—and exposes a rich set of actions employees can take?
The latter approach offers much more flexibility and power. By treating MCPs as modular, reusable building blocks, enterprises can design servers that become critical enablers for day-to-day work across departments. An MCP server can give the LLM detailed instructions about its capabilities and frame them around human intent, rather than simply mirroring API capabilities. These next-gen MCPs will feel more like agents in many ways, and less like API endpoints.
3. Planning for Large-Scale Management and Security
Finally, no enterprise MCP strategy is complete without a plan for security and scale. As usage grows, organizations will need to handle:
- Hosting and running MCP servers at scale
- Defining fine-grained access control policies
- Tracking usage, auditing interactions, and maintaining compliance
- Supporting employees who are experimenting and building with MCPs
Focusing on discovery and documentation will help employees understand what these new MCP servers can do. At the same time, policy and governance, along with a strong dose of centralized management, will allow this new layer of technology to be delivered reliably.
This is where we believe a gateway layer becomes essential.
Why We Built the Obot MCP Gateway
We built and open-sourced the Obot MCP Gateway to give organizations a secure and scalable way to manage the rapid growth of MCPs.
Obot provides IT teams with a control plane for onboarding MCP servers, defining access policies, and monitoring usage. It also gives employees a catalog where they can discover approved MCPs, learn about their capabilities, and connect them directly to their AI clients—or use them through a built-in chat interface.
By providing both governance for IT and enablement for users, Obot lays the groundwork for enterprises to adopt MCP as a standard while maintaining security and control.
We believe this is the foundation of a modern enterprise MCP strategy—one that empowers more users, unlocks business-aligned MCPs, and ensures enterprises can scale AI adoption safely.
Conclusion
MCP adoption in the enterprise is about to expand massively. Organizations that plan ahead will be able to unlock AI’s potential faster, while avoiding the pitfalls of unmanaged, fragmented adoption.
An enterprise MCP strategy will be critical to scaling AI responsibly. I think Obot can be an important, open-source tool to help organizations get there quickly.