Your Guide to Custom MCP Catalogs and Profiles for Enterprise AI Tooling
Model Context Protocol (MCP) servers are the backbone of modern AI tooling, enabling software agents to interact with data and services. As organizations scale their MCP adoption, managing and distributing these servers becomes a critical challenge. That's where Custom Catalogs and MCP Profiles come into play. These two capabilities transform how teams package, distribute, and maintain their AI tool stack. This guide answers the most common questions about these features and walks through practical implementation.
What are Custom MCP Catalogs and why do enterprises need them?
Custom MCP Catalogs are curated collections of MCP servers that organizations can create and distribute internally. Instead of each developer hunting for servers across the open internet, enterprises can centrally manage a trusted list of approved servers—including internally built ones. This solves a core pain point: ensuring security, compliance, and consistency across teams. With a custom catalog, administrators can enforce that only vetted servers are used, reducing risk while empowering developers to quickly discover and deploy the right tools. Catalogs can reference servers from Docker's MCP Catalog, community sources, and proprietary servers, all in one place.

How do MCP Profiles complement Custom Catalogs?
While Custom Catalogs handle discovery and approval at the organization level, MCP Profiles focus on portability and configurability at the developer level. A Profile is a named, portable grouping of MCP servers: developers define which servers they need for a specific project, run them with a single command, and share that configuration with teammates across projects and teams. Together, Catalogs provide a central source of truth, while Profiles give developers the flexibility to compose the exact set of tools they need.
How can you create a custom MCP catalog using Docker?
Creating a custom catalog involves assembling a metadata file that describes the MCP servers you want to include. For example, you can reference servers from Docker's public catalog, add servers from community sources, or incorporate your own custom MCP servers built internally. The catalog is defined in a simple YAML format. Once created, you can distribute the catalog file to your team, who can then import it into Docker Desktop or use it via the CLI. Docker Desktop provides a user-friendly interface for importing catalogs, while the CLI supports more advanced automation for DevOps pipelines.
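To make the YAML format concrete, here is a sketch of what a small custom catalog file might look like. The field names (name, title, type, image, description) follow those described later in this guide; the overall layout mirrors the shape of entries in Docker's public MCP catalog, but the exact schema may vary by Toolkit version, and `myorg/mcp-dice` is a hypothetical image.

```yaml
# my-team-catalog.yaml — illustrative custom catalog
name: my-team-catalog
displayName: My Team Catalog
registry:
  mcp-dice:
    title: Dice Roller
    type: server
    description: Internally built MCP server that rolls dice for demos.
    image: myorg/mcp-dice:latest
```

Once written, the file can be shared like any other config artifact; recent Docker MCP Toolkit releases expose catalog management under the `docker mcp catalog` CLI subcommand, though exact flags differ between versions, so check the documentation for your installation.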
What are the steps to build and share a custom MCP catalog?
Building and sharing a custom MCP catalog involves four main steps:
- Create your MCP server – Build a standard MCP server (e.g., a dice-rolling server) and push its Docker image to a registry like Docker Hub.
- Define metadata – Write a YAML file (mcp-dice.yaml) that describes the server, including its name, title, type, image location, and description.
- Assemble the catalog – Create a catalog YAML file that combines your custom server with servers from an existing catalog (like Docker's MCP Catalog).
- Distribute and import – Share the catalog file with your team. Users can import it into Docker Desktop or use CLI commands to load the catalog and start using the included servers.
This approach gives you full control over which servers are trusted and available.
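As a concrete illustration of the first step, here is a minimal Python sketch of the kind of tool logic a dice-rolling MCP server might expose. The function name and signature are illustrative; the MCP server wiring, protocol handling, and Dockerfile are omitted.

```python
import random


def roll_dice(count: int = 1, sides: int = 6) -> list[int]:
    """Roll `count` dice with `sides` faces each.

    This is the small, self-contained piece of logic an MCP tool
    handler would wrap and expose to an AI agent.
    """
    if count < 1 or sides < 2:
        raise ValueError("count must be >= 1 and sides must be >= 2")
    return [random.randint(1, sides) for _ in range(count)]
```

Keeping the tool logic separate from the protocol layer like this makes it easy to unit-test the server before packaging it into an image and publishing it to a registry.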

How do MCP Profiles help developers manage configurations?
MCP Profiles simplify tool management by allowing developers to define reusable, named groupings of MCP servers. Instead of manually configuring which servers to run for each task, you can create a Profile like 'data-analysis' that bundles servers for databases, file access, and APIs. Profiles can be versioned and shared via version control, ensuring everyone on the team uses the same setup. They also eliminate the need for repetitive environment setup—just run the profile and all required servers launch. Profiles make it easy to switch between different tool stacks for different projects or stages of development.
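A profile definition for the 'data-analysis' example above could look something like the following. The format is illustrative only, not the exact schema Docker's Toolkit uses, but it captures the idea of a named, version-controllable grouping of servers:

```yaml
# data-analysis.yaml — hypothetical profile definition
name: data-analysis
servers:
  - postgres      # database access
  - filesystem    # local file access
  - fetch         # HTTP/API access
```

Because the file is plain YAML, it can live alongside the project in version control, so checking out a branch also pins the tool stack that branch expects.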
What practical use cases do Profiles solve?
Profiles address several real-world scenarios:
- Project isolation – Each project can have its own Profile, preventing conflicts between different versions of MCP servers.
- Onboarding new team members – New hires can quickly get up and running by importing the team's standard Profiles.
- Testing and CI/CD – Profiles can be used in automated pipelines to ensure consistent environments for testing MCP integrations.
- Sharing demos – Developers can package a Profile with a demo project so others can reproduce the exact setup.
By standardizing configurations, Profiles reduce errors and accelerate development workflows.
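For the CI/CD case, a pipeline step might import the team catalog before running integration tests against the configured servers. The snippet below uses GitHub Actions syntax purely for illustration; the `docker mcp catalog import` invocation reflects the Toolkit CLI as described above but should be verified against your installed version, and the test script path is hypothetical.

```yaml
# .github/workflows/mcp-tests.yaml (illustrative)
jobs:
  mcp-integration:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Import team catalog
        run: docker mcp catalog import ./catalogs/team-catalog.yaml
      - name: Run integration tests against the MCP servers
        run: ./scripts/run-mcp-tests.sh   # hypothetical test script
```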
How do Custom Catalogs and Profiles advance enterprise MCP adoption?
Together, these features bridge the gap between organizational governance and developer agility. Custom Catalogs let security and platform teams curate a trusted library of MCP servers, ensuring compliance without stifling innovation. Profiles empower individual developers to compose and share exactly the tools they need for each task. This dual approach accelerates MCP adoption because it removes friction: teams don't have to reinvent the wheel every time they need a new server configuration, and they can rely on a centrally managed, secure catalog. The result is a scalable, enterprise-ready foundation for AI tooling.