Unlock the Power of MCP: Build Your Own AI-Powered Business
Unlock the power of MCP and build your own AI-powered business. Learn how MCP gives your AI agents a unified way to access external tools and data, and explore opportunities in the MCP ecosystem for startups and entrepreneurs.
April 22, 2025

Discover the power of MCP (Model Context Protocol) and how it can streamline your AI development workflow. In this blog post, you'll learn the easiest way to build your own MCP business and tap into the growing MCP ecosystem. You'll find step-by-step guidance for creating your own MCP server and distributing it to others, unlocking new possibilities for your AI applications.
What is MCP and How it Differs from Previous Solutions
The Opportunities in the MCP Ecosystem
How to Build Your Own MCP Server from Scratch
Distribute Your MCP to Others via Marketplaces
Conclusion
What is MCP and How it Differs from Previous Solutions
MCP (Model Context Protocol) is a unified way for AI agents and AI applications to access external tools and data. It provides a standard format for how AI models communicate with external systems, unlike previous solutions where each large language model provider had its own format.
The key difference is that MCP offers a standardized protocol, similar to how TCP/IP standardized communication across the early internet. Before TCP/IP, each company had its own proprietary protocol, making it difficult to build integrated solutions. MCP aims to do the same for the AI ecosystem, allowing AI agents to easily connect to various external services and capabilities.
This standardization lowers the barrier for building new AI agent clients, as they can simply connect to existing MCP servers rather than having to build custom integrations. It enables a more open and interoperable AI ecosystem, where different AI models and applications can seamlessly work together.
In contrast, previous solutions like OpenAI's function calling allowed AI agents to take actions in the real world, but each provider had its own format. MCP aims to unify this, providing a common language for AI agents to interact with external systems, similar to how internet protocol standards enabled the growth of the World Wide Web and internet-based businesses.
The Opportunities in the MCP Ecosystem
The MCP (Model Context Protocol) ecosystem has seen a significant explosion in the past few weeks, presenting exciting startup opportunities. While there are already numerous resources available to learn about what MCP is, this section will focus on the potential startup opportunities within the MCP ecosystem.
One key aspect of MCP is its ability to provide a unified way for AI agents or applications to access external tools and data. This is similar to the early days of the internet, where the introduction of standard protocols like TCP/IP enabled seamless communication across different devices and platforms. Similarly, MCP can serve as a unifying protocol for large language models to interact with external systems, lowering the barrier for building new AI agent clients.
This standardization opens up opportunities for companies to build AI agent clients, such as Cursor and Windsurf, which interact directly with end users and drive the adoption of various MCP servers. As more vertical AI agents emerge, such as sales, customer support, or general-purpose agents, the demand for diverse MCP servers will grow.
Additionally, there is an opportunity to create a marketplace for MCP servers, similar to app stores or early web directories. Platforms like Glama and Smithery are already helping users discover and curate available MCP servers, and there is potential for businesses to specialize in building and distributing MCP servers to various users.
Furthermore, companies can focus on enhancing the user experience of existing MCP clients, as demonstrated by 21st.dev's products that integrate with tools like Cursor and Windsurf, providing additional context and monetization opportunities for MCP servers.
In summary, the MCP ecosystem presents a range of startup opportunities, including building AI agent clients, creating MCP marketplaces, and enhancing the user experience of MCP-powered applications. As the ecosystem continues to evolve, there will be ample room for innovative companies to emerge and capitalize on the growing demand for unified AI-powered capabilities.
How to Build Your Own MCP Server from Scratch
Anthropic provides official SDKs for Python, TypeScript, and Java, so you can use the language you're most familiar with. In this example, we'll be using the Python SDK.
First, let's install the `mcp` package. The `cli` extra also installs the `mcp` command-line tool we'll use to run the server:

```bash
pip install "mcp[cli]"
```

Next, we'll create a new file called `test_mcp.py` and instantiate the `FastMCP` class to initialize a new MCP server. We can then use the `@mcp.tool()` decorator to register a function, with a docstring describing what the function does as well as the expected inputs and outputs.
```python
from mcp.server.fastmcp import FastMCP

# Initialize the MCP server; the name here is arbitrary.
mcp = FastMCP("bmi-calculator")


@mcp.tool()
def calculate_bmi(weight: float, height: float) -> float:
    """
    Calculates the Body Mass Index (BMI) given the weight and height.

    Args:
        weight (float): The weight in kilograms.
        height (float): The height in centimeters.

    Returns:
        float: The calculated BMI.
    """
    # Convert height from centimeters to meters, then apply BMI = kg / m^2.
    bmi = weight / (height / 100) ** 2
    return bmi
```
Now, we can save the file and run the MCP server:

```bash
mcp run test_mcp.py
```
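If you want to exercise the tool interactively before wiring it into an editor, the SDK's CLI also includes a dev mode that launches the MCP Inspector (this assumes you installed the `cli` extra as shown above):

```bash
mcp dev test_mcp.py
```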
In the Cursor app, you can now add this MCP server by right-clicking the file, copying the path, and pasting it into the MCP settings. Once the server is running, you can use the `calculate_bmi` function by prompting the Cursor agent.
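Cursor can also pick up MCP servers from a JSON config file. At the time of writing, a project-level `.cursor/mcp.json` along these lines works, though the exact schema and location may vary between versions; the server name and file path below are placeholders:

```json
{
  "mcpServers": {
    "bmi-calculator": {
      "command": "mcp",
      "args": ["run", "/absolute/path/to/test_mcp.py"]
    }
  }
}
```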
To build a more complex MCP, let's create a Figma MCP that can read your Figma file and convert the design into an HTML/React page. We'll use the Figma API to fetch the file and node data.
First, let's create a new function to fetch Figma nodes:
```python
import os

import requests
from mcp.server.fastmcp import FastMCP

# Initialize the Figma MCP server.
mcp = FastMCP("figma")


def fetch_figma_nodes(file_key: str, node_id: str) -> dict:
    """
    Fetches the node data from a Figma file.

    Args:
        file_key (str): The Figma file key.
        node_id (str): The ID of the node to fetch.

    Returns:
        dict: The node data.
    """
    # Read the personal access token from the environment so it is never
    # hard-coded in the server.
    figma_token = os.environ.get("FIGMA_API_TOKEN")
    headers = {"X-Figma-Token": figma_token}
    url = f"https://api.figma.com/v1/files/{file_key}/nodes?ids={node_id}"
    response = requests.get(url, headers=headers)
    response.raise_for_status()
    return response.json()["nodes"][node_id]


@mcp.tool()
def get_node(file_key: str, node_id: str) -> dict:
    """
    Retrieves a specific node from a Figma file.

    Args:
        file_key (str): The Figma file key.
        node_id (str): The ID of the node to retrieve.

    Returns:
        dict: The node data.
    """
    return fetch_figma_nodes(file_key, node_id)
```
Now, we can add this `get_node` function to our MCP server and test it out in the Cursor app. Once it's working, we can continue to build out the Figma MCP by adding more functionality, such as extracting components and handling the Figma prototype workflow.
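The raw node JSON that Figma returns is verbose, and large responses eat into the model's context window, so a natural next step is a tool that trims the response down to the fields the model actually needs before returning it. Below is a minimal sketch of that idea; it reuses the `mcp` instance and `fetch_figma_nodes` helper from above, and exactly which keys to keep (these are standard Figma node properties) is a design choice rather than a fixed API:

```python
# Keys worth keeping for HTML/React generation; adjust to taste.
KEEP_KEYS = {"name", "type", "absoluteBoundingBox", "fills", "characters"}


def simplify_node(node: dict) -> dict:
    """Recursively strip a Figma node down to the fields useful for codegen."""
    slim = {key: node[key] for key in KEEP_KEYS if key in node}
    if "children" in node:
        slim["children"] = [simplify_node(child) for child in node["children"]]
    return slim


@mcp.tool()
def get_clean_node(file_key: str, node_id: str) -> dict:
    """Like get_node, but returns a trimmed-down version of the node tree."""
    entry = fetch_figma_nodes(file_key, node_id)
    # The nodes endpoint nests the actual node tree under a "document" key.
    return simplify_node(entry.get("document", entry))
```

This kind of post-processing is what the conclusion refers to as cleaning and optimizing the response, and it tends to make the output much easier for the model to work with.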
Finally, to distribute your MCP server, you can create a README and dependencies file, then upload the project to GitHub. Platforms like Glama and Smithery provide repositories for you to publish your MCP, making it easy for others to discover and use your server.
Distribute Your MCP to Others via Marketplaces
To distribute your MCP server to others, you can leverage platforms like Glama or Smithery that provide existing repositories of MCPs. Here's how you can do it:
- Write Documentation: Create a detailed README.md file that includes all the dependencies and instructions on how to run your MCP server (a minimal example follows this list). This will help others easily set up and use your MCP.
- Upload to GitHub: Create a new GitHub repository for your MCP project and upload all the necessary files, including the README.md.
- Submit to Marketplaces: Visit platforms like Glama or Smithery and click the "Add Server" button. Provide the required information, such as the name, description, and the link to your GitHub repository.
- Wait for Review: The marketplace will review your submission to ensure it meets its guidelines. Once approved, your MCP will be available for others to discover and install.
- Continuous Improvement: After your MCP is live on the marketplace, you can continue to improve it and add new functionality. Update the documentation and the marketplace listing accordingly.
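There is no single required README format, but something along these lines covers what marketplaces and users typically look for: what the server does, what it depends on, and how to run it. The project and file names here are just placeholders:

```markdown
# figma-mcp

An MCP server that reads Figma files and returns node data for code generation.

## Requirements

- Python 3.10+
- `mcp[cli]` and `requests` (see requirements.txt)
- A Figma personal access token exported as `FIGMA_API_TOKEN`

## Run

    mcp run figma_mcp.py
```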
By distributing your MCP through these marketplaces, you can make it easily accessible to a wider audience, including AI developers and enthusiasts who are looking for new and innovative MCPs to enhance their workflows.
Conclusion
In conclusion, building your own MCP (Model Context Protocol) server from scratch is a straightforward process, thanks to the tools and SDKs provided by Anthropic. By leveraging these resources, you can create custom MCP servers that enhance your AI workflows, or even build a business around distributing your MCP to others.
The key steps covered in this guide include:
- Set up your MCP server: Using the official MCP Python SDK, you can quickly create a new server with custom tool functions and descriptions.
- Integrate with external services: For example, the guide demonstrated how to connect to the Figma API to retrieve and process design data.
- Clean and optimize the response: Refining the MCP output to be more concise and easier for large language models to understand.
- Distribute your MCP: Platforms like Glama and Smithery provide repositories where you can share your MCP with the community.
Remember, the success of your MCP-based product or service often comes down to having a solid go-to-market strategy. The free playbook mentioned in the guide can be a valuable resource to help you plan and execute a successful launch.
If you have any questions or want to continue exploring the world of MCP and AI coding, feel free to reach out to the author or join the AI Builder Club community for support and collaboration.