How to Use OpenAI MCP Integration for Building Agents?

Pankaj Singh | Last Updated: 27 Mar, 2025
5 min read

To improve AI interoperability, OpenAI has announced its support for Anthropic’s Model Context Protocol (MCP), an open-source standard designed to streamline the integration between AI assistants and various data systems. This collaboration marks a pivotal step in creating a unified framework for AI applications to access and utilize external data sources effectively.

Understanding the Model Context Protocol (MCP)

Developed by Anthropic, MCP is an open standard that facilitates seamless connections between AI models and external data repositories, business tools, and development environments. By providing a standardized protocol, MCP eliminates the need for custom integrations, allowing AI systems to access necessary context dynamically. This approach enhances the relevance and accuracy of AI-generated responses by enabling real-time data retrieval and interaction.

Key Features of MCP

  • Universal Compatibility: MCP serves as a “USB-C port for AI applications,” offering a standardized method for connecting AI models to diverse data sources.
  • Two-Way Communication: The protocol supports secure, bidirectional interactions between AI applications (MCP clients) and data sources (MCP servers), facilitating dynamic data exchange.
  • Open-Source Ecosystem: MCP is open-source, encouraging community collaboration and the development of a broad range of integrations and tools.
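The bidirectional exchange is plain JSON-RPC 2.0 carried over a transport such as stdio. As a rough sketch (field names follow the MCP specification but are trimmed; none of this code appears in the SDK example later in this article), the client opens a session with an `initialize` request and can then ask the server which tools it exposes:

```python
import json

# Simplified sketch of the first two client -> server messages in an MCP
# session. Fields are trimmed; see the MCP specification for the full shapes.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",  # an example spec revision
        "clientInfo": {"name": "demo-client", "version": "0.1"},
        "capabilities": {},
    },
}

# Once initialized, the client can ask the server to enumerate its tools.
list_tools = {"jsonrpc": "2.0", "id": 2, "method": "tools/list"}

for msg in (initialize, list_tools):
    print(json.dumps(msg))
```

The server replies on the same channel with its capabilities and tool descriptions, which is what makes the communication two-way rather than a one-shot prompt.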
[Image: OpenAI MCP (Source: Microsoft)]

What is MCP?

Here’s a simpler, easy-to-understand picture of MCP:

If you’re building with an AI model, you’ve probably run into this:

  • You wire your LLM up to one data source — everything works great.
  • Then your team asks, “Can we also connect the Git repo, the database, maybe Slack too?”

Now things get messy.

  • Every tool and data source has a different API
  • You’re writing custom connector code for each one
  • Each integration describes its capabilities differently
  • Swapping one tool for another breaks everything

It’s frustrating and takes way too much time.

That’s where MCP (Model Context Protocol) comes in:

Without MCP

  • Each data source needs its own bespoke connector (for instance, Git, a database, Slack)
  • Tool definitions and responses aren’t consistent
  • Adding or swapping an integration means changing your code again and again

With MCP

  • One standard protocol for all tools and data sources
  • Servers describe their tools in a uniform format
  • Responses come back in the same shape
  • Swap servers without touching your agent code
  • Add new integrations easily in the future

MCP saves you time, simplifies your code, and makes tool-rich AI agents way easier to build.
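To make the “one standard protocol” point concrete, here is a minimal sketch of how every MCP tool invocation uses the same `tools/call` JSON-RPC shape, regardless of which server is on the other end (the tool names and arguments below are made up for illustration):

```python
import json

def make_tool_call(request_id, tool_name, arguments):
    """Build an MCP `tools/call` JSON-RPC request. The same shape works for
    every MCP server - a Git server, a database server, a Slack server."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# The same helper addresses two very different servers identically
# (tool names here are illustrative, not from any particular server):
print(json.dumps(make_tool_call(1, "git_log", {"repo_path": "/tmp/repo"})))
print(json.dumps(make_tool_call(2, "query_db", {"sql": "SELECT 1"})))
```

Because the envelope never changes, adding a new integration means starting a new server process, not rewriting your client code.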


OpenAI’s Integration of MCP

OpenAI’s decision to adopt MCP underscores its commitment to enhancing the functionality and interoperability of its AI products. CEO Sam Altman highlighted the enthusiasm for MCP, stating that support is being integrated across OpenAI’s offerings. The integration is already available in the Agents SDK, with forthcoming support planned for the ChatGPT desktop app and the Responses API.

Implications for OpenAI Products

  • Enhanced Data Access: By leveraging MCP, OpenAI’s AI models can access a wider array of data sources, leading to more informed and contextually relevant responses.
  • Simplified Integrations: Developers can utilize MCP to connect OpenAI’s AI systems with various tools and datasets without the need for bespoke connectors, streamlining the development process.
  • Community Collaboration: OpenAI’s support for an open standard like MCP fosters a collaborative environment, encouraging innovation and shared advancements within the AI community.

Industry Adoption and Future Prospects

Since its inception, MCP has garnered support from various organizations. Companies such as Block, Apollo, Replit, Codeium, and Sourcegraph have integrated MCP into their platforms, recognizing its potential to standardize AI-data interactions.

The adoption of MCP by industry leaders like OpenAI and Microsoft signifies a broader trend towards standardization in AI integrations. As more organizations embrace MCP, the ecosystem is expected to evolve, offering developers a robust framework for building AI applications that can seamlessly interact with diverse data sources.

Implementing MCP to Get Information About a Git Repository

Here’s how you can use MCP:

First, open the OpenAI Agents SDK documentation and find its Model Context Protocol (MCP) section.

MCP is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications. Just as USB-C provides a standardized way to connect your devices to various peripherals and accessories, MCP provides a standardized way to connect AI models to different data sources and tools.

Let’s begin with the implementation:

In this example, we’ll ask questions about the LangManus repository, so clone it on your system and keep its path handy.

Next, clone the Agents SDK repository, openai-agents-python:

git clone https://github.com/openai/openai-agents-python.git

Then, set your OpenAI API key:

export OPENAI_API_KEY=sk-xxxxxx

After this, go to the openai-agents-python directory

cd openai-agents-python/

Then run this command:

uv run python examples/mcp/git_example/main.py

Finally, enter the repository path when prompted:

Please enter the path to the git repository: /home/pankaj/langmanus

Output

The most frequent contributor is **Henry Li**, with multiple commits in the
history provided.

--------------------------------------------
Running: Summarize the last change in the repository.
The last change in the repository was made by MSc. João Gabriel Lima on March
23, 2025. The commit hash is `646c3e06c4bd58e252967c8b1065c7a0b0f0309b`.

### Commit Message
- **Type:** feat
- **Summary:** ChatLiteLLMV2 missing function (#103)

#### Details:
- Added parameter filtering and supported parameters methods in
ChatLiteLLMV2.
- This change was repeated several times in the commit message details,
highlighting its importance.

Here’s the main.py:

import asyncio
import shutil

from agents import Agent, Runner, trace
from agents.mcp import MCPServer, MCPServerStdio


async def run(mcp_server: MCPServer, directory_path: str):
    agent = Agent(
        name="Assistant",
        instructions=f"Answer questions about the git repository at {directory_path}, use that for repo_path",
        mcp_servers=[mcp_server],
    )

    message = "Who's the most frequent contributor?"
    print("\n" + "-" * 40)
    print(f"Running: {message}")
    result = await Runner.run(starting_agent=agent, input=message)
    print(result.final_output)

    message = "Summarize the last change in the repository."
    print("\n" + "-" * 40)
    print(f"Running: {message}")
    result = await Runner.run(starting_agent=agent, input=message)
    print(result.final_output)


async def main():
    # Ask the user for the directory path
    directory_path = input("Please enter the path to the git repository: ")

    async with MCPServerStdio(
        cache_tools_list=True,  # Cache the tools list, for demonstration
        params={"command": "uvx", "args": ["mcp-server-git"]},
    ) as server:
        with trace(workflow_name="MCP Git Example"):
            await run(server, directory_path)


if __name__ == "__main__":
    if not shutil.which("uvx"):
        raise RuntimeError("uvx is not installed. Please install it with `pip install uvx`.")

    asyncio.run(main())
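The `cache_tools_list=True` flag in `main()` tells the SDK to fetch the server’s tool list once and reuse it, instead of issuing a `tools/list` round-trip on every agent run. The idea behind the flag can be sketched in plain Python (a toy stand-in, not the SDK’s actual implementation):

```python
import itertools

class ToolListCache:
    """Toy illustration of what cache_tools_list=True buys you: the
    (potentially slow) tools/list round-trip happens only once."""

    def __init__(self, fetch):
        self._fetch = fetch   # callable standing in for the server round-trip
        self._cached = None

    def list_tools(self):
        if self._cached is None:          # first call: hit the "server"
            self._cached = self._fetch()
        return self._cached               # later calls: reuse the result

# Count how many times the fake round-trip actually runs.
calls = itertools.count()
cache = ToolListCache(lambda: (next(calls), ["git_log", "git_status"]))

print(cache.list_tools())
print(cache.list_tools())  # same result, no second fetch
```

The trade-off is the usual one for caching: if the server’s tool set can change mid-session, the cached list goes stale, which is why the SDK makes the behavior opt-in per server.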


Conclusion

OpenAI’s adoption of Anthropic’s Model Context Protocol represents a significant advancement in the quest for standardized, efficient, and secure AI-data integrations. By embracing MCP, OpenAI not only enhances the capabilities of its own AI systems but also contributes to the broader movement towards collaborative innovation in the AI industry. As MCP continues to gain traction, it promises to simplify the development of context-aware AI applications, ultimately leading to more intelligent and responsive AI assistants.
