Integrating Third Party Agents with ChatGPT Enterprise

Today's LLM systems let users go beyond simply asking a model a question. With ChatGPT Enterprise, company employees can gain access to advanced tooling (e.g. security controls or internal knowledge bases) and integrate these features into their ChatGPT workflow. These extra tools and functions are made possible by integrating third-party agents, like Credal.

This article walks through how third-party agent integration works in ChatGPT Enterprise and how you can integrate Credal to improve the knowledge and functionality of your chat environment.

What is ChatGPT Enterprise?

ChatGPT Enterprise is a version of ChatGPT designed specifically to meet company needs. It includes features like advanced OpenAI models, admin controls (e.g. SSO/SCIM and role-based access), and stronger compliance options (e.g. data residency and enterprise key management).

Beyond better security and access to models, ChatGPT Enterprise allows integration with internal knowledge sources, like documentation and applications. When employees ask questions in ChatGPT, they can receive answers that are specific to the company, not just answers using the open web.

For example, an enterprise admin can integrate with services like SharePoint or GitHub so that employees can reference those sources directly within ChatGPT, while still respecting existing visibility permissions.

What are Connectors in ChatGPT?

In ChatGPT, a connector is an integration that lets the system securely access a third-party application or data source, like Google Drive or GitHub. Using a connector, ChatGPT can search relevant files and reference content from those third-party sources inside the chat interface.

What are custom connectors?

A custom connector uses the Model Context Protocol (MCP), which defines how a remote “tool” (an API or service) can talk to ChatGPT. With an MCP connector you define endpoints, authentication, schemas, and instructions so ChatGPT can use your tool easily.

Custom connectors are useful when your systems are not covered by OpenAI’s out-of-the-box integrations. You might want ChatGPT to trigger actions in your CRM, search through a company-only slide deck, or understand your internal workflow engine. Custom connectors give you flexibility: you can customize ChatGPT to fit your exact needs, moving well beyond the typical “question + answer” setup.
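To make this concrete, here is a rough sketch (in Python, with a hypothetical tool name and fields) of the kind of tool description an MCP server advertises: each tool carries a name, a description, and a JSON Schema for its inputs, which is what lets ChatGPT call it reliably.

# A hypothetical tool description, roughly as an MCP server would advertise it.
# The tool name and fields are illustrative, not a real Credal or OpenAI schema.
search_slides_tool = {
    "name": "search_slides",
    "description": "Search the company's internal slide decks for relevant content.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "query": {"type": "string", "description": "What to search for."},
            "max_results": {"type": "integer", "default": 5},
        },
        "required": ["query"],
    },
}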

Why Integrate External Agents?

While ChatGPT Enterprise is excellent at reasoning and using company knowledge, many organizations require more than that:

  • Multi-agent workflows: companies might want specialized agents to collaborate (e.g., sales agent, compliance agent, HR agent).
  • Shared memory: shared context across tools and systems means that workflows are repeatable and auditable.
  • Deployment governance: companies may require guardrails, audit logs, and consistent behavior across tools and users.
  • Avoiding vendor lock-in: Many users want to choose different LLMs or agents for different tasks.

Credal is an agent orchestration platform that makes it easy to build secure, scalable, and multi-agent workflows for enterprise needs. A platform like Credal can sit behind ChatGPT as the orchestration layer — ChatGPT is the interface but Credal manages security, workflows, and integrations.

How To Integrate Credal with ChatGPT Enterprise

Today, the easiest way to connect Credal to ChatGPT Enterprise is to expose Credal’s API through a lightweight MCP server that you host, then register that server as a custom connector. This gives ChatGPT a structured way to call a Credal agent from inside any enterprise chat. The entire setup only takes a few steps!

Step 1 — Enable API access for a Credal agent

In your Credal dashboard, open the agent you want to connect and enable Deploy over API. Copy the agent’s API base URL, agent ID, and API key. These values allow your MCP server to forward instructions to the Credal agent.
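If you want to confirm these values before building anything, a short script like the sketch below can call the agent directly. It reuses the invoke route from Step 2; treat the exact path and payload as assumptions and confirm them against Credal's API reference.

# check_credal.py: quick sanity check that the API key and agent ID work.
# The /agents/{id}/invoke route and {"message": ...} payload are assumed here.
import os

import requests

BASE = os.environ["CREDAL_API_BASE"]
AGENT = os.environ["CREDAL_AGENT_ID"]
KEY = os.environ["CREDAL_API_KEY"]

r = requests.post(
    f"{BASE}/agents/{AGENT}/invoke",
    headers={"Authorization": f"Bearer {KEY}"},
    json={"message": "Reply with a short hello so I know the connection works."},
    timeout=60,
)
r.raise_for_status()
print(r.json())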

Step 2 — Create an MCP server that forwards requests to Credal

Deploy a small server implementing an MCP tool that takes an instruction, sends it to the Credal API, and returns the result. Here is a minimal example in Python, using the FastMCP helper from the official MCP Python SDK:

# server.py
import os

import requests
from mcp.server.fastmcp import FastMCP

BASE = os.environ["CREDAL_API_BASE"]
AGENT = os.environ["CREDAL_AGENT_ID"]
KEY = os.environ["CREDAL_API_KEY"]

# FastMCP serves the SSE endpoint at /sse and messages at /messages by default.
mcp = FastMCP("credal-mcp", host="0.0.0.0", port=8080)

@mcp.tool()
def credal_run_agent(instruction: str) -> str:
    """Send an instruction to a Credal agent and return its response."""
    # Adjust the path and payload to match your Credal API version.
    r = requests.post(
        f"{BASE}/agents/{AGENT}/invoke",
        headers={"Authorization": f"Bearer {KEY}"},
        json={"message": instruction},
        timeout=60,
    )
    r.raise_for_status()
    data = r.json()
    # Fall back to the raw response body if there is no "output" field.
    return data.get("output") or r.text

if __name__ == "__main__":
    # Serve over SSE so ChatGPT can reach this as a remote MCP server.
    mcp.run(transport="sse")

After defining your server, set the following environment variables and run:

export CREDAL_API_BASE="https://api.credal.ai"
export CREDAL_AGENT_ID="your-agent-id"
export CREDAL_API_KEY="your-api-key"

python server.py


Deploy the server somewhere reachable via HTTPS, for example https://credal-mcp.company.com.
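Optionally, you can verify the deployment before registering it in ChatGPT by connecting with the MCP Python SDK's SSE client. This is a sketch; the server URL and instruction are placeholders for your own values.

# verify_mcp.py: connect to the deployed MCP server, list tools, and call the Credal tool once.
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main():
    # Placeholder URL: use wherever you deployed the server.
    async with sse_client("https://credal-mcp.company.com/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("tools:", [t.name for t in tools.tools])
            result = await session.call_tool(
                "credal_run_agent",
                {"instruction": "Give me a one-line status check."},
            )
            print(result.content)

asyncio.run(main())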

Step 3 — Register the MCP server as a Custom Connector

In ChatGPT Enterprise, go to Settings → Connectors → Add Custom Connector → Remote MCP Server.

Enter the following:

  • Name: Credal MCP
  • Server URL: https://credal-mcp.company.com/sse
  • Message URL: https://credal-mcp.company.com/messages
  • Auth: whatever header or token your MCP server requires

Save and run the built-in test to confirm the connection.

Step 4 — Use Credal in ChatGPT Enterprise

In a new chat, enable the Credal MCP connector in the side panel. When a user gives an instruction that should be handled by Credal, ChatGPT will route the request through the MCP server, call the Credal agent, and return the result in the conversation.

This approach keeps ChatGPT as the interface your employees use, while Credal handles the orchestration, memory, and multi-agent workflows behind the scenes.

Closing thoughts

Credal provides the orchestration, governance, and multi-agent capabilities that many enterprises need, while ChatGPT Enterprise offers a familiar, user-facing interface. Together, they enable agents to execute structured workflows with governance and permission controls, all triggered directly from a chat. This integration creates a single entry point for enterprise AI while preserving flexibility, security, performance, and scale.
