ChatGPT Connectors: The Emergence of a New Interface Layer

ChatGPT just got a new superpower: connectors. Real-time access to tools like Drive, HubSpot, and Notion, without uploading a thing.

Connecting different worlds

The recent launch of Connectors in ChatGPT Business and Enterprise plans marks a pivotal shift in how we interact with large data stores.

With a quiet rollout that deserves more attention, OpenAI is redefining what it means to “chat with your data”—this time, not just through uploads or memory, but via direct integrations into tools like Google Drive, OneDrive, SharePoint, Slack, Salesforce, Notion, and more.

It’s early days, and the dust hasn’t settled. But already, the implications are striking.

What Are ChatGPT Connectors?

Connectors act as live data bridges between ChatGPT and third-party sources. If you’re used to tools like NotebookLM, or if you’ve worked with custom RAG (retrieval-augmented generation) setups via APIs or plugins, this feels like a native, out-of-the-box version of that concept—MCP-like pipelines for mainstream users.
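To make the comparison concrete, here is a minimal sketch of the manual retrieve-then-prompt pattern that RAG setups use and that connectors now automate behind the scenes. Everything here is illustrative: the naive keyword scorer, the tiny in-memory corpus, and the prompt format are stand-ins, not any real connector internals.

```python
# A minimal RAG-style loop: rank documents against the query, stuff the
# best matches into the prompt, then ask the question. Connectors do the
# retrieval step live against Drive, Notion, etc.; here it is simulated.

def score(query: str, doc: str) -> int:
    """Naive relevance: count query words that appear in the document."""
    return sum(word in doc.lower() for word in query.lower().split())

def retrieve(query: str, corpus: dict[str, str], k: int = 2) -> list[str]:
    """Return the names of the k most relevant documents for the query."""
    ranked = sorted(corpus, key=lambda name: score(query, corpus[name]), reverse=True)
    return ranked[:k]

def build_prompt(query: str, corpus: dict[str, str]) -> str:
    """Place retrieved passages above the question, RAG-style."""
    context = "\n".join(corpus[name] for name in retrieve(query, corpus))
    return f"Context:\n{context}\n\nQuestion: {query}"

corpus = {
    "q3-plan.txt": "Q3 roadmap: launch the billing revamp in September.",
    "hiring.txt": "Hiring plan: two backend engineers in Q4.",
}
prompt = build_prompt("When does the billing revamp launch?", corpus)
```

The point of the sketch is the shape of the pipeline, not the scoring: a connector replaces `retrieve` with a live, permissioned query into your actual data source.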

Think of connectors as permission-based extensions. Once enabled (on business-tier accounts), ChatGPT gains real-time query access to selected data silos—no need to upload, no need to preprocess. From a UX standpoint, it feels like talking to a hyperintelligent assistant who can, say, read all your Notion docs, access your latest project folders on Drive, or summarise your last five HubSpot deals.
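The "permission-based extension" idea can be sketched as a simple gate in front of every data fetch: the assistant can only touch sources the user has explicitly enabled. The connector names, the allowlist, and the stubbed fetch below are all hypothetical; this is the access model, not a real API.

```python
# Hypothetical sketch of permission-gated connector access. A query is
# only forwarded to a source the user has enabled; anything else fails
# loudly instead of silently reaching into the user's data.

ENABLED_CONNECTORS = {"google_drive", "notion"}  # sources the user granted

def query_connector(source: str, query: str) -> str:
    """Forward a query to an enabled source, or refuse."""
    if source not in ENABLED_CONNECTORS:
        raise PermissionError(f"Connector '{source}' is not enabled.")
    # A real implementation would call the provider's API with the
    # user's OAuth credentials; here the live fetch is stubbed.
    return f"[live results from {source} for: {query}]"

print(query_connector("notion", "latest project brief"))
# query_connector("salesforce", "open deals") would raise PermissionError
```

Fail-closed gating like this is what makes "real-time query access" tolerable: nothing flows in from a silo the user never opted into.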

Connector to Google Calendar.

What’s Actually New Here?

On the surface: seamless access. Under the hood: possibly RAG, layered access control, and a completely different mental model. ChatGPT moves from being a discrete, self-contained assistant to a kind of context-aware gateway—sensitive to documents, schedules, and histories it couldn’t see before.

The shift is conceptual as well as technical. Where we once prompted ChatGPT with all we could fit in a single session (or document upload), we’re now stepping into a space where the model pulls from dynamic, evolving knowledge bases.

⚠️
Connectors in ChatGPT are available to users on Plus, Pro, Team, Edu, and Enterprise plans, but regional restrictions apply. In the EEA, Switzerland, and the UK, most connectors (including Dropbox, Box, OneDrive, and SharePoint) are not currently available on the Plus and Pro plans, even where they are listed as supported; only Team, Edu, and Enterprise users in those regions have access. GitHub is the exception and is available to all users globally, including in the restricted regions.

Opportunities—and Overwhelm

The ability to perform deep, unstructured research across your cloud environments is incredible. But it’s also disorienting.

“I always saw ChatGPT as a standalone tool. That’s what I liked about it. Now suddenly, stuff flows in left and right, it seems. That’s the thing I need to get my head around.”

This feeling is real, especially for users who appreciate clarity and modularity. With connectors, the scope of the assistant expands beyond what’s visible in a chat window. There’s power in that—but also opacity. Where’s the boundary? What’s cached, indexed, live? These questions aren’t yet answered clearly in OpenAI’s UX or documentation.

ChatGPT Connector working with DeepResearch and Google Drive

Use Cases Emerging

The most immediate value comes from:

  • Large-scale document querying – Instead of uploading PDFs or using external tools like NotebookLM, you can now ask questions across your entire Drive or SharePoint.
  • Sales & CRM automation – Imagine surfacing customer insights in natural language from within your HubSpot or Salesforce accounts.
  • Meeting & knowledge summarisation – Using connectors to pull context from Notion, Slack, or Google Calendar to generate summaries, next steps, or briefs.

In short: Connectors blur the line between “AI assistant” and “business operating system.”

A New Interface Layer

It’s worth stating: this isn’t just a new feature. It’s the emergence of a new interface layer—one that makes our cloud-based lives machine-readable, conversational, and dynamically retrievable.

The implications are architectural. This could rival what Microsoft is doing with Copilot and Graph-based integrations. The key difference? ChatGPT connectors don’t tie you to a specific productivity stack. They’re intentionally broad and flexible.

What Comes Next?

OpenAI will likely expand connectors beyond the current business-only boundary. A natural evolution would include:

  • Connector chaining (e.g. “Fetch customer info from HubSpot, cross-check calendar availability, then draft an email”)
  • Fine-grained data permissions and previews
  • User-managed caching or data snapshots for speed & control
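The chaining idea in the first bullet can be sketched as a pipeline where each step's output feeds the next. Every function here is a hypothetical stand-in for a connector call (CRM lookup, calendar check, drafting); no such chaining API exists today, and the data is invented for illustration.

```python
# Speculative sketch of connector chaining: CRM lookup -> calendar
# check -> email draft. Each function stands in for one connector call.

def fetch_customer(crm: dict, name: str) -> dict:
    """Step 1: pull a customer record from a (stubbed) CRM."""
    return crm[name]

def free_slots(calendar: list[str], busy: set[str]) -> list[str]:
    """Step 2: filter a calendar down to slots that are still open."""
    return [slot for slot in calendar if slot not in busy]

def draft_email(customer: dict, slot: str) -> str:
    """Step 3: compose an email from the chained results."""
    return f"Hi {customer['contact']}, are you free at {slot} to discuss {customer['topic']}?"

crm = {"Acme": {"contact": "Dana", "topic": "renewal"}}
calendar = ["Mon 10:00", "Mon 14:00", "Tue 09:00"]
busy = {"Mon 10:00"}

customer = fetch_customer(crm, "Acme")   # HubSpot-style lookup
slot = free_slots(calendar, busy)[0]     # Calendar-style availability check
email = draft_email(customer, slot)      # the assistant's final draft
```

The interesting design question is who orchestrates the chain: the model deciding step order on the fly, or a user-defined pipeline it merely executes.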

For now, experimentation is key. Clip what you see. Document behaviours. And prepare to rethink what it means to query.

I give a quick tour of ChatGPT and Connectors.