MCP with ChatGPT Deep Research Compatibility

Renaudil · 2 days ago

Description

Run a ChatGPT Deep Research-compatible Model Context Protocol (MCP) server on Vercel with Next.js.

npx boilerapp openai-deep-research-compatible-mcp-with-next-js

Documentation

Sample MCP Server for ChatGPT Deep Research

This is a sample Model Context Protocol (MCP) server designed to work with ChatGPT's Deep Research feature. It provides semantic search through OpenAI's Vector Store API and full-document retrieval, demonstrating how to build custom MCP servers that extend ChatGPT with company-specific knowledge and tools.

Deploy with Vercel

Features

  • Search Tool: Semantic search using OpenAI Vector Store API
  • Fetch Tool: Complete document retrieval by ID with full content and metadata
  • Sample Data: Includes 5 sample documents covering various technical topics
  • MCP Compliance: Follows OpenAI's MCP specification for deep research integration
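Deep research integration hinges on what these two tools return: `search` produces lightweight hits, and `fetch` returns a full document by ID. The sketch below captures those result shapes as TypeScript types with a small mapping helper; the `Doc` type, field names, and `toSearchResult` helper are illustrative assumptions based on OpenAI's deep research MCP guide, not this repo's exact code.

```typescript
// Illustrative shapes for the two deep-research tools. Field names follow
// OpenAI's deep research MCP guide; treat them as assumptions, not this
// repo's actual types.

// A stored document, as the sample data might model it (hypothetical type).
interface Doc {
  id: string;
  title: string;
  text: string;
  url: string;
  metadata?: Record<string, string>;
}

// What the `search` tool returns per hit: enough to rank and cite,
// without shipping the full text.
interface SearchResult {
  id: string;
  title: string;
  url: string;
}

// What the `fetch` tool returns: the complete document plus metadata.
type FetchResult = Doc;

// Trim a stored document down to a search hit (hypothetical helper).
function toSearchResult(doc: Doc): SearchResult {
  return { id: doc.id, title: doc.title, url: doc.url };
}
```

In a deep research session, ChatGPT typically calls `search` first to collect candidate IDs, then calls `fetch` on each ID it wants in full.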

Usage

This sample app uses mcp-handler, which lets you drop an MCP server onto a group of routes in any Next.js project.

Update app/mcp/route.ts with your tools, prompts, and resources following the MCP TypeScript SDK documentation.
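A minimal `app/mcp/route.ts` might look like the sketch below, using `createMcpHandler` from mcp-handler and `zod` for input schemas. The in-memory `docs` array stands in for the Vector Store API this repo actually calls, and the tool bodies are illustrative assumptions, not the template's real implementation.

```typescript
// app/mcp/route.ts — minimal sketch (assumes the mcp-handler and zod packages).
import { createMcpHandler } from "mcp-handler";
import { z } from "zod";

// Hypothetical in-memory corpus standing in for OpenAI's Vector Store API.
const docs = [
  { id: "doc-1", title: "Intro to MCP", text: "Model Context Protocol basics." },
];

const handler = createMcpHandler((server) => {
  // `search`: return lightweight hits for a query string.
  server.tool(
    "search",
    "Search the document corpus",
    { query: z.string() },
    async ({ query }) => {
      const results = docs
        .filter((d) => d.text.toLowerCase().includes(query.toLowerCase()))
        .map(({ id, title }) => ({ id, title, url: `https://example.com/${id}` }));
      return { content: [{ type: "text", text: JSON.stringify({ results }) }] };
    }
  );

  // `fetch`: return one full document by ID.
  server.tool(
    "fetch",
    "Fetch a document by ID",
    { id: z.string() },
    async ({ id }) => {
      const doc = docs.find((d) => d.id === id);
      return { content: [{ type: "text", text: JSON.stringify(doc ?? null) }] };
    }
  );
});

// The same handler serves the MCP endpoint for both verbs on /mcp.
export { handler as GET, handler as POST };
```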

Getting Started

  1. Install dependencies:

    pnpm i
    
  2. Run the development server:

    pnpm dev
    

Connecting to ChatGPT Deep Research

  1. Access ChatGPT Settings: Go to ChatGPT settings
  2. Navigate to Connectors: Click on the "Connectors" tab
  3. Add MCP Server: Add your server URL: https://your-domain/mcp
  4. Test Connection: The server should appear as available for deep research

Notes for running on Vercel

Sample Client

scripts/test-client.mjs contains a sample client to try invocations.

node scripts/test-client.mjs https://mcp-for-next-js.vercel.app
