Quickstart: Using MCP with Vercel AI SDK

This guide shows you how to integrate Bundleport's MCP server with the Vercel AI SDK to build AI-powered hotel booking experiences.

Prerequisites

  • Node.js 18+ installed
  • A Bundleport API key and configured access IDs
  • MCP server running (default: https://api.connect.bundleport.com/mcp)

Installation

Install the required dependencies:

npm install ai @ai-sdk/openai

Basic Setup

1. Create an MCP Client

Use the experimental_createMCPClient function to connect to the Bundleport MCP server:

import { experimental_createMCPClient as createMCPClient } from 'ai';

const mcpClient = await createMCPClient({
  transport: {
    type: 'sse',
    url: 'https://api.connect.bundleport.com/mcp', // Your MCP server URL
  },
});
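The client holds an open SSE connection, so it is worth closing it once you are done. As a minimal sketch, a small helper can guarantee that `close()` runs even when your code throws (the `withClient` name and shape are illustrative assumptions, not part of the AI SDK):

```javascript
// Hypothetical helper (not part of the AI SDK): run a callback with a client
// produced by any async factory, and always close the client afterwards.
async function withClient(createClient, fn) {
  const client = await createClient();
  try {
    return await fn(client);
  } finally {
    // Release the underlying connection even if fn throws
    if (typeof client.close === 'function') await client.close();
  }
}
```

You could then wrap the `createMCPClient` call above, e.g. `await withClient(() => createMCPClient({ transport: { type: 'sse', url: '...' } }), async (mcpClient) => { /* use mcpClient */ })`.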

2. Generate Text with MCP Tools

Incorporate the MCP tools into your text generation:

import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

const { text, toolCalls, toolResults } = await generateText({
  model: openai('gpt-4o'),
  tools: await mcpClient.tools(),
  prompt: 'Find available hotels in Barcelona for 2 nights starting February 14th, 2025',
});

Complete Example: Hotel Booking Workflow

Here's a complete example that demonstrates the full booking workflow with automatic accessIds injection:

import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { experimental_createMCPClient as createMCPClient } from 'ai';

// Helper function to inject accessIds
function injectAccessIds(tools, accessIds) {
  return Object.fromEntries(
    Object.entries(tools).map(([toolName, tool]) => [
      toolName,
      {
        ...tool,
        execute: async (params) => {
          const settings = {
            ...params.settings,
            accessIds: params.settings?.accessIds || accessIds,
          };
          return tool.execute({ ...params, settings });
        },
      },
    ])
  );
}

async function bookHotel() {
  // Create MCP client
  const mcpClient = await createMCPClient({
    transport: {
      type: 'sse',
      url: 'https://api.connect.bundleport.com/mcp',
    },
  });

  // Get tools and inject accessIds automatically
  const rawTools = await mcpClient.tools();
  const tools = injectAccessIds(rawTools, ['HBDS2_D']);

  // Step 1: Search for hotels (accessIds injected automatically)
  const searchResult = await generateText({
    model: openai('gpt-4o'),
    tools: tools,
    toolChoice: 'required',
    prompt: `Search for available hotels in Barcelona, Spain.
Check-in: 2025-02-14
Check-out: 2025-02-17
Adults: 2`,
  });

  console.log('Search results:', searchResult.text);
  console.log('Tool calls:', searchResult.toolCalls);

  // Step 2: Get a quote for a selected option
  // The AI agent should extract optionRefId from the search results
  const quoteResult = await generateText({
    model: openai('gpt-4o'),
    tools: tools,
    toolChoice: 'required',
    prompt: `Get a detailed quote for option OPT-123456789`,
  });

  console.log('Quote:', quoteResult.text);

  // Step 3: Book the hotel
  const bookResult = await generateText({
    model: openai('gpt-4o'),
    tools: tools,
    toolChoice: 'required',
    prompt: `Book the hotel with optionRefId OPT-123456789.
Holder: John Doe, john@example.com
Travellers: John Doe (age 35), Jane Doe (age 32)
Payment: Visa ending in 1111, expiry 12/2027`,
  });

  console.log('Booking confirmation:', bookResult.text);
}

bookHotel().catch(console.error);

Streaming Responses

For real-time user experiences, use streaming:

import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { experimental_createMCPClient as createMCPClient } from 'ai';

async function streamHotelSearch() {
  const mcpClient = await createMCPClient({
    transport: {
      type: 'sse',
      url: 'https://api.connect.bundleport.com/mcp',
    },
  });

  const tools = await mcpClient.tools();

  const result = streamText({
    model: openai('gpt-4o'),
    tools: tools,
    prompt: 'Find hotels in Paris for next weekend',
  });

  for await (const chunk of result.textStream) {
    process.stdout.write(chunk);
  }
}

Using with Next.js API Routes

Here's how to integrate MCP tools in a Next.js API route:

// app/api/chat/route.js
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { experimental_createMCPClient as createMCPClient } from 'ai';

export async function POST(req) {
  const { messages } = await req.json();

  // Create MCP client
  const mcpClient = await createMCPClient({
    transport: {
      type: 'sse',
      url: process.env.MCP_SERVER_URL || 'https://api.connect.bundleport.com/mcp',
    },
  });

  // Get MCP tools and wrap them to inject accessIds automatically
  const rawTools = await mcpClient.tools();
  const tools = Object.fromEntries(
    Object.entries(rawTools).map(([toolName, tool]) => [
      toolName,
      {
        ...tool,
        execute: async (params) => {
          const settings = {
            ...params.settings,
            accessIds: params.settings?.accessIds || ['HBDS2_D'],
          };
          return tool.execute({ ...params, settings });
        },
      },
    ])
  );

  // Generate streaming response
  const result = streamText({
    model: openai('gpt-4o'),
    tools: tools,
    messages: messages,
    system: `You are a helpful hotel booking assistant.
Use the available tools to search for hotels, get quotes, and make bookings.`,
  });

  return result.toDataStreamResponse();
}

Tool Parameters

The MCP tools require an accessIds array in the settings parameter. Instead of requiring the AI agent to provide this in every tool call, you can use middleware to automatically inject it.

Automatic Parameter Injection with Middleware

You can create a reusable helper function to wrap the MCP tools and automatically inject accessIds:

function injectAccessIds(tools, accessIds) {
  return Object.fromEntries(
    Object.entries(tools).map(([toolName, tool]) => [
      toolName,
      {
        ...tool,
        execute: async (params) => {
          const settings = {
            ...params.settings,
            accessIds: params.settings?.accessIds || accessIds,
          };
          return tool.execute({ ...params, settings });
        },
      },
    ])
  );
}

// Usage
const mcpClient = await createMCPClient({
  transport: {
    type: 'sse',
    url: 'https://api.connect.bundleport.com/mcp',
  },
});

const rawTools = await mcpClient.tools();
const tools = injectAccessIds(rawTools, ['HBDS2_D']);

const result = await generateText({
  model: openai('gpt-4o'),
  tools: tools,
  prompt: 'Search for hotels in Paris',
});

With this middleware approach, the AI agent doesn't need to specify accessIds in every tool call—it's automatically injected, making your prompts cleaner and reducing the chance of errors.

Error Handling

The MCP server returns structured error responses. Handle them in your application:

const result = await generateText({
  model: openai('gpt-4o'),
  tools: tools,
  prompt: 'Search for hotels...',
});

if (result.toolResults) {
  for (const toolResult of result.toolResults) {
    if (toolResult.result?.errors) {
      console.error('Tool errors:', toolResult.result.errors);
    }
  }
}

Best Practices

  1. System Prompts: Provide clear system prompts that guide the AI to use tools appropriately
  2. Context Management: The MCP server maintains context via the context field; use it to chain operations
  3. Error Recovery: Implement retry logic for transient failures
  4. Token Optimization: The server returns condensed, AI-optimized responses to minimize token usage
  5. Access ID Configuration: Ensure your access IDs are properly configured in Bundleport before making tool calls
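For the error-recovery point above, a minimal retry wrapper might look like the sketch below. The `withRetry` name, the attempt count, and the backoff values are illustrative assumptions, not part of the SDK or the Bundleport API:

```javascript
// Hypothetical retry helper with exponential backoff for transient failures.
async function withRetry(fn, { attempts = 3, baseDelayMs = 500 } = {}) {
  let lastError;
  for (let attempt = 0; attempt < attempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt < attempts - 1) {
        // Wait baseDelayMs, 2x, 4x, ... before the next attempt
        await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** attempt));
      }
    }
  }
  throw lastError;
}
```

You could then wrap any of the calls shown earlier, e.g. `await withRetry(() => generateText({ model: openai('gpt-4o'), tools, prompt }))`. Keep retries for transient failures (timeouts, dropped connections) only; a failed booking should surface to the user rather than be silently re-attempted.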

Next Steps