This configuration specifies Vercel as the deployment provider, enables v0.dev as a coding assistant for UI and code generation, allows database provisioning (e.g., PostgreSQL, MySQL, or Vercel’s own storage), and optionally provides authentication and environment variable management.
Vercel Integration: The deployment provider is set to Vercel. Fill in your actual projectId, teamId, and API token (these can be generated on your Vercel dashboard).
v0.dev Integration: The AI provider is v0.dev. You’ll need an API key from your v0.dev account.
Database Setup: The configuration provisions a PostgreSQL database using Vercel’s managed Postgres (or you can swap for MySQL, PlanetScale, etc.). The connection string is referenced via an environment variable.
Features:
- Flags to enable auto-deployment, AI code generation, and database provisioning.
- Support for multiple database types: PostgreSQL, MySQL, SQLite, and MongoDB, each with its own provider and environment variable for the connection.
- Secrets management for sensitive values (API keys, DB passwords, etc.).
- Multi-environment support for development, staging, and production, allowing you to override environment variables and secrets per environment.
- Clear placeholders and structure so you can easily fill in your actual credentials.
This configuration enables flexible deployment, secure secrets handling, and easy switching between environments.
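A rough sketch of what such a file can look like is below. The exact schema depends on your MCP/Cascade setup, so treat field names like autoDeploy and databaseProvisioning as illustrative assumptions and every <...> value as a placeholder:
{
  "project": {
    "name": "my-app",
    "description": "Example project deployed to Vercel with v0.dev as the AI provider"
  },
  "providers": {
    "deployment": { "type": "vercel" },
    "ai": { "type": "v0.dev" },
    "databases": {
      "postgres": { "provider": "vercel-postgres", "connectionEnv": "DATABASE_URL" }
    }
  },
  "features": {
    "autoDeploy": true,
    "aiCodeGeneration": true,
    "databaseProvisioning": true,
    "secretsManagement": true,
    "multiEnvironment": true
  },
  "secrets": {
    "VERCEL_TOKEN": "<your-vercel-api-token>",
    "V0_DEV_API_KEY": "<your-v0-dev-api-key>",
    "DATABASE_URL": "<your-database-connection-string>"
  }
}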
File Overview
- project: Describes your project.
- providers: Sets up deployment (Vercel), AI (v0.dev), and multiple database types (Postgres, MySQL, SQLite, MongoDB).
- features: Enables advanced workflow features (auto-deploy, AI code generation, secrets management, multi-environment support).
- secrets: Stores sensitive info (API keys, DB URLs).
- environments: Lets you define dev, staging, and production settings and secrets.
- instructions: Human-readable workflow steps.
- mcpServers: Defines custom MCP servers (like Puppeteer and Figma) and specifies how to start them and their connection addresses:
  - Puppeteer MCP: http://localhost:4001
  - Figma MCP: http://localhost:4002
These ports are available and do not conflict with your local web development on port 5173.

What you can do with this file
- Add/remove custom MCP servers: Integrate tools for browser automation, design, etc.
- Connect Cascade to local MCPs: Cascade will use the address fields to connect to your running servers.
- Switch environments: Seamlessly move between dev/staging/prod with different secrets and DBs.
- Provision databases: Instantly connect to different DBs as needed.
- Automate deployments: Push to Vercel or other providers.
- Securely manage secrets: Keep sensitive info out of your codebase.

Next Steps
Start your MCP servers on the assigned ports:
For Puppeteer:
npx -y @modelcontextprotocol/server-puppeteer --port 4001
For Figma:
npx figma-mcp --port 4002
In Cascade/Windsurf, use the "Refresh" option in the MCP server UI. Your servers should now appear and be connectable.
You can now orchestrate complex workflows, use AI code generation, automate deployments, and connect with external tools—all from this config!
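To make the multi-environment support above concrete, an environments section could look roughly like the sketch below. The nesting of per-environment secrets is an assumption based on the structure described here, and the connection strings are placeholders:
"environments": {
  "development": {
    "secrets": { "DATABASE_URL": "<dev-connection-string>" }
  },
  "staging": {
    "secrets": { "DATABASE_URL": "<staging-connection-string>" }
  },
  "production": {
    "secrets": { "DATABASE_URL": "<production-connection-string>" }
  }
}
Switching environments then only changes which secrets and database the agent uses; the rest of the configuration stays the same.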
You may need to install or run the MCP server separately.
Here’s how you can proceed to get your local MCP server running:
- Check for a global MCP CLI. Many MCP servers (for Windsurf/Cascade) are installed globally via npm. The most common commands are:
npx mcp-server start
# or
npx windsurf-mcp start
# or, if installed globally
mcp-server start
# or
windsurf-mcp start
If you get a “command not found” error, you’ll need to install it.
- Install the MCP server (if not present). Try installing it globally:
npm install -g windsurf-mcp
# or
npm install -g mcp-server
- Start the MCP server. Once installed, run:
windsurf-mcp start
# or
mcp-server start
This should start your MCP server on a local port (commonly 3000 or 4000).
- Add the MCP server to Windsurf/Cascade:
- Go to Windsurf Settings → Cascade → Add Server.
- Enter the address, e.g. http://localhost:3000
- Click Refresh.
Copy the content of vercelv0mcp.json, with your own values filled in for the placeholders, into your mcp_config.json file (usually located at .codeium/windsurf/mcp_config.json; the location may differ in other IDEs). The MCP configuration here works for all IDEs when placed in the right directory/file.
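For orientation, a pared-down mcp_config.json has this general shape once the entries are copied in (the server name, port, and address field follow the examples used elsewhere in this guide):
{
  "mcpServers": {
    "puppeteer": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-puppeteer", "--port", "4001"],
      "address": "http://localhost:4001"
    }
  }
}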
To fully utilize Vercel and v0.dev with your local repo in Windsurf, you need to ensure:
- Vercel and v0.dev are properly configured as providers in your config (which you already have in the providers section).
- You have valid API keys/tokens for both services in your secrets (and optionally per environment).
- You do NOT need to run local MCP servers for Vercel and v0.dev. These are cloud services, not local processes like the Puppeteer or Figma MCPs; your MCP or Cascade agent will use the API keys to connect to their APIs.

How it works in your config
- Vercel: The deployment provider uses your Vercel project and team IDs, plus your API token. When you trigger a deploy, the MCP or Cascade agent will use these credentials to deploy your code to Vercel.
- v0.dev: The ai provider uses your v0.dev API key. When you request code or UI generation, the MCP/Cascade agent will use this key to access v0.dev’s AI APIs.

You do not need to add an address for these cloud services in mcpServers! Just ensure your API keys and config are correct.
What you need to do
Replace the placeholders in your secrets section with your real API keys:
"secrets": {
"VERCEL_TOKEN": "<your-vercel-api-token>",
"V0_DEV_API_KEY": "<your-v0-dev-api-key>",
// ...
}
- You can get your Vercel API token from your Vercel dashboard.
- You can get your v0.dev API key from your v0.dev account.
- (Optional) Set environment-specific secrets if you want different keys for dev/staging/prod.
- Make sure your providers section has the correct IDs for your Vercel project and team.
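For reference, a providers section wired up this way might look like the sketch below. The ${...} references to secrets are an assumption about how your MCP runner resolves values, and the IDs are placeholders:
"providers": {
  "deployment": {
    "type": "vercel",
    "projectId": "<your-vercel-project-id>",
    "teamId": "<your-vercel-team-id>",
    "token": "${VERCEL_TOKEN}"
  },
  "ai": {
    "type": "v0.dev",
    "apiKey": "${V0_DEV_API_KEY}"
  }
}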
When using v0.dev as an MCP (Model Context Protocol) provider, the typical workflow for seeing changes in preview mode in their AI IDE view is as follows:
How v0.dev Preview Works
- Edit or generate code/UI with v0.dev: You use v0.dev (via their web IDE or via the integration in Windsurf/Cascade) to generate or edit code/components.
- Preview mode in v0.dev: In the v0.dev web interface there is a “Preview” button or panel. When you make changes (either by AI prompt or manual edit), you can click “Preview” to see a live rendering of the UI/component. This preview is rendered in a sandboxed environment using the code/context you see in the editor.
- Integration with your local repo (optional): If you have connected your local repo to v0.dev (using their GitHub integration or file sync), changes made in v0.dev can be pushed to your repo, and vice versa. If you are using Windsurf/Cascade with v0.dev as an AI provider, you can generate code locally and then push to your repo or deploy to Vercel for a full-stack preview.
- Live preview in Windsurf/Cascade: If you want to view your changes live locally (e.g., with Vite or Next.js), run your local dev server (e.g., npm run dev) and open the local preview (e.g., http://localhost:5173). v0.dev’s preview is isolated to their UI, but you can always copy code from v0.dev to your local repo and see it live in your own browser.

Key Points
- v0.dev’s “Preview” shows the code/context in their web IDE.
- To see a live preview of your full app, run your local dev server and open the browser at your local port (e.g., 5173).
- To sync changes between v0.dev and your local repo, use their GitHub/file sync integration.

Practical Workflow
- In v0.dev: Generate/edit code → click “Preview” to see the result in their AI IDE.
- In your local repo (Windsurf): Pull or copy the generated code into your repo, run your local dev server (npm run dev or similar), and open the local preview URL (e.g., http://localhost:5173).
- For a full-stack preview: Deploy to Vercel and use the Vercel preview URL.
- Why you don’t see an address field: When you click "Add Custom Server" in Cascade settings, it takes you to this config file because Cascade expects you to define custom MCP servers here. However, the mcpServers section currently only specifies how to start the servers (the command and args), not their addresses.
To make servers available to Cascade, you need to specify their addresses (URLs) so the IDE knows how to connect to them.
- How to fix the file: Add an address (or url) field to each server under mcpServers. For example:
"mcpServers": {
  "puppeteer": {
    "command": "npx",
    "args": ["-y", "@modelcontextprotocol/server-puppeteer"],
    "address": "http://localhost:4001"
  },
  "figma": {
    "command": "npx",
    "args": ["figma-mcp"],
    "env": {
      "FIGMA_API_KEY": "<your-figma-api-key>"
    },
    "address": "http://localhost:4002"
  }
}
So, if I leave placeholder values in the MCP config file but make sure the real environment variables are present in the .env file, everything will work. Is that right?
MCP config file (mcp3.json):
- Can contain placeholder values for sensitive information
- Defines which environment variables your application needs
- Specifies the structure and relationships between services

Environment variables (.env):
- Contains the actual values that will be used at runtime
- Overrides any placeholder values from the MCP config
- Keeps sensitive information out of version control

When your application runs:
- The system first loads the MCP configuration.
- Then it loads the actual values from your .env file.
- The real values from .env replace any placeholders in the configuration.

This approach has several benefits:
- Security: Sensitive data stays in your local .env file (which should never be committed to version control).
- Portability: Other developers can use the same MCP config but with their own .env values.
- Flexibility: You can have different .env files for different environments (development, staging, production).

For example, your MCP config might have "V0_DEV_API_KEY": "V0_DEV_API_KEY_PLACEHOLDER", but as long as your .env file has V0_DEV_API_KEY=your_actual_api_key, the system will use your actual API key when needed.
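A minimal side-by-side sketch of that arrangement (file names follow the example above; the exact override behavior depends on how your MCP runner loads environment variables):
// mcp3.json (committed, placeholder only)
"secrets": {
  "V0_DEV_API_KEY": "V0_DEV_API_KEY_PLACEHOLDER"
}
// .env (local, never committed)
// V0_DEV_API_KEY=your_actual_api_key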
This is a standard practice in modern development and follows the 12-factor app methodology for configuration management.