Self-Hosting Guide
Run the ACP Playground on your own infrastructure. This guide covers local development, production deployment options, custom demo fixtures, and security considerations.
Prerequisites
- Node.js 18.18+ -- JavaScript runtime
- npm or yarn -- package manager
- OpenRouter API key -- required for Live mode consensus (get one at openrouter.ai/keys)
Quick Start
Get the playground running locally in under 5 minutes.
Step 1: Navigate to the frontend
```bash
cd ACP-PROJECT/frontend/acp-web
```
Step 2: Install dependencies
```bash
npm install
```
Step 3: Configure environment (optional)
Create a .env.local file in the frontend/acp-web directory to override defaults:
```bash
# Optional: Use your own Cloudflare Worker endpoint
NEXT_PUBLIC_ACP_API_URL=https://your-worker.workers.dev

# Optional: Enable analytics
NEXT_PUBLIC_ENABLE_ANALYTICS=false
```
By default, the playground connects to the public ACP API. You can override this to point to your own worker deployment.
Step 4: Run the development server
```bash
npm run dev
```
Visit http://localhost:3000/playground to access the playground.
Production Deployment
Option 1: Vercel (recommended)
Vercel provides the simplest deployment path for Next.js applications. It automatically detects the framework and configures build settings.
```bash
# Install Vercel CLI
npm install -g vercel

# Deploy from the frontend directory
cd frontend/acp-web
vercel deploy
```
Follow the interactive prompts to configure your deployment. Set environment variables in the Vercel dashboard:
| Variable | Required | Description |
|---|---|---|
| NEXT_PUBLIC_ACP_API_URL | No | Custom API endpoint. Defaults to the public ACP API. |
| NEXT_PUBLIC_ENABLE_ANALYTICS | No | Enable or disable analytics tracking. |
Option 2: Docker
Build a production Docker image using a multi-stage build for minimal image size.
```dockerfile
# Stage 1: install dependencies
FROM node:18-alpine AS deps
WORKDIR /app
COPY package*.json ./
# Full install: the builder stage needs devDependencies to run the build;
# only the standalone output reaches the final image, so size is unaffected
RUN npm ci

# Stage 2: build the application
FROM node:18-alpine AS builder
WORKDIR /app
COPY --from=deps /app/node_modules ./node_modules
COPY . .
RUN npm run build

# Stage 3: minimal runtime image
FROM node:18-alpine AS runner
WORKDIR /app
ENV NODE_ENV=production
COPY --from=builder /app/public ./public
COPY --from=builder /app/.next/standalone ./
COPY --from=builder /app/.next/static ./.next/static
EXPOSE 3000
ENV PORT=3000
CMD ["node", "server.js"]
```
```bash
# Build the image
docker build -t acp-playground .

# Run the container
docker run -p 3000:3000 acp-playground
```
Standalone output
The Dockerfile assumes Next.js standalone output mode is enabled. Add output: "standalone" to your next.config.js if it is not already configured.
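If standalone output is not yet enabled, a minimal next.config.js might look like this (assuming a CommonJS config file; adapt if your project uses next.config.mjs or next.config.ts):

```javascript
// next.config.js -- enable standalone output for the Docker build
/** @type {import('next').NextConfig} */
const nextConfig = {
  output: "standalone",
};

module.exports = nextConfig;
```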
Option 3: Traditional Node.js server
Build the application and serve it with the built-in Next.js production server. Use a reverse proxy (Nginx, Caddy) for SSL termination and domain mapping.
```bash
# Build for production
npm run build

# Start the production server (port 3000)
npm run start
```
Self-Hosting the API
By default, the playground connects to the public ACP API. To host your own API backend, deploy the Cloudflare Worker from the repository.
Deploy Cloudflare Worker
```bash
cd workers/cloudflare-worker
npm install

# Login to Cloudflare
npx wrangler login

# Set your OpenRouter API key as a secret
npx wrangler secret put OPENROUTER_API_KEY

# Deploy
npx wrangler deploy
```
Your worker will be deployed to https://your-worker.your-account.workers.dev. Update the frontend configuration to point to this URL.
Update frontend configuration
```bash
NEXT_PUBLIC_ACP_API_URL=https://your-worker.your-account.workers.dev
```
Rebuild and redeploy the frontend after changing the API URL.
Configure CORS
In workers/cloudflare-worker/wrangler.toml, configure allowed origins to permit your frontend domain:
```toml
[vars]
ALLOWED_ORIGINS = "https://your-frontend-domain.com,http://localhost:3000"
```
Custom Demo Fixtures
Add your own demo scenarios to showcase specific use cases in Demo mode. Fixtures are pre-recorded consensus results that do not require API calls.
Create a fixture file
Create a new JSON file in frontend/acp-web/data/demo-fixtures/:
```json
{
  "id": "06_custom_scenario",
  "title": "Your Custom Scenario",
  "description": "Description of your scenario",
  "category": "research",
  "difficulty": "medium",
  "request": {
    "query": "Your question here",
    "models": ["openai/gpt-5.4", "anthropic/claude-sonnet-4-6"],
    "structure": "sonata",
    "max_iterations": 5
  },
  "response": {
    "query": "Your question here",
    "consensus_reached": true,
    "iterations_used": 2,
    "final_D": 0.05,
    "final_answer": "The consensus answer",
    "confidence": 0.95,
    "iteration_history": [
      {
        "iteration": 1,
        "D": 0.15,
        "responses": [
          {
            "model": "openai/gpt-5.4",
            "content": "Model response",
            "extracted": "Answer",
            "success": true,
            "latency_ms": 500
          }
        ]
      }
    ],
    "total_latency_ms": 1500,
    "cache_hit": false
  }
}
```
Register the fixture
Import and add the fixture in frontend/acp-web/data/demo-fixtures/index.ts:
```typescript
import fixture06 from "./06_custom_scenario.json";

export const demoFixtures: DemoFixture[] = [
  // ... existing fixtures
  fixture06 as DemoFixture,
];
```
Rebuild
```bash
npm run build
```
Your custom scenario will now appear in Demo mode.
Configuration Options
API endpoint override
The playground determines the API URL using the following logic:
```typescript
const API_BASE_URL =
  process.env.NEXT_PUBLIC_ACP_API_URL || "https://your-worker.workers.dev";
```
Set NEXT_PUBLIC_ACP_API_URL in .env.local or as a system environment variable to override the default.
API key storage
API keys are stored in the browser's localStorage under the key acp_api_keys:
```json
{
  "openrouter": "sk-or-v1-...",
  "cloudflare": "..."
}
```
To clear stored keys programmatically:
```javascript
localStorage.removeItem("acp_api_keys");
```
Alternatively, use the "Clear Keys" button in the playground UI.
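When reading the stored entry back, it is worth guarding against a missing or corrupted value. A small helper for that might look like this (the helper name is hypothetical; only the storage key and JSON shape come from above):

```typescript
// Shape of the stored entry, as shown in the JSON above
type StoredKeys = { openrouter?: string; cloudflare?: string };

// Parse the raw localStorage value, treating anything invalid as absent.
// The function name `parseStoredKeys` is illustrative, not part of the playground.
function parseStoredKeys(raw: string | null): StoredKeys | null {
  if (!raw) return null;
  try {
    const parsed = JSON.parse(raw) as StoredKeys;
    return typeof parsed === "object" && parsed !== null ? parsed : null;
  } catch {
    return null; // corrupted entry: treat as absent
  }
}

// In the browser: parseStoredKeys(localStorage.getItem("acp_api_keys"))
```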
Performance Optimization
Code splitting
The playground uses dynamic imports to reduce the initial bundle size. Heavy components are loaded on demand:
```typescript
const DemoMode = dynamic(
  () => import("@/components/playground/DemoMode"),
  {
    loading: () => <Skeleton />,
    ssr: false,
  }
);
```
Semantic caching
The Cloudflare Worker includes a semantic cache backed by Vectorize. Queries that are semantically similar to previously cached results return cached responses without incurring additional LLM costs. Enable caching in your worker configuration for faster repeated queries.
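Conceptually, a semantic lookup compares the query's embedding against cached embeddings and returns a stored response when similarity clears a threshold. The sketch below illustrates the idea only; the worker's actual implementation uses Vectorize, and the function names and 0.95 threshold here are assumptions:

```typescript
// One cached consensus result with its query embedding
type CachedEntry = { embedding: number[]; response: string };

// Cosine similarity between two equal-length vectors
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Return a cached response if any entry is similar enough, else null
function lookupCache(
  queryEmbedding: number[],
  cache: CachedEntry[],
  threshold = 0.95
): string | null {
  for (const entry of cache) {
    if (cosineSimilarity(queryEmbedding, entry.embedding) >= threshold) {
      return entry.response; // cache hit: skip the LLM calls entirely
    }
  }
  return null; // cache miss: run consensus and store the result
}
```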
CDN for static assets
Deploy static assets (images, fonts, stylesheets) to a CDN for improved load times globally. Both Vercel and Cloudflare Pages handle CDN distribution automatically.
Security Considerations
API keys
- API keys are stored client-side in localStorage -- they never pass through ACP servers
- Keys are sent directly to the LLM provider (OpenRouter or Cloudflare)
- Users retain full control of their keys at all times
- Keys can be cleared at any time via the UI or programmatically
Client-side storage
Because keys are stored in localStorage, they are accessible to JavaScript running on the same origin. Do not use production API keys on shared or public machines. Consider implementing a server-side proxy for environments where client-side key storage is not acceptable.
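As a sketch of that proxy approach, the browser would call your own endpoint with no key, and the server would attach a key held only in its environment before forwarding upstream. The helper name and request shape below are assumptions for illustration:

```typescript
// Forwarded request options for the upstream LLM API call
type UpstreamInit = {
  method: "POST";
  headers: Record<string, string>;
  body: string;
};

// Build the upstream request, injecting the server-held key.
// The client never sees the key; it lives only in the server environment.
function buildUpstreamInit(body: string, apiKey: string): UpstreamInit {
  return {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body,
  };
}
```

A server route (for example, a Next.js route handler) could then read the key from its environment and call fetch(upstreamUrl, buildUpstreamInit(body, key)), returning the upstream response to the browser.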
Rate limiting
The public ACP API enforces rate limits of 120 requests per minute per IP address. Self-hosted workers can configure custom rate limits in the worker code or via Cloudflare's rate limiting rules.
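A self-hosted worker could enforce the same 120-requests-per-minute policy with a simple fixed-window counter per client key. This sketch is illustrative, not the public API's actual code:

```typescript
// Fixed-window rate limiter: allow `limit` requests per `windowMs` per key
class RateLimiter {
  private hits = new Map<string, { count: number; windowStart: number }>();

  constructor(private limit = 120, private windowMs = 60_000) {}

  // Returns true if the request is allowed, false if it should be rejected
  allow(key: string, now: number = Date.now()): boolean {
    const entry = this.hits.get(key);
    if (!entry || now - entry.windowStart >= this.windowMs) {
      // First request in a fresh window
      this.hits.set(key, { count: 1, windowStart: now });
      return true;
    }
    if (entry.count >= this.limit) return false; // over the limit
    entry.count++;
    return true;
  }
}
```

Keying by client IP (e.g. the CF-Connecting-IP header on Cloudflare) matches the public API's per-IP behavior; Cloudflare's built-in rate limiting rules are an alternative that avoids keeping state in the worker.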
HTTPS
Always deploy to HTTPS in production. API keys and consensus data should never travel over unencrypted connections. Most deployment platforms (Vercel, Cloudflare Pages, Cloudflare Workers) provide free SSL certificates automatically.
Never deploy HTTP in production
Running the playground over HTTP exposes API keys and user queries to network interception. If you are using a traditional Node.js deployment, place it behind a reverse proxy (Nginx, Caddy) that terminates TLS.
Next Steps
- Troubleshooting -- solutions for CORS errors, build failures, and deployment issues
- Worker API Reference -- complete endpoint documentation for the Cloudflare Worker
- Code Examples -- 5 working examples to test with your self-hosted deployment