Configuration

Complete reference for all Axiom Protocol plugin configuration parameters.

Full Configuration Example

openclaw.plugin.json
```json
{
  "workerUrl": "https://your-worker.workers.dev",
  "apiKey": "sk-or-v1-...",
  "defaultModels": [
    "openai/gpt-5.4-mini",
    "anthropic/claude-haiku-4-5",
    "google/gemini-2.5-flash"
  ],
  "maxIterations": 7,
  "structure": "fugue",
  "dThreshold": 0.05
}
```

Parameters Reference

| Parameter | Type | Required | Default | Description |
|---|---|---|---|---|
| `workerUrl` | string | Yes | — | Axiom Protocol consensus worker endpoint. Deploy your own from ACP-PROJECT. |
| `apiKey` | string | Yes | env: `OPENROUTER_API_KEY` | OpenRouter API key. Can be set via environment variable or passed directly. |
| `defaultModels` | string[] | No | `gpt-5.4-mini`, `claude-haiku-4-5`, `gemini-2.5-flash` | Default model identifiers. Minimum 2 required for consensus. |
| `maxIterations` | number | No | 7 | Maximum convergence iterations (1–7). 7 iterations leave ~3.4% of the original disagreement. |
| `structure` | enum | No | `fugue` | Consensus structure: `fugue` (layered analysis), `sonata` (thesis → antithesis → synthesis), or `concert` (leader + ensemble). |
| `dThreshold` | number | No | 0.05 | D-score threshold (0–1). Consensus is reached when D drops below this value. |

Worker URL

The workerUrl points to your Axiom Protocol consensus worker. The plugin sends POST requests to {workerUrl}/consensus-iterative.

Deploy the worker from the ACP-PROJECT repository. The worker runs on Cloudflare Workers and handles the iterative consensus algorithm.

Trailing slashes

The plugin automatically strips trailing slashes from the worker URL. Both https://worker.dev and https://worker.dev/ will work correctly.
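The slash handling above can be sketched as a small helper. `buildConsensusUrl` is an illustrative name, not part of the plugin's public API:

```typescript
// Derive the consensus endpoint from workerUrl.
// Illustrative sketch; buildConsensusUrl is not the plugin's actual API.
function buildConsensusUrl(workerUrl: string): string {
  // Strip any trailing slashes so both URL forms produce the same endpoint.
  const base = workerUrl.replace(/\/+$/, "");
  return `${base}/consensus-iterative`;
}

// Both spellings resolve to the same endpoint:
// buildConsensusUrl("https://worker.dev")  → "https://worker.dev/consensus-iterative"
// buildConsensusUrl("https://worker.dev/") → "https://worker.dev/consensus-iterative"
```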

API Key

The plugin requires an OpenRouter API key to access the participating models. You can provide it in two ways:

  1. Environment variable (recommended): Set OPENROUTER_API_KEY in your environment
  2. Direct config: Pass the key in the apiKey field of the plugin configuration

The key is sent in the x-openrouter-key HTTP header with each consensus request.
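The two sources can be combined as in the sketch below. `resolveApiKey` is a hypothetical helper, and the precedence (direct config over environment) is an assumption, not documented behavior:

```typescript
// Illustrative sketch; resolveApiKey is not the plugin's actual API.
// Assumption: a key passed in config wins over the environment variable.
function resolveApiKey(configKey?: string): string {
  const key = configKey ?? process.env.OPENROUTER_API_KEY;
  if (!key) {
    throw new Error("No OpenRouter API key: set OPENROUTER_API_KEY or the apiKey field");
  }
  return key;
}

// The resolved key travels in the x-openrouter-key header with each request:
const headers = { "x-openrouter-key": resolveApiKey("sk-or-v1-example") };
```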

Default Models

The defaultModels array specifies which LLMs participate in consensus by default. You need at least 2 models. For best results, use models from different providers to maximize diversity of reasoning.

Recommended models
```json
[
  "openai/gpt-5.4-mini",
  "anthropic/claude-haiku-4-5",
  "google/gemini-2.5-flash"
]
```

Cross-provider diversity

Models from different providers (OpenAI, Anthropic, Google) have different training data and architectures, making axiom grounding more effective.
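The two constraints above (at least 2 models, ideally from more than one provider) can be checked with a small helper. `checkModels` is hypothetical, but it relies only on the documented `provider/model` shape of OpenRouter identifiers:

```typescript
// Hypothetical validation helper, not part of the plugin.
// Enforces the documented minimum of 2 models and reports provider diversity.
function checkModels(models: string[]): { providers: Set<string>; diverse: boolean } {
  if (models.length < 2) {
    throw new Error("Consensus needs at least 2 models");
  }
  // OpenRouter identifiers are "provider/model", e.g. "openai/gpt-5.4-mini".
  const providers = new Set(models.map((m) => m.split("/")[0]));
  return { providers, diverse: providers.size > 1 };
}
```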

Max Iterations

Controls how many convergence rounds the algorithm performs. Each iteration divides the D-score by the golden ratio (φ ≈ 1.618):

  • 1 iteration: ~61.8% of original disagreement remains
  • 3 iterations: ~23.6% remaining
  • 5 iterations: ~9.0% remaining
  • 7 iterations (default): ~3.4% remaining

Higher iterations give stronger consensus but increase latency and API costs.
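The percentages above follow directly from the division by φ: after n iterations, the remaining fraction of the original disagreement is 1/φⁿ.

```typescript
// Remaining disagreement after n golden-ratio iterations: D_n = D_0 / φ^n.
const PHI = (1 + Math.sqrt(5)) / 2; // ≈ 1.618

function remainingDisagreement(iterations: number): number {
  return Math.pow(PHI, -iterations);
}

// remainingDisagreement(1) ≈ 0.618, (3) ≈ 0.236, (5) ≈ 0.090, (7) ≈ 0.034
```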

Consensus Structure

| Structure | Pattern | Best For |
|---|---|---|
| `fugue` | Models build on each other in layers | Complex questions, technical analysis, deep research |
| `sonata` | Thesis → antithesis → synthesis | Debates, ethical dilemmas, competing approaches |
| `concert` | One model leads, others refine | Creative tasks, brainstorming, content generation |

D-Score Threshold

The dThreshold determines when consensus is considered reached. When the D-score drops below this value, the algorithm stops iterating and returns the result.

  • 0.05 (default) — strong consensus required
  • 0.20 — high confidence, faster results
  • 0.40 — moderate agreement acceptable
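Together with `maxIterations`, the threshold forms the stopping rule: iterate until D falls below `dThreshold` or the iteration budget is spent. The sketch below models that rule; `runRound` stands in for the real worker call and is purely illustrative:

```typescript
// Sketch of the stopping rule; runRound is a stand-in for the worker's
// consensus round, not the actual client API.
function iterate(
  initialD: number,
  dThreshold: number,
  maxIterations: number,
  runRound: (d: number) => number
): { d: number; iterations: number } {
  let d = initialD;
  let i = 0;
  // Stop as soon as consensus is reached OR the budget is exhausted.
  while (d >= dThreshold && i < maxIterations) {
    d = runRound(d);
    i += 1;
  }
  return { d, iterations: i };
}

// Under the idealized golden-ratio decay, starting from D = 1.0 the default
// threshold of 0.05 is reached in 7 rounds (1 / 1.618^7 ≈ 0.034 < 0.05).
```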

Timeout

The client uses a default timeout of 120 seconds (120,000 ms) per consensus request, configurable via the timeoutMs parameter in the client configuration. The timeout uses AbortController for clean cancellation.
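A timeout wired through `AbortController` can be sketched as follows. `makeTimeoutSignal` and `consensusRequest` are illustrative names, not the client's actual API:

```typescript
// Illustrative sketch of an AbortController-based request timeout;
// these helpers are assumptions, not the client's real interface.
function makeTimeoutSignal(timeoutMs: number): { signal: AbortSignal; cancel: () => void } {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), timeoutMs);
  return { signal: controller.signal, cancel: () => clearTimeout(timer) };
}

async function consensusRequest(url: string, body: unknown, timeoutMs = 120_000) {
  const { signal, cancel } = makeTimeoutSignal(timeoutMs);
  try {
    // Aborting the signal rejects the pending fetch with an AbortError.
    const res = await fetch(url, {
      method: "POST",
      headers: { "content-type": "application/json" },
      body: JSON.stringify(body),
      signal,
    });
    return await res.json();
  } finally {
    cancel(); // always clear the timer so the process can exit promptly
  }
}
```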