
cliproxyapi++: OpenAI-Compatible Multi-Provider Gateway

One API surface for routing across heterogeneous model providers


cliproxyapi++ is an OpenAI-compatible proxy that presents a single client API surface and routes requests to multiple upstream providers.

Who This Documentation Is For

  • Operators running a shared internal LLM gateway.
  • Platform engineers integrating existing OpenAI-compatible clients.
  • Developers embedding cliproxyapi++ in Go services.
  • Incident responders who need health, logs, and management endpoints.

What You Can Do

  • Use one endpoint (/v1/*) across heterogeneous providers.
  • Configure routing and model-prefix behavior in config.yaml.
  • Manage credentials and runtime controls through management APIs.
  • Monitor health and per-provider metrics for operations.
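
As a rough sketch of what such a config.yaml might contain (every key name below is an illustrative assumption, not the project's authoritative schema; see the Provider Catalog for the real provider block reference):

```yaml
# Hypothetical config.yaml sketch; actual key names may differ.
server:
  port: 8317                  # matches the port used in the verification commands below
providers:
  - name: openai
    api-key: YOUR_OPENAI_KEY
    model-prefix: "openai/"     # e.g. a request for "openai/gpt-4o" would route here
  - name: anthropic
    api-key: YOUR_ANTHROPIC_KEY
    model-prefix: "anthropic/"
```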

Start Here

  1. Getting Started for first run and first request.
  2. Install for Docker, binary, and source options.
  3. Provider Usage for provider strategy and setup patterns.
  4. Provider Quickstarts for provider-specific 5-minute success paths.
  5. Provider Catalog for provider block reference.
  6. Provider Operations for on-call runbook and incident workflows.
  7. Routing and Models Reference for model resolution behavior.
  8. Troubleshooting for common failures and concrete fixes.
  9. Planning Boards for source-linked execution tracking and import-ready board artifacts.

API Surfaces

The gateway exposes the OpenAI-compatible /v1/* endpoints for client traffic, management APIs for credentials and runtime controls, and operational endpoints such as /health and /v1/metrics/providers.
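
As a concrete illustration of the client-facing surface, a minimal chat completion request might look like the following. The model name gpt-4o is a placeholder, and the /v1/chat/completions path follows the OpenAI convention; pick a real model from /v1/models first.

```shell
# Illustrative chat completion against the OpenAI-compatible surface.
# "gpt-4o" is an example model name; list real models via GET /v1/models.
BASE_URL="${BASE_URL:-http://localhost:8317}"
PAYLOAD='{"model":"gpt-4o","messages":[{"role":"user","content":"ping"}]}'

# Sanity-check that the payload is valid JSON before sending it.
echo "$PAYLOAD" | python3 -c 'import json,sys; json.load(sys.stdin)' && echo "payload OK"

# Send the request (requires a running gateway):
# curl -sS "$BASE_URL/v1/chat/completions" \
#   -H "Content-Type: application/json" \
#   -d "$PAYLOAD"
```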

Audience-Specific Guides

  • Docsets for user, developer, and agent-focused guidance.
  • Feature Guides for deeper behavior and implementation notes.
  • Planning Boards for source-to-solution mapping across issues, PRs, discussions, and external requests.

Fast Verification Commands

```bash
# Basic process health
curl -sS http://localhost:8317/health

# List models exposed by your current auth + config
curl -sS http://localhost:8317/v1/models | jq '.data[:5]'

# Check provider-side rolling stats
curl -sS http://localhost:8317/v1/metrics/providers | jq
```

MIT Licensed