Cognipeer Console

Run Multi-Tenant AI Infrastructure Without Rebuilding the Control Plane

Operate inference, vector stores, tracing, guardrails, RAG, config, and incident workflows behind one production-ready console with tenant isolation built in.

Start Here

If you are evaluating or onboarding Cognipeer Console, this is the shortest useful reading order:

  1. Getting Started to boot the platform locally.
  2. Architecture to understand the runtime split between UI, API plugins, and core services.
  3. Core Overview to see how config, logging, cache, resilience, and request context fit together.

If you already know the basics, jump directly to the part that matches your work:

Choose Your Entry Point

| Start with | Best for | What you get |
| --- | --- | --- |
| Guide | Teams onboarding the platform for the first time | Local setup, architecture, core module docs, and operational guidance |
| API Reference | SDK authors and integrators | Endpoint behavior, request and response models, and OpenAI-compatible surface details |
| Core Modules | Platform engineers extending the runtime | The shared infrastructure primitives that shape behavior across every domain service |
| Console SDK Docs | Application developers integrating from TypeScript or JavaScript | Client initialization, resource methods, examples, and framework integrations |

Quick Start

Run the console locally:

```bash
npm install
cp .env.example .env.local
npm run dev
```

Or build and run it as a container:

```bash
docker build -t cognipeer-console .
docker run -p 3000:3000 --env-file .env.local cognipeer-console
```
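Once the dev server is up, you can exercise the gateway directly. The sketch below is a minimal smoke test, not an official client: it assumes the OpenAI-compatible surface mentioned in the API Reference mounts chat completions at `/v1/chat/completions`, and the model name and bearer-token auth are placeholders — check the API Reference for the exact paths and headers in your deployment.

```typescript
// Hypothetical smoke test against a locally running console.
// Assumptions (verify against the API Reference): endpoint path
// /v1/chat/completions, OpenAI-style request body, Bearer auth.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

// Builds an OpenAI-compatible chat request body.
function buildChatRequest(model: string, messages: ChatMessage[]) {
  return { model, messages, stream: false };
}

// Posts one chat request and returns the HTTP status code.
async function smokeTest(baseUrl: string, apiKey: string): Promise<number> {
  const res = await fetch(`${baseUrl}/v1/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify(
      buildChatRequest("my-model", [{ role: "user", content: "ping" }]),
    ),
  });
  return res.status;
}

// Example (with the Quick Start dev server running):
// smokeTest("http://localhost:3000", process.env.CONSOLE_API_KEY ?? "")
//   .then((status) => console.log(status));
```

A 200 here confirms the gateway, provider wiring, and tenant auth are all reachable end to end before you point real clients at it.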

Docs Map

  • Guide: setup, architecture, deployment, providers, and feature walkthroughs.
  • Core Modules: config, request context, cache, resilience, runtime pool, health, lifecycle, and CORS.
  • API Reference: gateway endpoints for chat, embeddings, agents, tools, tracing, vector, RAG, files, and health.
  • Using the SDK: where Console stops, where the SDK starts, and how teams should split responsibility.
  • Licensing: AGPL community terms, commercial options, and redistribution guidance.
  • Security: secret handling, hardening defaults, and vulnerability disclosure workflow.
  • Contributing: development rules, validation steps, and docs workflow notes.

Production Checklist

  • Confirm tenant identity, feature policy checks, and request context propagation are enforced on every new API surface.
  • Decide which providers, vector backends, and storage systems must be available in your target environment before onboarding teams.
  • Validate health, cache, resilience, and lifecycle behavior up front instead of treating them as optional infrastructure later.
  • Map your docs updates to the right guide or API page when new modules or endpoints are introduced.
  • Keep local docs verification in the release loop with `npm run docs:build`.
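The health-validation item above can be automated as a readiness gate in a deploy script. This is a sketch under assumptions: the health endpoint path (`/api/health` here) is hypothetical — substitute whatever path your console exposes per the API Reference.

```typescript
// Pre-deploy readiness probe with capped exponential backoff.
// The /api/health path is an assumption for illustration.

// One delay per retry attempt: 250ms, 500ms, 1000ms, ... capped at capMs.
function backoffDelaysMs(attempts: number, baseMs = 250, capMs = 4000): number[] {
  return Array.from({ length: attempts }, (_, i) =>
    Math.min(baseMs * 2 ** i, capMs),
  );
}

// Polls the health endpoint until it responds OK or attempts are exhausted.
async function waitForHealthy(baseUrl: string, attempts = 6): Promise<boolean> {
  for (const delay of backoffDelaysMs(attempts)) {
    try {
      const res = await fetch(`${baseUrl}/api/health`);
      if (res.ok) return true;
    } catch {
      // Server not accepting connections yet; fall through to the backoff sleep.
    }
    await new Promise((resolve) => setTimeout(resolve, delay));
  }
  return false;
}

// Example: gate the rest of a rollout on the console reporting healthy.
// waitForHealthy("http://localhost:3000").then((ok) => console.log(ok));
```

Gating rollouts on the probe keeps health, cache, and lifecycle checks in the loop from day one rather than bolting them on later.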

What This Site Covers

  • A platform-level view of Cognipeer Console as a multi-tenant AI control plane rather than a loose collection of feature modules.
  • The runtime contracts behind inference, tracing, vector, guardrails, files, prompts, and RAG workflows.
  • The shared core infrastructure that keeps request handling, logging, caching, resilience, and shutdown behavior predictable.
  • A docs shell aligned with the agent-sdk and chat-ui surfaces while keeping Cognipeer Console's own information architecture.

The community edition is licensed under AGPL-3.0. Commercial licensing and support are available separately.