

OpenRouter provides a unified API for 200+ models. Wrap your OpenRouter client with wrapOpenRouter() to trace every call, regardless of which underlying model is used.

Installation

npm install @lumiqtrace/sdk openai
OpenRouter exposes an OpenAI-compatible API, so the standard openai package serves as the client.

Setup

import OpenAI from "openai";
import { lumiqtrace } from "@lumiqtrace/sdk";

lumiqtrace.init({ apiKey: process.env.LUMIQTRACE_API_KEY! });

const openrouter = lumiqtrace.wrapOpenRouter(
  new OpenAI({
    baseURL: "https://openrouter.ai/api/v1",
    apiKey: process.env.OPENROUTER_API_KEY!,
  })
);

Example

const response = await openrouter.chat.completions.create({
  model: "anthropic/claude-sonnet-4-6",
  messages: [{ role: "user", content: "What model are you?" }],
});
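The wrapped client returns a standard OpenAI-compatible response, so the same usage fields LumiqTrace records are also available in your code. A minimal sketch of reading them, where the `Usage` interface mirrors the OpenAI chat completions usage schema and `summarizeUsage` is a hypothetical helper, not part of the SDK:

```typescript
// Shape of the usage block on an OpenAI-compatible chat completion response.
interface Usage {
  prompt_tokens: number;
  completion_tokens: number;
  total_tokens: number;
}

// Format a one-line summary of token usage (pure helper, no network calls).
function summarizeUsage(usage: Usage): string {
  return `in=${usage.prompt_tokens} out=${usage.completion_tokens} total=${usage.total_tokens}`;
}

// In real code this would be `response.usage` from the call above.
console.log(summarizeUsage({ prompt_tokens: 12, completion_tokens: 34, total_tokens: 46 }));
```

In practice you would pass `response.usage` from the completed request; the hard-coded object here only stands in for it.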

What gets captured

| Field | Details |
| --- | --- |
| Model | The OpenRouter model identifier (e.g. `anthropic/claude-sonnet-4-6`) |
| Input tokens | From OpenRouter's `usage` response field |
| Output tokens | From OpenRouter's `usage` response field |
| Cost | From OpenRouter's reported cost, or calculated from tokens |
| Latency | Total request duration |
Cost reporting accuracy depends on whether OpenRouter returns cost data in the response. For most popular models it does; for custom or fine-tuned routes, the SDK falls back to a token-based calculation.
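The token-based fallback can be sketched as below. The price table and the `estimateCost` helper are illustrative assumptions, not LumiqTrace's actual implementation; real per-token prices come from OpenRouter's model catalog.

```typescript
// Hypothetical USD prices per million tokens. Real values would be looked up
// from OpenRouter's model catalog; these numbers are placeholders.
const PRICES: Record<string, { inputPerM: number; outputPerM: number }> = {
  "anthropic/claude-sonnet-4-6": { inputPerM: 3, outputPerM: 15 },
};

// Estimate cost from token counts when the response carries no cost field.
// Returns null for models with no known pricing.
function estimateCost(
  model: string,
  inputTokens: number,
  outputTokens: number
): number | null {
  const p = PRICES[model];
  if (!p) return null;
  return (inputTokens / 1e6) * p.inputPerM + (outputTokens / 1e6) * p.outputPerM;
}

// Cost in USD for a 1,000-token prompt and 500-token completion.
console.log(estimateCost("anthropic/claude-sonnet-4-6", 1000, 500));
```

The design choice to return `null` rather than `0` for unknown models keeps "no pricing data" distinguishable from "free", which matters when aggregating costs across traces.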