

Wrap your Mistral client with wrapMistral() to trace chat and completion calls.

Installation

npm install @lumiqtrace/sdk @mistralai/mistralai

Setup

import { Mistral } from "@mistralai/mistralai";
import { lumiqtrace } from "@lumiqtrace/sdk";

lumiqtrace.init({ apiKey: process.env.LUMIQTRACE_API_KEY! });

const mistral = lumiqtrace.wrapMistral(new Mistral({ apiKey: process.env.MISTRAL_API_KEY! }));
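Conceptually, a tracing wrapper like this intercepts each method call, records timing, and forwards the call to the underlying client unchanged. The sketch below is purely illustrative (a toy JavaScript Proxy, not lumiqtrace's actual implementation) and shows the general shape of that interception:

```typescript
// Illustrative sketch only -- NOT lumiqtrace's real implementation.
type Trace = { method: string; latencyMs: number };

const traces: Trace[] = [];

// Wrap an object so every method call records its name and duration,
// then forwards to the original method untouched.
function wrapWithTracing<T extends object>(client: T): T {
  return new Proxy(client, {
    get(target, prop, receiver) {
      const value = Reflect.get(target, prop, receiver);
      if (typeof value !== "function") return value;
      return async (...args: unknown[]) => {
        const start = Date.now();
        try {
          return await value.apply(target, args);
        } finally {
          traces.push({ method: String(prop), latencyMs: Date.now() - start });
        }
      };
    },
  });
}

// Usage with a stub client (stands in for the real Mistral SDK):
const stub = { complete: async () => ({ ok: true }) };
const traced = wrapWithTracing(stub);
void (async () => {
  await traced.complete();
  console.log(traces[0].method); // "complete"
})();
```

The wrapped object keeps the original client's surface, which is why the example below can call `mistral.chat.complete` exactly as with an unwrapped client.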

Example

const response = await mistral.chat.complete({
  model: "mistral-large-latest",
  messages: [{ role: "user", content: "Explain embeddings briefly." }],
});

What gets captured

Field           Details
Model           mistral-large-latest, mistral-small-latest, codestral-latest, etc.
Input tokens    From usage.prompt_tokens
Output tokens   From usage.completion_tokens
Cost            Calculated from token counts and Mistral pricing
Latency         Total request duration
Finish reason   stop, length, tool_calls
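As a rough illustration of how a cost figure can be derived from token counts, here is a minimal sketch. The per-million-token prices are placeholders for the example, not Mistral's actual rates, and the pricing table format is invented for illustration:

```typescript
// Placeholder per-million-token prices -- NOT Mistral's actual rates.
const PRICING_USD_PER_M: Record<string, { input: number; output: number }> = {
  "mistral-large-latest": { input: 2.0, output: 6.0 },
};

// Cost = input tokens * input rate + output tokens * output rate,
// with rates expressed per one million tokens.
function estimateCostUsd(
  model: string,
  promptTokens: number,
  completionTokens: number,
): number {
  const p = PRICING_USD_PER_M[model];
  if (!p) throw new Error(`No pricing entry for model: ${model}`);
  return (promptTokens * p.input + completionTokens * p.output) / 1_000_000;
}

console.log(estimateCostUsd("mistral-large-latest", 1000, 500)); // 0.005
```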