
Wrap your BedrockRuntimeClient with wrapBedrock() to trace all model invocations through the Bedrock API.

Installation

npm install @lumiqtrace/sdk @aws-sdk/client-bedrock-runtime

Setup

import { BedrockRuntimeClient } from "@aws-sdk/client-bedrock-runtime";
import { lumiqtrace } from "@lumiqtrace/sdk";

lumiqtrace.init({ apiKey: process.env.LUMIQTRACE_API_KEY! });

const bedrock = lumiqtrace.wrapBedrock(new BedrockRuntimeClient({ region: "us-east-1" }));

Example

import { InvokeModelCommand } from "@aws-sdk/client-bedrock-runtime";

const response = await bedrock.send(
  new InvokeModelCommand({
    modelId: "anthropic.claude-sonnet-4-6-v1",
    body: JSON.stringify({
      anthropic_version: "bedrock-2023-05-31",
      max_tokens: 512,
      messages: [{ role: "user", content: "Summarize observability in two sentences." }],
    }),
    contentType: "application/json",
  })
);
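Bedrock returns `response.body` as a `Uint8Array` of UTF-8 JSON, so you need to decode it before reading the model output. A minimal sketch, assuming the standard AWS SDK v3 response shape (the `parseInvokeModelBody` helper is our own name, not part of either SDK):

```typescript
// Hypothetical helper: decode an InvokeModel response body into JSON.
// Bedrock returns the body as a Uint8Array of UTF-8 encoded JSON.
function parseInvokeModelBody(body: Uint8Array): any {
  return JSON.parse(new TextDecoder().decode(body));
}

// Demonstration with a synthetic body shaped like an Anthropic response:
const fakeBody = new TextEncoder().encode(
  JSON.stringify({ content: [{ type: "text", text: "Observability is..." }] })
);
const parsed = parseInvokeModelBody(fakeBody);
console.log(parsed.content[0].text);
```

With a real response you would call `parseInvokeModelBody(response.body)` after `bedrock.send(...)` resolves.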

What gets captured

| Field         | Details                                              |
| ------------- | ---------------------------------------------------- |
| Model         | Bedrock model ID (e.g. `anthropic.claude-sonnet-4-6-v1`) |
| Input tokens  | Parsed from the model response body                  |
| Output tokens | Parsed from the model response body                  |
| Cost          | Calculated from token counts and Bedrock pricing     |
| Latency       | Total invocation duration                            |
Token extraction requires the model response body to include usage metadata. All Anthropic and Llama models on Bedrock return this. Check your model’s response schema if tokens show as zero.
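As an illustration of what "usage metadata" means here, a hedged sketch of reading token counts from a parsed Anthropic-style response body, which reports `usage.input_tokens` and `usage.output_tokens`. The `extractTokenUsage` helper is hypothetical (not part of the SDK) and falls back to zero when the fields are absent, matching the behavior described above:

```typescript
// Shape of the usage metadata in an Anthropic-style Bedrock response:
// { usage: { input_tokens: number, output_tokens: number }, ... }
interface TokenUsage {
  inputTokens: number;
  outputTokens: number;
}

// Hypothetical helper: pull token counts out of a parsed response body,
// defaulting to zero when the model returns no usage metadata.
function extractTokenUsage(body: any): TokenUsage {
  return {
    inputTokens: body?.usage?.input_tokens ?? 0,
    outputTokens: body?.usage?.output_tokens ?? 0,
  };
}

// A body without usage metadata yields zeros, which is the symptom
// described above for models that omit this field.
console.log(extractTokenUsage({}));
```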