Documentation Index
Fetch the complete documentation index at: https://docs.lumiqtrace.com/llms.txt
Use this file to discover all available pages before exploring further.
SDK v1.5.0
Released: April 2026

New:
- `wrapBedrock()` wrapper for the AWS Bedrock `converse` API
- `wrapMistral()` wrapper for Mistral AI
- `wrapGroq()` wrapper for Groq
- `wrapLiteLLM()` helper for the LiteLLM proxy
- `disableConfigSync` option to opt out of remote config propagation
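The new provider wrappers follow the same pattern as the existing ones: proxy the client and record each model call. A minimal sketch of that pattern with a stubbed client — every name below is illustrative, not LumiqTrace's actual internals:

```python
# Illustrative proxy pattern for a provider wrapper; not SDK internals.
import functools

def wrap_client(client, on_call):
    """Return a proxy whose callable attributes report each call to on_call."""
    class _Proxy:
        def __getattr__(self, name):
            attr = getattr(client, name)
            if not callable(attr):
                return attr
            @functools.wraps(attr)
            def traced(*args, **kwargs):
                result = attr(*args, **kwargs)
                on_call(name, kwargs, result)  # a real wrapper would emit a span here
                return result
            return traced
    return _Proxy()

class FakeBedrockClient:
    """Stub standing in for a Bedrock client's converse API."""
    def converse(self, *, modelId, messages):
        return {"output": "ok", "modelId": modelId}

events = []
client = wrap_client(FakeBedrockClient(),
                     lambda name, kw, res: events.append((name, kw.get("modelId"))))
resp = client.converse(modelId="anthropic.claude-3", messages=[])
# events now holds one recorded call; resp is passed through unchanged
```

The point of the proxy shape is that the wrapped client's surface stays identical, so call sites do not change when tracing is added.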
Improved:
- `wrapGoogle()` now captures `cachedContentTokenCount` and `thoughtsTokenCount` from Gemini 2.5 responses
- `LumiqtraceCallbackHandler` Python handler now supports async LangChain chains natively
- Flush latency reduced by ~30% via HTTP/2 keep-alive reuse
Fixed:
- Fixed a race condition in the batch engine that could cause duplicate events on process exit under high load
- Fixed TTFT not being captured for Anthropic streaming responses when `stream_options` was not set
- Fixed the Python `with_agent` context manager not propagating to nested sync functions
SDK v1.4.2
Released: March 2026

Bug fixes:
- Fixed `cost_usd` being computed as `NaN` for models with zero-cost cached tokens when `cached_tokens` was `undefined`
- Fixed `wrapADKRunner` not capturing tool spans when the ADK tool result was an empty string
- Fixed TypeScript types for `withLumiqtraceContext` not accepting a `Promise<void>` return type
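The `cost_usd` fix amounts to defaulting a missing cached-token count to zero before pricing, so the arithmetic never produces `NaN`. A hypothetical sketch of that guard (field names and prices are illustrative, not the SDK's pricing logic):

```python
# Illustrative cost computation with the missing-field guard the fix implies.
def cost_usd(usage, price_per_token, cached_price_per_token):
    # Treat a missing/None cached_tokens as 0 rather than letting it poison the math.
    cached = usage.get("cached_tokens") or 0
    fresh = usage["input_tokens"] - cached
    return fresh * price_per_token + cached * cached_price_per_token

# cached_tokens absent entirely, cached price zero: still a finite number (≈ 0.003)
cost = cost_usd({"input_tokens": 1000}, 3e-6, 0.0)
```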
SDK v1.4.0
Released: February 2026

New:
- Config propagation — SDK now reads `cv` from ingest responses and fetches updated config from `/v1/sdk/config` automatically
- `replaceDefaultRedactKeys` option for PII redaction
- `PromptClient.clearCache(name?)` — clear cache for a specific prompt or all prompts
- Google ADK: `instrumentADK()` convenience function wraps agent + runner in one call
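The config-propagation flow described above reduces to a version check: compare the `cv` value on each ingest response and re-fetch `/v1/sdk/config` only when it changes. Everything below except `cv` and the endpoint path is an illustrative stub:

```python
# Sketch of the cv-based config handshake; transport is stubbed out.
class ConfigSync:
    def __init__(self, fetch_config):
        self._fetch = fetch_config  # would perform GET /v1/sdk/config
        self.version = None
        self.config = {}

    def on_ingest_response(self, response):
        cv = response.get("cv")
        if cv is not None and cv != self.version:
            self.config = self._fetch()  # only fetch when the version moves
            self.version = cv

fetches = []
def fake_fetch():
    fetches.append(1)
    return {"sample_rate": 0.5}

sync = ConfigSync(fake_fetch)
sync.on_ingest_response({"cv": "v2"})  # version changed -> one fetch
sync.on_ingest_response({"cv": "v2"})  # unchanged -> no extra fetch
```

Piggybacking the version on ingest responses means no separate polling loop is needed: config freshness rides on traffic the SDK is already sending.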
Improved:
- `wrapGoogle()` now supports the new Gemini 2.5 `thoughtsTokenCount` field for reasoning token tracking
- `LumiqtraceCallbackHandler` for LangChain now automatically propagates `userId` and `sessionId` to all nested LLM spans
- Python SDK: `patch_openai()` now patches `AsyncOpenAI` streaming responses for TTFT capture
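TTFT capture around an async stream reduces to timestamping the first chunk. A minimal sketch of that idea with a stubbed stream (this is not the `patch_openai()` implementation):

```python
# Sketch of time-to-first-token measurement over an async stream.
import asyncio
import time

async def capture_ttft(stream):
    start = time.monotonic()
    ttft = None
    chunks = []
    async for chunk in stream:
        if ttft is None:
            ttft = time.monotonic() - start  # first token has arrived
        chunks.append(chunk)
    return ttft, chunks

async def fake_stream():
    """Stub standing in for a streaming completion."""
    for token in ("Hel", "lo"):
        await asyncio.sleep(0)
        yield token

ttft, chunks = asyncio.run(capture_ttft(fake_stream()))
# ttft is a small non-negative float; chunks == ["Hel", "lo"]
```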
SDK v1.3.0
Released: January 2026

New:
- `wrapOpenRouter()` — OpenRouter wrapper using the OpenAI SDK interface
- `initOTel()` — standalone OpenTelemetry setup function
- `enableOTel` and `otelEndpoint` options in `init()` for dual-export to OTEL + LumiqTrace
- Python: `@lumiqtrace.trace` decorator for sync and async functions

Improved:
- `withAgent` now accepts an optional `agentId` parameter for stable agent identity across runs
- Batch engine now uses gzip compression by default (was opt-in), reducing payload size by ~75%
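A decorator that supports both sync and async functions, as `@lumiqtrace.trace` does, has to branch on whether the target is a coroutine function. A stubbed sketch of that shape (span recording replaced by a plain list; not the SDK's code):

```python
# Sketch of a sync/async-aware tracing decorator; recording is stubbed.
import asyncio
import functools
import inspect

spans = []

def trace(fn):
    if inspect.iscoroutinefunction(fn):
        @functools.wraps(fn)
        async def async_wrapper(*args, **kwargs):
            spans.append(fn.__name__)  # a real decorator would open a span here
            return await fn(*args, **kwargs)
        return async_wrapper

    @functools.wraps(fn)
    def sync_wrapper(*args, **kwargs):
        spans.append(fn.__name__)
        return fn(*args, **kwargs)
    return sync_wrapper

@trace
def add(a, b):
    return a + b

@trace
async def mul(a, b):
    return a * b

result_sync = add(2, 3)                 # -> 5
result_async = asyncio.run(mul(2, 3))   # -> 6
# spans == ["add", "mul"]
```

Without the `iscoroutinefunction` branch, wrapping an async function in a sync wrapper would return an un-awaited coroutine and break callers.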
Platform v2.3.0
Released: April 2026

New features:
- LumiqPilot config propagation — Pilot can now push SDK config changes (model overrides, sample rate, guardrail toggles) to running instances in real time
- Incident auto-remediation — Scale plan: configure remediation rules that Pilot applies automatically when an incident opens
- Simulation batch runs — compare multiple prompt versions in parallel against the same dataset
- Session view — new dashboard page grouping LLM calls by `session_id`

Improved:
- Anomaly detection now uses a 14-day rolling baseline (was 7 days) for fewer false positives
- Cost Optimizer refresh cadence: Scale plan moved from daily to near-real-time (sub-1-hour)
- Trace flame graph now renders `guardrail` spans inline with their parent LLM span

Fixed:
- Fixed the alert history table not loading for projects with >10K alert events
- Fixed `POST /v1/otel` accepting spans without `gen_ai.*` attributes but dropping them silently — they now appear in traces as generic spans
Platform v2.2.0
Released: February 2026

New features:
- Evaluations dashboard — LLM-as-judge evaluator definitions, auto-run mode, trend charts, per-trace scores
- Simulations — dataset management, scenario configuration, batch runs with side-by-side comparison
- Guardrails dashboard — execution history, per-policy trigger frequency, AI-powered guardrail types (toxicity, prompt injection, custom judge)
- Prompt management dashboard — version history, label management, protected labels, dependency graph

Improved:
- NLQ result summaries now use streaming for sub-second time-to-first-word
- Dashboard overview page loads 2× faster via improved ClickHouse query plans
SDK v1.2.0
Released: December 2025

New:
- `GuardrailBlockedError` class with `phase` and `results` properties
- Guardrails support on `wrapAnthropic`, `wrapGoogle`, and `LumiqtraceCallbackHandler`
- `PromptClient` — fetch, compile, create, and manage versioned prompts at runtime
- Python: `PromptClient` with the same API as the TypeScript version
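Handling a guardrail block presumably means catching `GuardrailBlockedError` and inspecting its `phase` and `results`. The stub below is inferred from the property names in these notes, not taken from the SDK; the payload shape inside `results` is entirely hypothetical:

```python
# Hypothetical stand-in for the SDK's GuardrailBlockedError; shape inferred
# from the release notes (phase, results), payload format invented here.
class GuardrailBlockedError(Exception):
    def __init__(self, phase, results):
        super().__init__(f"guardrail blocked during {phase}")
        self.phase = phase      # e.g. "input" vs "output" evaluation
        self.results = results  # per-policy outcomes

def call_model(prompt):
    """Stub model call that trips an input guardrail on sensitive prompts."""
    if "password" in prompt:
        raise GuardrailBlockedError("input", [{"policy": "pii", "passed": False}])
    return "ok"

try:
    call_model("what is my password?")
except GuardrailBlockedError as err:
    blocked_phase = err.phase
    failing = [r["policy"] for r in err.results if not r["passed"]]
# blocked_phase == "input"; failing == ["pii"]
```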
Breaking change:
- `wrapOpenAI(client, options)` second argument changed from `WrapOptions | undefined` to a full options object. If you were passing `undefined`, no change is needed.
SDK v1.0.0
Released: October 2025

Initial public release:
- `initLumiqtrace()` / `lumiqtrace.init()` singleton initialization
- `wrapOpenAI()`, `wrapAnthropic()`, `wrapGoogle()` provider wrappers
- `withLumiqtraceContext()` — async context propagation
- `withAgent()` — multi-agent tracing with plan, tool, and handoff spans
- `startSpan()` — manual span API with eval score attachment
- `LumiqtraceCallbackHandler` for LangChain (TypeScript + Python)
- `wrapADKAgent()` and `wrapADKRunner()` for Google ADK
- PII redaction with a configurable key list
- Batch engine with gzip NDJSON transport
- Python SDK parity: all TypeScript features available in Python