OpenTelemetry sink

import OpenAI from "openai";
import { meter, multiSink } from "@amit641/llmmeter";
import { sqliteSink } from "@amit641/llmmeter/sqlite";
import { otelSink } from "llmmeter-otel";
import { trace } from "@opentelemetry/api";

const openai = new OpenAI();

// Wrap the client so every call is metered and fanned out to both sinks.
const ai = meter(openai, {
  sink: multiSink(
    sqliteSink({ filePath: "./.llmmeter/llmmeter.db" }),
    otelSink({ tracer: trace.getTracer("my-app") }),
  ),
});
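multiSink fans each usage record out to every sink it wraps, so local SQLite persistence and OTel span export happen side by side on every call.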

Emits spans that follow the OpenTelemetry GenAI semantic conventions, so your existing Jaeger / Tempo / Honeycomb / Datadog backend picks them up automatically.
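
The sink only emits through the tracer you hand it, so spans reach a backend once a tracer provider with an exporter is registered. A minimal Node setup sketch, assuming an OTLP-compatible collector on the default endpoint; the endpoint and service name here are illustrative, not part of llmmeter:

import { NodeSDK } from "@opentelemetry/sdk-node";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";

const sdk = new NodeSDK({
  serviceName: "my-app", // illustrative; use your own service name
  traceExporter: new OTLPTraceExporter({
    // This is the exporter's default; shown explicitly for clarity.
    url: "http://localhost:4318/v1/traces",
  }),
});

// Start the SDK before issuing metered calls so spans are exported.
sdk.start();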

Span attributes

Attribute                           Source
gen_ai.system                       record.provider
gen_ai.operation.name               record.operation
gen_ai.request.model                record.model
gen_ai.usage.input_tokens           record.tokens.input
gen_ai.usage.output_tokens          record.tokens.output
gen_ai.usage.cached_input_tokens    record.tokens.cachedInput (if present)
gen_ai.conversation.id              record.conversationId
enduser.id                          record.userId
llmmeter.cost_usd                   record.costUsd
llmmeter.feature                    record.feature
llmmeter.ttft_ms                    record.ttftMs
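
To make the mapping concrete, a single chat completion might surface in a trace viewer with attributes along these lines (every value below is hypothetical):

// Hypothetical snapshot of one span's attributes; all values are made up.
const exampleSpanAttributes = {
  "gen_ai.system": "openai",
  "gen_ai.operation.name": "chat",
  "gen_ai.request.model": "gpt-4o-mini",
  "gen_ai.usage.input_tokens": 412,
  "gen_ai.usage.output_tokens": 128,
  "gen_ai.conversation.id": "conv_abc123",
  "enduser.id": "user_42",
  "llmmeter.cost_usd": 0.00031,
  "llmmeter.feature": "support-chat",
  "llmmeter.ttft_ms": 240,
};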

Plus two histograms: gen_ai.client.token.usage and gen_ai.client.operation.duration.
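
The histograms go through the OpenTelemetry metrics API, so a meter provider has to be registered for them to surface. A minimal sketch, under the assumption that the sink records against the global meter provider (that routing is an assumption, not confirmed above):

import { metrics } from "@opentelemetry/api";
import {
  MeterProvider,
  PeriodicExportingMetricReader,
  ConsoleMetricExporter,
} from "@opentelemetry/sdk-metrics";

// Assumption: otelSink obtains its meter from the global provider,
// so registering one here makes the two histograms exportable.
const meterProvider = new MeterProvider({
  readers: [
    new PeriodicExportingMetricReader({
      // Swap ConsoleMetricExporter for an OTLP metric exporter in production.
      exporter: new ConsoleMetricExporter(),
      exportIntervalMillis: 10_000,
    }),
  ],
});

metrics.setGlobalMeterProvider(meterProvider);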