Protected properties: account, app (Optional), environment, helicone (Readonly), nvm, return, version (Optional).

Usage-calculation helpers (each accepts optional completion_ and prompt_ parameters):
- Helper function to calculate usage for dummy song operations
- Helper function to calculate usage for image operations based on pixels
- Helper function to calculate usage for song operations based on tokens/quota
- Helper function to calculate usage for video operations (typically 1 token)

Additional helpers:
- Creates a standardized Helicone payload for API logging
- Creates a standardized Helicone response for API logging
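To make the usage helpers above concrete: what they ultimately produce is a token-style usage record that the Helicone payload can carry. The shape and the pixel ratio below are assumptions for illustration, modeled on the common prompt/completion token convention, not the library's actual types.

// Hypothetical shape of the usage record the helpers compute; field names are assumptions.
interface UsageSketch {
  prompt_tokens: number;
  completion_tokens: number;
  total_tokens: number;
}

// Example: usage for an image operation derived from pixel count,
// using an assumed pixels-per-token ratio purely for illustration.
function imageUsageSketch(widthPx: number, heightPx: number, pixelsPerToken = 750): UsageSketch {
  const completion = Math.ceil((widthPx * heightPx) / pixelsPerToken);
  return { prompt_tokens: 0, completion_tokens: completion, total_tokens: completion };
}

// Example: video operations are described above as typically costing 1 token.
const videoUsage: UsageSketch = { prompt_tokens: 0, completion_tokens: 1, total_tokens: 1 };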
Protected methods: get, parse.

withAsyncLogger: Creates an async logger with Nevermined logging enabled and automatic property injection.
This method wraps the OpenTelemetry SpanProcessor to automatically add all Helicone properties to every span. This mimics Python's Traceloop.set_association_properties() - no wrapping of individual LLM calls needed!
Parameters:
- AI SDK modules to instrument (OpenAI, Anthropic, etc.)
- The agent request for logging purposes
- customProperties (Optional, type CustomProperties): Custom properties to add as Helicone headers

Returns: The async logger instance with an init() method.
Example:
import OpenAI from 'openai';
const logger = observability.withAsyncLogger({ openAI: OpenAI }, agentRequest);
logger.init();
// Make LLM calls normally - properties are automatically added to all spans!
const openai = new OpenAI({ apiKey });
const result = await openai.chat.completions.create({ ... });
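As a rough illustration of the property-injection mechanism described above, and not the library's actual implementation, a wrapping OpenTelemetry SpanProcessor can stamp a fixed set of attributes onto every span at creation time. The class name HeliconeAttributeProcessor and the attribute keys are assumptions made for this sketch.

import { Context } from '@opentelemetry/api';
import { Span, ReadableSpan, SpanProcessor } from '@opentelemetry/sdk-trace-base';

// Hypothetical sketch: a SpanProcessor decorator that copies the same set of
// Helicone-style properties onto every span before delegating to the wrapped
// processor. Names and keys are illustrative, not the library's API.
class HeliconeAttributeProcessor implements SpanProcessor {
  constructor(
    private readonly inner: SpanProcessor,
    private readonly properties: Record<string, string>,
  ) {}

  onStart(span: Span, parentContext: Context): void {
    // Inject the association properties into every span automatically,
    // so individual LLM calls do not need to be wrapped by hand.
    for (const [key, value] of Object.entries(this.properties)) {
      span.setAttribute(key, value);
    }
    this.inner.onStart(span, parentContext);
  }

  onEnd(span: ReadableSpan): void {
    this.inner.onEnd(span);
  }

  forceFlush(): Promise<void> {
    return this.inner.forceFlush();
  }

  shutdown(): Promise<void> {
    return this.inner.shutdown();
  }
}

Registering such a processor on the tracer provider gives the same effect as Python's Traceloop.set_association_properties(): every span emitted afterwards carries the properties.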
Wraps an async operation with Helicone logging.

Parameters:
- Name of the agent for logging purposes
- Configuration for the Helicone payload
- The async operation to execute (returns an internal result with extra data)
- Function to extract the user-facing result from the internal result
- Function to calculate usage metrics from the internal result (optional completion_ and prompt_ fields)
- Prefix for the response ID
- The agent request for logging purposes
- Custom properties to add as Helicone headers (should include agentid and sessionid)

Returns: Promise that resolves to the extracted user result.
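The wrapper's flow can be pictured roughly as follows. This is a hypothetical standalone sketch of the pattern (run the operation, extract the user-facing result, compute usage, then log), not the library's actual method or signature; every name in it is illustrative.

// Hypothetical sketch of the wrap-and-log pattern; names are illustrative only.
async function wrapWithLogging<TInternal, TResult>(
  operation: () => Promise<TInternal>,
  extractResult: (internal: TInternal) => TResult,
  calculateUsage: (internal: TInternal) => { prompt_tokens?: number; completion_tokens?: number },
  log: (entry: { result: TResult; usage: object; durationMs: number }) => Promise<void>,
): Promise<TResult> {
  const startedAt = Date.now();
  const internal = await operation();      // run the wrapped async operation
  const result = extractResult(internal);  // user-facing part of the internal result
  const usage = calculateUsage(internal);  // usage metrics for the logging payload
  await log({ result, usage, durationMs: Date.now() - startedAt });
  return result;                           // the caller only sees the extracted result
}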
withLangchain: Creates a ChatOpenAI configuration with logging enabled.
Usage: const llm = new ChatOpenAI(observability.withLangchain("gpt-4o-mini", apiKey, agentRequest, customProperties));

Parameters:
- The OpenAI model to use (e.g., "gpt-4o-mini", "gpt-4")
- The OpenAI API key
- Custom properties to add as Helicone headers (should include agentid and sessionid)

Returns: Configuration object for the ChatOpenAI constructor with logging enabled.
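Extending the usage line above into a slightly fuller snippet: the withLangchain call comes from this documentation, while the import path and invoke call follow the LangChain JS packages, and agentRequest and customProperties are assumed to already be in scope.

import { ChatOpenAI } from '@langchain/openai';

// Pass the logging-enabled configuration straight to the constructor.
const llm = new ChatOpenAI(
  observability.withLangchain('gpt-4o-mini', apiKey, agentRequest, customProperties),
);
const reply = await llm.invoke('Summarize the agent request in one sentence.');
console.log(reply.content);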
withOpenAI: Creates an OpenAI client configuration with logging enabled.
Usage: const openai = new OpenAI(observability.withOpenAI(apiKey, heliconeApiKey, agentRequest, customProperties));

Parameters:
- The OpenAI API key
- The agent request for logging purposes
- Custom properties to add as Helicone headers (should include agentid and sessionid)

Returns: Configuration object for the OpenAI constructor with logging enabled.
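Likewise, a fuller version of the usage line above: only the withOpenAI call itself comes from this documentation, the rest is the standard OpenAI Node SDK, with apiKey, heliconeApiKey, agentRequest, and customProperties assumed to be in scope.

import OpenAI from 'openai';

// The returned configuration is passed directly to the OpenAI client constructor.
const openai = new OpenAI(
  observability.withOpenAI(apiKey, heliconeApiKey, agentRequest, customProperties),
);
const completion = await openai.chat.completions.create({
  model: 'gpt-4o-mini',
  messages: [{ role: 'user', content: 'Hello from an instrumented agent' }],
});
console.log(completion.choices[0].message.content);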
get (Static): This method is used to create a singleton instance of the ObservabilityAPI class.

Parameters:
- The options to initialize the payments class.

Returns: The instance of the ObservabilityAPI class.
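The singleton behaviour described here typically looks something like the sketch below; the getInstance name, the options shape, and the class layout are assumptions, since the documentation above only states that a single shared instance is created from the initialization options.

// Hypothetical sketch of a singleton accessor; names and the options shape are illustrative.
interface ObservabilityOptions {
  nvmApiKey: string;
  environment: string;
}

class ObservabilityAPISketch {
  private static instance: ObservabilityAPISketch | undefined;

  private constructor(private readonly options: ObservabilityOptions) {}

  // Reuse the existing instance if one was already created from earlier options.
  static getInstance(options: ObservabilityOptions): ObservabilityAPISketch {
    if (!ObservabilityAPISketch.instance) {
      ObservabilityAPISketch.instance = new ObservabilityAPISketch(options);
    }
    return ObservabilityAPISketch.instance;
  }
}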
The ObservabilityAPI class provides methods to wrap API calls with Helicone logging.