Complete reference for all public classes, methods, and functions in the Infinium Python SDK.


Clients

InfiniumClient

Synchronous client for the Infinium API.

from infinium import InfiniumClient

Constructor

InfiniumClient(
    agent_id: str,
    agent_secret: str,
    base_url: str = "https://platform.i42m.ai/api/v1",
    timeout: float = 30.0,
    max_retries: int = 3,
    enable_rate_limiting: bool = True,
    requests_per_second: float = 10.0,
    user_agent: str = "infinium-python/1.0.0",
    enable_logging: bool = False,
    log_level: str = "INFO",
    verify_ssl: bool = True,
)

Methods

close() -> None

Close the HTTP client and release resources.


send_task(name, description, duration, current_datetime=None, **kwargs) -> ApiResponse

Send a trace with simple parameters.

| Parameter | Type | Required | Description |
|---|---|---|---|
| name | str | Yes | Task name (max 500 chars) |
| description | str | Yes | Task description (max 10,000 chars) |
| duration | float | Yes | Duration in seconds (0-86,400) |
| current_datetime | str | No | ISO 8601 timestamp (auto-generated if omitted) |
| **kwargs | | No | Any field from TaskData (e.g., llm_usage, steps, customer) |

Returns: ApiResponse


send_task_data(task_data: TaskData) -> ApiResponse

Send a complete TaskData object.

| Parameter | Type | Required | Description |
|---|---|---|---|
| task_data | TaskData | Yes | The trace data to send |

Returns: ApiResponse


send_tasks_batch(tasks, max_concurrent=5) -> BatchResult

Send multiple tasks in chunks.

| Parameter | Type | Default | Description |
|---|---|---|---|
| tasks | list[TaskData] | required | Tasks to send |
| max_concurrent | int | 5 | Chunk size |

Returns: BatchResult


get_interpreted_task_result(task_id: str) -> ApiResponse

Get Maestro’s interpretation for a trace.

| Parameter | Type | Description |
|---|---|---|
| task_id | str | Trace ID (UUID) |

Returns: ApiResponse


wait_for_interpretation(trace_id, timeout=120.0, poll_interval=3.0) -> ApiResponse

Poll until Maestro finishes interpreting a trace.

| Parameter | Type | Default | Description |
|---|---|---|---|
| trace_id | str | required | Trace ID (UUID) |
| timeout | float | 120.0 | Max seconds to wait |
| poll_interval | float | 3.0 | Seconds between polls |

Returns: ApiResponse


get_prompt(prompt_id, prompt_key, version="latest", variables=None) -> PromptContent

Fetch a prompt from Prompt Studio.

| Parameter | Type | Default | Description |
|---|---|---|---|
| prompt_id | str | required | Prompt UUID |
| prompt_key | str | required | Prompt secret key |
| version | str or int | "latest" | Version number or "latest" |
| variables | dict[str, str] | None | Template variables |

Returns: PromptContent


trace(name, *, auto_send=True, description=None)

Decorator that auto-detects sync/async and traces the function.

| Parameter | Type | Default | Description |
|---|---|---|---|
| name | str | required | Trace name |
| auto_send | bool | True | Send trace on completion |
| description | str | None | Optional description |

Returns: Decorator


get_current_iso_datetime() -> str (static method)

Returns current UTC datetime as ISO 8601 string.


create_task_data(name, description, duration, current_datetime=None, **sections) -> TaskData (static method)

Create a validated TaskData object. Raw dicts are auto-coerced to dataclasses.

| Parameter | Type | Required | Description |
|---|---|---|---|
| name | str | Yes | Task name |
| description | str | Yes | Task description |
| duration | float | Yes | Duration in seconds |
| current_datetime | str | No | ISO timestamp |
| **sections | | No | Any TaskData field |

Returns: TaskData


AsyncInfiniumClient

Asynchronous client for the Infinium API. Has the same constructor and methods as InfiniumClient, but all methods are async.

from infinium import AsyncInfiniumClient

All methods have the same signatures as InfiniumClient, but are coroutines:

response = await client.send_task(...)
response = await client.send_task_data(...)
result = await client.send_tasks_batch(tasks, max_concurrent=5, fail_fast=False)
response = await client.get_interpreted_task_result(...)
response = await client.wait_for_interpretation(...)
prompt = await client.get_prompt(...)
await client.close()

Additional parameter on send_tasks_batch():

| Parameter | Type | Default | Description |
|---|---|---|---|
| fail_fast | bool | False | Stop on first error |

Context manager: Use async with for automatic cleanup.


Tracing

TraceBuilder

Incrementally construct a TaskData during agent execution.

from infinium import TraceBuilder

Constructor

TraceBuilder(name: str, description: str)

Methods

step(action: str, description: str) -> StepContext

Create an auto-timed, auto-numbered step. Use as a context manager.


set_input_summary(summary: str) -> TraceBuilder

Set the input summary.


set_output_summary(summary: str) -> TraceBuilder

Set the output summary.


set_expected_outcome(task_objective, *, required_deliverables=None, constraints=None, acceptance_criteria=None) -> TraceBuilder

Set the expected outcome for Maestro evaluation.

| Parameter | Type | Default | Description |
|---|---|---|---|
| task_objective | str | required | The goal |
| required_deliverables | list[str] | None | What to produce |
| constraints | list[str] | None | Rules to follow |
| acceptance_criteria | list[str] | None | Success criteria |

set_environment(*, framework=None, framework_version=None, python_version=None, runtime=None, region=None, custom_tags=None) -> TraceBuilder

Set runtime environment metadata.


set_llm_usage(llm_usage: LlmUsage) -> TraceBuilder

Set aggregate LLM usage statistics.


set_section(name: str, value: Any) -> TraceBuilder

Set a domain section by name (e.g., "customer", "research").


add_error(error: ErrorDetail) -> TraceBuilder

Record an error.


build(_trace_ctx=None) -> TaskData

Build the final TaskData. Auto-computes duration from wall clock. If _trace_ctx is provided, incorporates auto-captured LLM calls and prompt fetches.


All setter methods return self for fluent chaining:

trace.set_input_summary("...").set_expected_outcome(...).set_environment(...)

StepContext

Context manager for a single execution step. Created by TraceBuilder.step().

from infinium import StepContext

Constructor

StepContext(step_number: int, action: str, description: str)

Methods

set_input(preview: str) -> StepContext

Set input preview (truncated to 500 chars).


set_output(preview: str) -> StepContext

Set output preview (truncated to 500 chars).


record_tool_call(tool_name, *, duration_ms=None, input_summary=None, output_summary=None, http_status=None, error=None) -> StepContext

Record a tool invocation.

| Parameter | Type | Default | Description |
|---|---|---|---|
| tool_name | str | required | Name of the tool |
| duration_ms | int | None | Call duration in ms |
| input_summary | str | None | Input summary |
| output_summary | str | None | Output summary |
| http_status | int | None | HTTP status code |
| error | ErrorDetail | None | Error details |

record_llm_call(model, *, provider=None, prompt_tokens=None, completion_tokens=None, latency_ms=None, temperature=None, purpose=None) -> StepContext

Record an LLM call.

| Parameter | Type | Default | Description |
|---|---|---|---|
| model | str | required | Model identifier |
| provider | str | None | Provider name |
| prompt_tokens | int | None | Input tokens |
| completion_tokens | int | None | Output tokens |
| latency_ms | int | None | Call latency in ms |
| temperature | float | None | Temperature |
| purpose | str | None | Purpose of this call |

set_metadata(metadata: dict) -> StepContext

Attach arbitrary metadata to the step.


All methods return self for chaining. When used as a context manager, timing is automatic and unhandled exceptions are captured as ErrorDetail.


Decorators

trace_agent

Sync decorator for auto-instrumentation.

from infinium import trace_agent

@trace_agent(name: str, client: InfiniumClient = None, *, auto_send: bool = True, description: str = None)
def my_function(...):
    ...

async_trace_agent

Async decorator for auto-instrumentation.

from infinium import async_trace_agent

@async_trace_agent(name: str, client: AsyncInfiniumClient = None, *, auto_send: bool = True, description: str = None)
async def my_function(...):
    ...

Integrations

watch()

Auto-detect and monkey-patch a single LLM client instance.

from infinium.integrations import watch

patched_client = watch(client: T, capture_content: bool = False) -> T

| Parameter | Type | Default | Description |
|---|---|---|---|
| client | LLM client | required | OpenAI, Anthropic, Gemini, or xAI client |
| capture_content | bool | False | Capture input/output previews |

Returns: The same client object (patched in place).

Supported clients: openai.OpenAI, openai.AsyncOpenAI, anthropic.Anthropic, anthropic.AsyncAnthropic, google.generativeai.GenerativeModel

OpenAIInstrumentor

Class-level instrumentation for OpenAI clients.

from infinium.integrations import OpenAIInstrumentor

instrumentor = OpenAIInstrumentor(capture_content: bool = False)
instrumentor.instrument()    # Patch at class level
instrumentor.uninstrument()  # Restore originals

AnthropicInstrumentor

Class-level instrumentation for Anthropic clients.

from infinium.integrations import AnthropicInstrumentor

instrumentor = AnthropicInstrumentor(capture_content: bool = False)
instrumentor.instrument()
instrumentor.uninstrument()

GoogleInstrumentor

Class-level instrumentation for Google Gemini models.

from infinium.integrations import GoogleInstrumentor

instrumentor = GoogleInstrumentor(capture_content: bool = False)
instrumentor.instrument()
instrumentor.uninstrument()

InfiniumOTelExporter

Optional OpenTelemetry span exporter.

from infinium.integrations.otel import InfiniumOTelExporter

exporter = InfiniumOTelExporter(
    service_name: str = "infinium-agent",
    tracer_name: str = "infinium",
)

exporter.export_trace(
    trace_name: str,
    duration_s: float,
    ctx: TraceContext = None,
)

Context Management

TraceContext

Mutable trace context stored in a ContextVar for the current execution context.

from infinium.integrations._context import TraceContext

| Field | Type | Description |
|---|---|---|
| calls | list[CapturedLlmCall] | Auto-captured LLM calls |
| prompts | list[CapturedPromptFetch] | Auto-captured prompt fetches |

get_current_trace_context() -> Optional[TraceContext]

Return the active trace context, or None if outside a trace.

set_trace_context(ctx) -> contextvars.Token

Set the active trace context. Returns a reset token for restoring the previous context.


Utility Functions

Validation

from infinium import (
    validate_path_id,
    validate_string_length,
    validate_payload_size,
    validate_iso_datetime,
    validate_duration,
)

validate_path_id(value: str, field_name: str) -> str Validates UUID format for URL path segments. Raises ValidationError.

validate_string_length(value: str, field_name: str, max_length: int) -> str Validates string doesn’t exceed max length. Raises ValidationError.

validate_payload_size(payload: dict) -> None Validates serialized payload doesn’t exceed 1 MB. Raises ValidationError.

validate_iso_datetime(datetime_str: str) -> bool Returns True if the string is valid ISO 8601 format.

validate_duration(duration: Union[int, float]) -> float Validates duration is between 0 and 86,400 seconds. Raises ValidationError.

DateTime

from infinium import get_current_iso_datetime

timestamp = get_current_iso_datetime()  # "2024-01-15T10:30:00Z"

Data Conversion

from infinium import coerce_section

# Convert a dict to a dataclass
customer = coerce_section({"customer_name": "Acme"}, Customer)

Logging

from infinium import setup_logging

setup_logging(level="DEBUG")

Constants

from infinium import (
    MAX_NAME_LENGTH,        # 500
    MAX_DESCRIPTION_LENGTH, # 10,000
    MAX_STRING_LENGTH,      # 2,000
    MAX_PAYLOAD_SIZE,       # 1,048,576 (1 MB)
)

Exceptions

All exceptions inherit from InfiniumError. See Error Handling for full details.

from infinium.exceptions import (
    InfiniumError,
    AuthenticationError,
    ValidationError,
    RateLimitError,
    NetworkError,
    InfiniumTimeoutError,
    TimeoutError,        # Alias for InfiniumTimeoutError
    ServerError,
    NotFoundError,
    BatchError,
    ConfigurationError,
)