Python SDK Reference

Complete reference documentation for the InfraPrism Python SDK.

Installation

pip install infraprism

Classes

InfraPrismOpenAI

Drop-in replacement for OpenAI’s OpenAI client.

from infraprism import InfraPrismOpenAI

InfraPrismOpenAI(
    api_key: str = None,                    # OpenAI API key
    organization: str = None,               # OpenAI organization ID
    base_url: str = None,                   # Custom base URL
    timeout: float = None,                  # Request timeout in seconds
    max_retries: int = 2,                   # Max retry attempts
    infraprism_api_key: str = None,         # InfraPrism API key
    infraprism_environment: str = None,     # Environment tag
    infraprism_disabled: bool = False,      # Disable tracking
    infraprism_debug: bool = False,         # Enable debug logging
)

Supported methods:

  • client.chat.completions.create() - Chat completions
  • client.completions.create() - Legacy completions
  • client.embeddings.create() - Embeddings

AsyncInfraPrismOpenAI

Async version of InfraPrismOpenAI.

from infraprism import AsyncInfraPrismOpenAI

client = AsyncInfraPrismOpenAI(...)

response = await client.chat.completions.create(...)

InfraPrismAnthropic

Drop-in replacement for Anthropic’s Anthropic client.

from infraprism import InfraPrismAnthropic

InfraPrismAnthropic(
    api_key: str = None,                    # Anthropic API key
    base_url: str = None,                   # Custom base URL
    timeout: float = None,                  # Request timeout in seconds
    max_retries: int = 2,                   # Max retry attempts
    infraprism_api_key: str = None,         # InfraPrism API key
    infraprism_environment: str = None,     # Environment tag
    infraprism_disabled: bool = False,      # Disable tracking
    infraprism_debug: bool = False,         # Enable debug logging
)

Supported methods:

  • client.messages.create() - Messages API
  • client.messages.stream() - Streaming messages

AsyncInfraPrismAnthropic

Async version of InfraPrismAnthropic.

from infraprism import AsyncInfraPrismAnthropic

client = AsyncInfraPrismAnthropic(...)

response = await client.messages.create(...)

InfraPrismAzureOpenAI

Client for Azure OpenAI deployments.

from infraprism import InfraPrismAzureOpenAI

InfraPrismAzureOpenAI(
    azure_endpoint: str = None,             # Azure endpoint URL
    api_key: str = None,                    # Azure API key
    api_version: str = None,                # Azure API version
    azure_ad_token_provider: Callable = None,  # Azure AD auth
    infraprism_api_key: str = None,         # InfraPrism API key
    infraprism_model_mapping: dict = None,  # Deployment-to-model mapping
    ...
)

AsyncInfraPrismAzureOpenAI

Async version of InfraPrismAzureOpenAI.

Extended Parameters

All create() methods accept these additional parameters:

Parameter              Type    Description
entity_type            str     Entity type: customer, team, project, employee
entity_id              str     Your internal identifier for the entity
session_id             str     Session/conversation ID for grouping
tags                   dict    Custom metadata tags
infraprism_disabled    bool    Skip tracking for this request

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[...],
    # InfraPrism parameters
    entity_type="customer",
    entity_id="acme-corp",
    session_id="conv-123",
    tags={"feature": "chatbot"},
)

Utility Methods

infraprism_flush()

Force upload of any pending events.

client.infraprism_flush()

Useful before application shutdown.

infraprism_disable()

Disable tracking for all subsequent calls.

client.infraprism_disable()

infraprism_enable()

Re-enable tracking after disabling.

client.infraprism_enable()

Environment Variables

Variable                    Description
INFRAPRISM_API_KEY          InfraPrism API key
INFRAPRISM_ENVIRONMENT      Default environment tag
INFRAPRISM_DISABLED         Set to true to disable tracking
INFRAPRISM_DEBUG            Set to true for debug logging
OPENAI_API_KEY              OpenAI API key
ANTHROPIC_API_KEY           Anthropic API key
AZURE_OPENAI_ENDPOINT       Azure OpenAI endpoint
AZURE_OPENAI_API_KEY        Azure OpenAI API key
AZURE_OPENAI_API_VERSION    Azure API version
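
The same variables can be set from Python before a client is constructed, which can be handy in tests (values below are placeholders):

```python
import os

# Placeholder values; production deployments normally set these in the shell or orchestrator.
os.environ["INFRAPRISM_ENVIRONMENT"] = "staging"
os.environ["INFRAPRISM_DEBUG"] = "true"

print(os.environ["INFRAPRISM_ENVIRONMENT"])  # staging
```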

Error Handling

InfraPrism passes through all provider errors unchanged:

from openai import OpenAIError, RateLimitError, APIError

try:
    response = client.chat.completions.create(...)
except RateLimitError:
    # Handle rate limiting
    pass
except APIError as e:
    # Handle API errors
    print(f"API error: {e}")
except OpenAIError as e:
    # Handle all OpenAI errors
    print(f"OpenAI error: {e}")

InfraPrism-specific errors:

from infraprism import InfraPrismError, InfraPrismAuthError

try:
    client = InfraPrismOpenAI(infraprism_api_key="invalid")
except InfraPrismAuthError:
    # Invalid API key
    pass

Type Hints

Full type hints are provided:

from infraprism import InfraPrismOpenAI
from infraprism.types import EntityType

def track_call(
    client: InfraPrismOpenAI,
    message: str,
    entity_type: EntityType,
    entity_id: str,
) -> str:
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": message}],
        entity_type=entity_type,
        entity_id=entity_id,
    )
    return response.choices[0].message.content

Logging

Configure logging to see InfraPrism debug output:

import logging

logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger("infraprism")
logger.setLevel(logging.DEBUG)

Or use the debug flag:

client = InfraPrismOpenAI(infraprism_debug=True)

Thread Safety

All clients are thread-safe. You can share a single client across threads:

from concurrent.futures import ThreadPoolExecutor

client = InfraPrismOpenAI()

def process(item):
    return client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": item}],
        entity_type="project",
        entity_id="batch-job",
    )

with ThreadPoolExecutor(max_workers=10) as executor:
    results = list(executor.map(process, items))

Next Steps