# Python SDK Reference

Complete reference documentation for the InfraPrism Python SDK.
## Installation

```shell
pip install infraprism
```
## Classes
### InfraPrismOpenAI

Drop-in replacement for OpenAI’s `OpenAI` client.

```python
from infraprism import InfraPrismOpenAI

client = InfraPrismOpenAI(
    api_key: str = None,                  # OpenAI API key
    organization: str = None,             # OpenAI organization ID
    base_url: str = None,                 # Custom base URL
    timeout: float = None,                # Request timeout
    max_retries: int = 2,                 # Max retry attempts
    infraprism_api_key: str = None,       # InfraPrism API key
    infraprism_environment: str = None,   # Environment tag
    infraprism_disabled: bool = False,    # Disable tracking
    infraprism_debug: bool = False,       # Enable debug logging
)
```
Supported methods:
- `client.chat.completions.create()` - Chat completions
- `client.completions.create()` - Legacy completions
- `client.embeddings.create()` - Embeddings
### AsyncInfraPrismOpenAI

Async version of `InfraPrismOpenAI`.

```python
from infraprism import AsyncInfraPrismOpenAI

client = AsyncInfraPrismOpenAI(...)
response = await client.chat.completions.create(...)
```
### InfraPrismAnthropic

Drop-in replacement for Anthropic’s `Anthropic` client.

```python
from infraprism import InfraPrismAnthropic

client = InfraPrismAnthropic(
    api_key: str = None,                  # Anthropic API key
    base_url: str = None,                 # Custom base URL
    timeout: float = None,                # Request timeout
    max_retries: int = 2,                 # Max retry attempts
    infraprism_api_key: str = None,       # InfraPrism API key
    infraprism_environment: str = None,   # Environment tag
    infraprism_disabled: bool = False,    # Disable tracking
    infraprism_debug: bool = False,       # Enable debug logging
)
```
Supported methods:
- `client.messages.create()` - Messages API
- `client.messages.stream()` - Streaming messages
### AsyncInfraPrismAnthropic

Async version of `InfraPrismAnthropic`.

```python
from infraprism import AsyncInfraPrismAnthropic

client = AsyncInfraPrismAnthropic(...)
response = await client.messages.create(...)
```
### InfraPrismAzureOpenAI

Client for Azure OpenAI deployments.

```python
from infraprism import InfraPrismAzureOpenAI

client = InfraPrismAzureOpenAI(
    azure_endpoint: str = None,               # Azure endpoint URL
    api_key: str = None,                      # Azure API key
    api_version: str = None,                  # Azure API version
    azure_ad_token_provider: Callable = None, # Azure AD auth
    infraprism_api_key: str = None,           # InfraPrism API key
    infraprism_model_mapping: dict = None,    # Deployment-to-model mapping
    ...
)
```
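Azure deployments are usually named after your own conventions rather than the underlying model, which is why the constructor takes an `infraprism_model_mapping` dict. A minimal sketch of the shape such a mapping takes (the deployment names and the `resolve_model` helper below are hypothetical illustrations, not part of the SDK):

```python
# Hypothetical deployment-name -> base-model mapping for cost attribution.
model_mapping = {
    "gpt4-prod": "gpt-4o",         # your Azure deployment name -> underlying model
    "gpt35-dev": "gpt-3.5-turbo",
}

def resolve_model(deployment: str, mapping: dict) -> str:
    """Resolve an Azure deployment name to its base model,
    falling back to the deployment name when unmapped."""
    return mapping.get(deployment, deployment)
```

A dict of this shape would be passed as `infraprism_model_mapping=model_mapping` when constructing the client.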
### AsyncInfraPrismAzureOpenAI

Async version of `InfraPrismAzureOpenAI`.
## Extended Parameters

All `create()` methods accept these additional parameters:

| Parameter | Type | Description |
|---|---|---|
| `entity_type` | `str` | Entity type: `customer`, `team`, `project`, or `employee` |
| `entity_id` | `str` | Your internal identifier for the entity |
| `session_id` | `str` | Session/conversation ID for grouping |
| `tags` | `dict` | Custom metadata tags |
| `infraprism_disabled` | `bool` | Skip tracking for this request |
```python
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[...],
    # InfraPrism parameters
    entity_type="customer",
    entity_id="acme-corp",
    session_id="conv-123",
    tags={"feature": "chatbot"},
)
```
## Utility Methods

### infraprism_flush()

Force upload of any pending events. Useful before application shutdown.

```python
client.infraprism_flush()
```
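To make sure the flush also runs on a normal process exit, one option is to register it with `atexit`. The sketch below uses a stand-in `flush` function and an in-memory event list (both hypothetical) so it stays self-contained; in practice you would register `client.infraprism_flush` instead:

```python
import atexit

# Stand-in for the SDK's internal pending-event buffer (hypothetical).
pending_events = [{"event": "completion", "cost_usd": 0.002}]

def flush() -> int:
    """Upload and clear pending events; returns how many were flushed."""
    count = len(pending_events)
    pending_events.clear()
    return count

# Run the flush automatically at interpreter shutdown.
atexit.register(flush)
```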
### infraprism_disable()

Disable tracking for all subsequent calls.

```python
client.infraprism_disable()
```
### infraprism_enable()

Re-enable tracking after disabling.

```python
client.infraprism_enable()
```
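`infraprism_disable()` and `infraprism_enable()` form a client-wide toggle, in contrast to the per-request `infraprism_disabled` parameter. The intended semantics can be sketched with a stand-in class (a hypothetical illustration, not the SDK's implementation):

```python
class TrackingToggle:
    """Stand-in illustrating client-wide enable/disable semantics."""

    def __init__(self):
        self.enabled = True
        self.tracked = []

    def disable(self):
        self.enabled = False

    def enable(self):
        self.enabled = True

    def record(self, event):
        # Calls made while disabled are simply not tracked.
        if self.enabled:
            self.tracked.append(event)
```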
## Environment Variables

| Variable | Description |
|---|---|
| `INFRAPRISM_API_KEY` | InfraPrism API key |
| `INFRAPRISM_ENVIRONMENT` | Default environment tag |
| `INFRAPRISM_DISABLED` | Set to `true` to disable tracking |
| `INFRAPRISM_DEBUG` | Set to `true` for debug logging |
| `OPENAI_API_KEY` | OpenAI API key |
| `ANTHROPIC_API_KEY` | Anthropic API key |
| `AZURE_OPENAI_ENDPOINT` | Azure OpenAI endpoint |
| `AZURE_OPENAI_API_KEY` | Azure OpenAI API key |
| `AZURE_OPENAI_API_VERSION` | Azure API version |
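For example, the InfraPrism variables can be exported in the shell before starting your application (the values below are placeholders):

```shell
# Placeholder values; substitute your real keys.
export INFRAPRISM_API_KEY="your-infraprism-key"
export INFRAPRISM_ENVIRONMENT="production"
export INFRAPRISM_DEBUG="false"
export OPENAI_API_KEY="your-openai-key"
```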
## Error Handling

InfraPrism passes through all provider errors unchanged:

```python
from openai import OpenAIError, RateLimitError, APIError

try:
    response = client.chat.completions.create(...)
except RateLimitError:
    # Handle rate limiting
    pass
except APIError as e:
    # Handle API errors
    print(f"API error: {e}")
except OpenAIError as e:
    # Handle all other OpenAI errors
    print(f"OpenAI error: {e}")
```
InfraPrism-specific errors:

```python
from infraprism import InfraPrismError, InfraPrismAuthError

try:
    client = InfraPrismOpenAI(infraprism_api_key="invalid")
except InfraPrismAuthError:
    # Invalid InfraPrism API key
    pass
```
## Type Hints

Full type hints are provided:

```python
from infraprism import InfraPrismOpenAI
from infraprism.types import EntityType

def track_call(
    client: InfraPrismOpenAI,
    message: str,
    entity_type: EntityType,
    entity_id: str,
) -> str:
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": message}],
        entity_type=entity_type,
        entity_id=entity_id,
    )
    return response.choices[0].message.content
```
## Logging

Configure logging to see InfraPrism debug output:

```python
import logging

logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger("infraprism")
logger.setLevel(logging.DEBUG)
```

Or use the debug flag:

```python
client = InfraPrismOpenAI(infraprism_debug=True)
```
## Thread Safety

All clients are thread-safe, so you can share a single client across threads:

```python
from concurrent.futures import ThreadPoolExecutor

client = InfraPrismOpenAI()
items = ["first prompt", "second prompt"]  # example inputs

def process(item):
    return client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": item}],
        entity_type="project",
        entity_id="batch-job",
    )

with ThreadPoolExecutor(max_workers=10) as executor:
    results = list(executor.map(process, items))
```
## Next Steps

- REST API Reference - HTTP API documentation
- Analytics API - Query cost data
- Configuration - Configuration options