Python SDK

Official Python SDK for the Lumnis AI API

Overview

The official Python SDK for Lumnis AI provides a simple, intuitive interface for building AI-powered applications, with full type hints and async support included.

Installation

pip install lumnisai

Quick Start

from lumnisai import Client, display_progress
 
# Initialize the client
client = Client(api_key="your-api-key")
 
# Create a simple response
response = client.invoke("What is the meaning of life?")
print(response.output_text)
 
# With streaming
for update in client.invoke("Explain quantum computing", stream=True):
    display_progress(update)
    
    if update.state == "completed":
        print(f"\n{update.output_text}")

Configuration

Environment Variables

export LUMNISAI_API_KEY="your-api-key"
export LUMNISAI_TENANT_ID="your-tenant-id"  # Optional
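
If you keep credentials in a local .env file instead of exporting them in your shell, a minimal sketch using the third-party python-dotenv package (an assumption, not a dependency of this SDK) could look like this:

from dotenv import load_dotenv  # assumption: pip install python-dotenv
from lumnisai import Client

# Load LUMNISAI_API_KEY (and LUMNISAI_TENANT_ID, if present) from .env
load_dotenv()

# With the variables set, the client needs no explicit arguments
client = Client()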

Client Initialization

from lumnisai import Client
 
# Using API key directly
client = Client(api_key="your-api-key")
 
# Using environment variables (LUMNISAI_API_KEY)
client = Client()
 
# Custom configuration
client = Client(
    api_key="your-api-key",
    timeout=60.0,  # 60 second timeout
    max_retries=3  # Retry up to 3 times
)

Async Support

All methods have async equivalents using AsyncClient:

import asyncio
from lumnisai import AsyncClient, display_progress
 
async def main():
    client = AsyncClient(api_key="your-api-key")
    
    # Create response asynchronously
    response = await client.invoke("Hello!")
    print(response.output_text)
    
    # Streaming with display_progress
    async for update in await client.invoke("Analyze this data", stream=True):
        display_progress(update)
        
        if update.state == "completed":
            print(f"\n{update.output_text}")
 
asyncio.run(main())
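
Because AsyncClient calls are coroutines, independent requests can run concurrently with standard asyncio tools. A minimal sketch, assuming invoke and output_text behave as in the examples above:

import asyncio
from lumnisai import AsyncClient

async def main():
    client = AsyncClient(api_key="your-api-key")

    # Hypothetical prompts; the invoke calls run concurrently
    prompts = ["Summarize HTTP/2", "Summarize HTTP/3"]
    responses = await asyncio.gather(*(client.invoke(p) for p in prompts))

    for prompt, response in zip(prompts, responses):
        print(f"{prompt} -> {response.output_text}")

asyncio.run(main())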

Using AsyncClient in Jupyter/Colab

from lumnisai import AsyncClient, display_progress
 
# No need for asyncio.run() in notebooks
client = AsyncClient(api_key="your-api-key")
 
# Direct await
response = await client.invoke("What are the latest AI trends?")
print(response.output_text)
 
# Streaming with display_progress
async for update in await client.invoke("Research topic", stream=True):
    display_progress(update)

Context Managers

from lumnisai import AsyncClient, Client
 
# Automatic cleanup
with Client(api_key="your-api-key") as client:
    response = client.invoke("Hello!")
    print(response.output_text)
 
# Async version (use inside an async function or a notebook cell)
async with AsyncClient(api_key="your-api-key") as client:
    response = await client.invoke("Hello!")
    print(response.output_text)

User-Scoped Operations

from lumnisai import Client
 
client = Client(api_key="your-api-key")
 
# Create a user-scoped client
user_client = client.for_user("user@example.com")
response = user_client.invoke("What's the weather?")
 
# Or use as context manager
with client.as_user("user@example.com") as user_client:
    response = user_client.invoke("What's the weather?")
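
Because for_user returns a regular client, you can create one per user when processing a batch. A minimal sketch, assuming invoke and output_text behave as in the examples above, with hypothetical user IDs:

from lumnisai import Client

client = Client(api_key="your-api-key")

# Hypothetical user IDs for illustration
users = ["alice@example.com", "bob@example.com"]

for user_id in users:
    user_client = client.for_user(user_id)
    response = user_client.invoke("Summarize my open tasks")
    print(f"{user_id} -> {response.output_text}")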

Next Steps

Explore the different SDK capabilities described above.

Support

License

MIT License © Lumnis AI