# MicroDC Python Client Library

The MicroDC Python Client Library provides a simple, intuitive interface for submitting inference jobs to the MicroDC.ai distributed inference platform.

**Package name:** `microdc-client` | **Import name:** `microdc` | **Python:** 3.8+
## Quick Start

```python
from microdc import Client, LLMComplete

# Initialize the client with your API key
client = Client(api_key="mDC_499FC19C-686A-47C5-AA93-E619C55EBE98")

# Create and configure a job
job = LLMComplete(model="llama3.3")
job.set_prompt("Why is the sky blue?")

# Submit the job and block until all pending jobs finish
job_id = client.send_job(job)
client.wait_for_all()

# Retrieve the completed job and print its result
result = client.get_job_details(job_id)
print(result.result)
```
## Features

- **Multiple Job Types** -- LLM generation, chat, embeddings, and document processing
- **Callback-Based Async** -- Background polling with automatic callback invocation
- **File Upload Support** -- Upload files and create download tokens for multimodal workflows
- **Automatic Retries** -- Exponential backoff for transient errors
- **Type Safe** -- Full type hints for IDE autocomplete and static analysis
- **Context Manager** -- Clean resource management with `with` statements
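The retry behavior mentioned above can be sketched in plain Python. The helper below is an illustrative stand-in, not the library's actual implementation; the names `retry_with_backoff`, `base_delay`, and `TransientError` are assumptions for the sketch:

```python
import time

class TransientError(Exception):
    """Stand-in for a retryable failure such as a timeout or a 503 response."""

def retry_with_backoff(func, max_retries=4, base_delay=0.5):
    """Call func(), retrying on TransientError with exponentially growing delays.

    Sleeps base_delay * 2**attempt seconds between attempts: 0.5s, 1s, 2s, ...
    Non-transient exceptions propagate immediately.
    """
    for attempt in range(max_retries):
        try:
            return func()
        except TransientError:
            if attempt == max_retries - 1:
                raise  # out of retries; surface the error to the caller
            time.sleep(base_delay * 2 ** attempt)

# Demo: a call that fails twice before succeeding.
calls = {"n": 0}

def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TransientError("temporarily unavailable")
    return "ok"

print(retry_with_backoff(flaky, base_delay=0.01))  # prints "ok" after two retries
```

Exponential backoff keeps retry pressure low during an outage: each failed attempt doubles the wait, so a struggling service is not hammered with immediate retries.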
## Job Types

| Class | Purpose | Key Method |
|---|---|---|
| `LLMComplete` | Single-turn text generation | `set_prompt()` |
| `LLMChat` | Multi-turn conversations | `add_user_message()` |
| `LLMEmbed` | Text embeddings | `add_texts()` |
| `DocumentCall` | Document processing | `add_file()` |
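All four job classes follow the same build-then-submit pattern: configure a job object locally, then hand it to the client. The toy class below mimics that pattern for a chat job; it is a self-contained illustration, and the `messages` list and `to_payload()` method are assumptions, not the real `LLMChat` internals:

```python
class ChatJobSketch:
    """Illustrative mock of the build-then-submit pattern used by job classes."""

    def __init__(self, model):
        self.model = model
        self.messages = []  # accumulated conversation turns

    def add_user_message(self, text):
        self.messages.append({"role": "user", "content": text})
        return self  # return self to allow method chaining

    def add_assistant_message(self, text):
        self.messages.append({"role": "assistant", "content": text})
        return self

    def to_payload(self):
        # What a client would ultimately serialize and submit.
        return {"model": self.model, "messages": self.messages}

job = ChatJobSketch(model="llama3.3")
job.add_user_message("Why is the sky blue?")
job.add_assistant_message("Rayleigh scattering.")
job.add_user_message("Explain that in one sentence.")
print(len(job.to_payload()["messages"]))  # prints 3
```

Keeping the job object separate from the client means a job can be fully configured and validated before anything touches the network.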
## Installation
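Assuming the package name given above, the library can typically be installed from PyPI with pip:

```shell
pip install microdc-client
```
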
See the Installation Guide for more options.
## Next Steps
- Installation -- Set up the library
- Quick Start -- Submit your first job
- Configuration -- Configure the client
- API Reference -- Full API documentation