# Architecture
This page provides an overview of the MicroDC Python Client Library architecture. For the full design document, see DESIGN.md.
## High-Level Architecture
```mermaid
graph TB
    App["User Application"]
    subgraph lib["MicroDC Client Library"]
        Client["<b>client.py</b><br/>Authentication & Sessions<br/>Job Submission & Tracking<br/>File Upload & Tokens<br/>Callback Management"]
        subgraph jobs["jobs/"]
            BaseCall["base.py<br/><i>BaseCall</i>"]
            LLMComplete["llm_complete.py<br/><i>LLMComplete</i>"]
            LLMChat["llm_chat.py<br/><i>LLMChat</i>"]
            LLMEmbed["embed_call.py<br/><i>LLMEmbed</i>"]
            DocumentCall["document_call.py<br/><i>DocumentCall</i>"]
            JobDetails["job_details.py<br/><i>JobDetails</i>"]
        end
        subgraph core["core/"]
            HTTP["http.py<br/><i>HTTPTransport</i>"]
            Polling["polling.py<br/><i>PollingManager</i>"]
            Config["config.py<br/><i>Config</i>"]
        end
        subgraph exceptions["exceptions/"]
            Errors["errors.py<br/><i>MicroDCError, APIError,<br/>AuthenticationError, ...</i>"]
        end
    end
    subgraph api["MicroDC Server REST API"]
        Submit["POST /api/v1/jobs/submit"]
        GetJob["GET /api/v1/jobs/{id}"]
        Ack["POST /api/v1/jobs/{id}/acknowledge"]
        Upload["POST /api/files/upload"]
    end
    App --> Client
    Client --> jobs
    Client --> core
    Client --> exceptions
    HTTP --> api
```
## Module Responsibilities
| Module | Purpose | Key Classes |
|---|---|---|
| `client.py` | Main API interface, job management | `Client` |
| `jobs/base.py` | Abstract job interface | `BaseCall` |
| `jobs/llm_complete.py` | Single-turn generation | `LLMComplete` |
| `jobs/llm_chat.py` | Multi-turn chat | `LLMChat` |
| `jobs/embed_call.py` | Embedding generation | `LLMEmbed` |
| `jobs/document_call.py` | Document processing | `DocumentCall` |
| `jobs/job_details.py` | Job result data model | `JobDetails` |
| `core/http.py` | HTTP transport layer | `HTTPTransport` |
| `core/polling.py` | Background job polling | `PollingManager` |
| `core/config.py` | Configuration management | `Config` |
| `exceptions/errors.py` | Error handling | `MicroDCError`, etc. |
## Job Type Hierarchy
```mermaid
classDiagram
    BaseCall <|-- LLMComplete
    BaseCall <|-- LLMChat
    BaseCall <|-- LLMEmbed
    BaseCall <|-- DocumentCall
    class BaseCall {
        <<abstract>>
        +metadata: Dict
        +priority: str
        +timeout: int
        +to_api_payload()* Dict
        +validate()*
    }
    class LLMComplete {
        +model: str
        +temperature: float
        +set_prompt(prompt)
    }
    class LLMChat {
        +model: str
        +temperature: float
        +set_system(content)
        +add_user_message(content)
        +add_assistant_message(content)
    }
    class LLMEmbed {
        +model: str
        +normalize: bool
        +add_text(text)
        +add_texts(texts)
    }
    class DocumentCall {
        +model: str
        +file_tokens: List
        +add_file(token)
        +add_files(tokens)
    }
```
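The hierarchy above can be sketched as an abstract base class with concrete job subclasses. This is a minimal illustration, not the library's actual implementation: attribute and method names follow the diagram, the `temperature=0.7` default comes from the Design Principles, and the `priority`/`timeout` defaults and payload field names are assumptions.

```python
from abc import ABC, abstractmethod
from typing import Any, Dict, Optional

class BaseCall(ABC):
    """Abstract job interface (sketch of jobs/base.py)."""

    def __init__(self, metadata: Optional[Dict[str, Any]] = None,
                 priority: str = "normal", timeout: int = 300):
        # priority/timeout defaults are illustrative assumptions
        self.metadata = metadata or {}
        self.priority = priority
        self.timeout = timeout

    @abstractmethod
    def to_api_payload(self) -> Dict[str, Any]:
        """Serialize the job for POST /api/v1/jobs/submit."""

    @abstractmethod
    def validate(self) -> None:
        """Raise if the job is not ready to submit."""

class LLMComplete(BaseCall):
    """Single-turn generation job (sketch of jobs/llm_complete.py)."""

    def __init__(self, model: str, temperature: float = 0.7, **kwargs):
        super().__init__(**kwargs)
        self.model = model
        self.temperature = temperature
        self._prompt: Optional[str] = None

    def set_prompt(self, prompt: str) -> None:
        self._prompt = prompt

    def validate(self) -> None:
        if not self._prompt:
            raise ValueError("prompt is required")

    def to_api_payload(self) -> Dict[str, Any]:
        return {"type": "llm_complete", "model": self.model,
                "temperature": self.temperature, "prompt": self._prompt}
```

Because `to_api_payload()` and `validate()` are the only abstract hooks, adding a new job type means adding one subclass, which is the extensibility point the Design Principles refer to.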
## Design Principles
- Simplicity First -- Intuitive API for beginners, powerful for advanced users
- Type Safety -- Full type hints for IDE support and static analysis
- Async-First with Sync Support -- Callbacks for async, blocking waits for sync
- Extensibility -- Base class design supports future job types
- Fail-Safe Defaults -- Sensible defaults (temperature=0.7, auto polling)
## API Endpoint Mapping
| Client Method | HTTP | Endpoint |
|---|---|---|
| `send_job(*)` | POST | `/api/v1/jobs/submit` |
| `get_job_details(id)` | GET | `/api/v1/jobs/{id}` |
| `get_job_status(id)` | GET | `/api/v1/jobs/{id}/status` |
| `cancel_job(id)` | DELETE | `/api/v1/jobs/{id}` |
| `acknowledge_job(id)` | POST | `/api/v1/jobs/{id}/acknowledge` |
| `list_jobs()` | GET | `/api/v1/jobs` |
| `upload_file()` | POST | `/api/files/upload` |
| `create_download_token(id)` | POST | `/api/v1/files/{id}/create-download-token` |
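Internally, this kind of mapping reduces to a simple lookup table. The sketch below is illustrative only: the dict contents are taken from the table above, but the `ENDPOINTS` table and `build_request` helper are not part of the library's API.

```python
# Method-to-endpoint mapping copied from the table above.
ENDPOINTS = {
    "send_job": ("POST", "/api/v1/jobs/submit"),
    "get_job_details": ("GET", "/api/v1/jobs/{id}"),
    "get_job_status": ("GET", "/api/v1/jobs/{id}/status"),
    "cancel_job": ("DELETE", "/api/v1/jobs/{id}"),
    "acknowledge_job": ("POST", "/api/v1/jobs/{id}/acknowledge"),
    "list_jobs": ("GET", "/api/v1/jobs"),
    "upload_file": ("POST", "/api/files/upload"),
    "create_download_token": ("POST", "/api/v1/files/{id}/create-download-token"),
}

def build_request(method_name: str, **path_params: str):
    """Resolve a client method name to its (HTTP verb, concrete path)."""
    verb, template = ENDPOINTS[method_name]
    return verb, template.format(**path_params)
```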
## Polling Mechanism
The client uses a background thread for automatic job status polling:
- Background thread starts on client initialization (configurable)
- Polls every 2 seconds (configurable via `poll_interval`)
- When a job completes, invokes the registered callback
- Thread-safe for concurrent job submissions
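The loop above can be sketched with the standard library's `threading` module. This is a simplified model of the behavior described, assuming `PollingManager` names from the diagram; the `fetch_status` and `watch` hooks are illustrative, not the library's real API.

```python
import threading
from typing import Callable, Dict

class PollingManager:
    """Background polling sketch; the real core/polling.py may differ."""

    def __init__(self, fetch_status: Callable[[str], str],
                 poll_interval: float = 2.0):
        self._fetch_status = fetch_status        # e.g. GET /api/v1/jobs/{id}/status
        self._poll_interval = poll_interval
        self._callbacks: Dict[str, Callable[[str], None]] = {}
        self._lock = threading.Lock()            # thread-safe registration
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._run, daemon=True)

    def start(self) -> None:
        self._thread.start()

    def watch(self, job_id: str, on_complete: Callable[[str], None]) -> None:
        with self._lock:
            self._callbacks[job_id] = on_complete

    def stop(self) -> None:
        self._stop.set()
        self._thread.join()

    def _run(self) -> None:
        while not self._stop.is_set():
            with self._lock:
                pending = list(self._callbacks.items())
            for job_id, callback in pending:
                if self._fetch_status(job_id) == "completed":
                    callback(job_id)             # fire the registered callback
                    with self._lock:
                        del self._callbacks[job_id]
            self._stop.wait(self._poll_interval)  # sleep, but wake on stop()
```

Using `Event.wait()` instead of `time.sleep()` lets the thread shut down promptly when the client is closed.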
## Retry Strategy
Automatic retry for transient errors:
- Attempts: 3
- Base delay: 1 second
- Backoff: 2.0x exponential
- Retryable codes: 408, 429, 500, 502, 503, 504
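Concretely, those parameters yield delays of 1s, then 2s, between the three attempts. A minimal sketch of the policy (the constants match the list above; the `request_with_retry` helper is illustrative, not the library's API):

```python
import time

# Retry parameters from the strategy above.
MAX_ATTEMPTS = 3
BASE_DELAY_S = 1.0
BACKOFF_FACTOR = 2.0
RETRYABLE_CODES = {408, 429, 500, 502, 503, 504}

def backoff_delay(attempt: int) -> float:
    """Delay before retrying after failed attempt `attempt` (1-based)."""
    return BASE_DELAY_S * BACKOFF_FACTOR ** (attempt - 1)

def request_with_retry(send):
    """Call `send()` (which returns an HTTP status code) until it returns
    a non-retryable status or MAX_ATTEMPTS is reached."""
    for attempt in range(1, MAX_ATTEMPTS + 1):
        status = send()
        if status not in RETRYABLE_CODES or attempt == MAX_ATTEMPTS:
            return status
        time.sleep(backoff_delay(attempt))
    return status
```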
## HTTP Transport
Uses `httpx.Client` for:
- HTTP/2 support and connection pooling
- Configurable timeouts
- Multipart file upload support
- Foundation for future async/await API