# Chat Conversations (LLMChat)

`LLMChat` builds multi-turn conversational jobs from system, user, and assistant messages.
## Basic Usage
```python
from microdc import Client, LLMChat

client = Client(api_key="mDC_...")

chat = LLMChat(model="gpt-4")
chat.set_system("You are a helpful assistant.")
chat.add_user_message("What is Python?")

# Send the job, wait for it to finish, then fetch the result
job_id = client.send_job(chat)
client.wait_for_all()

result = client.get_job_details(job_id)
print(result.result)
```
## Building a Conversation
Use the helper methods to build message history:
```python
chat = LLMChat(model="gpt-4")

# Set the system prompt
chat.set_system("You are a helpful coding assistant.")

# Add the user's first message
chat.add_user_message("How do I read a file in Python?")

# You can also include prior assistant responses for context
chat.add_assistant_message("You can use the open() function...")
chat.add_user_message("How about writing to a file?")
```
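Conceptually, these helper calls assemble an OpenAI-style message list. The representation below is an illustration of that ordering, not microdc's actual internal data structure:

```python
# Illustrative only: the conversation built above, written out as the
# role/content message list that chat models conventionally consume.
messages = [
    {"role": "system", "content": "You are a helpful coding assistant."},
    {"role": "user", "content": "How do I read a file in Python?"},
    {"role": "assistant", "content": "You can use the open() function..."},
    {"role": "user", "content": "How about writing to a file?"},
]

roles = [m["role"] for m in messages]
```

The system message always comes first, followed by user and assistant messages in the order they were added.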
## Helper Methods

| Method | Description |
|---|---|
| `set_system(content)` | Set the system message |
| `add_user_message(content)` | Add a user message |
| `add_assistant_message(content)` | Add an assistant message |
## Configuration Options

```python
chat = LLMChat(
    model="gpt-4",          # Required: model name
    temperature=0.7,        # Sampling temperature (0.0-2.0)
    max_tokens=500,         # Maximum tokens to generate
    top_p=1.0,              # Nucleus sampling
    top_k=None,             # Top-k sampling
    frequency_penalty=0.0,  # Frequency penalty
    presence_penalty=0.0,   # Presence penalty
    stop=None,              # Stop sequences
    stream=False,           # Enable streaming
)
```
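As an illustration of how these options combine (the specific values here are assumptions, not recommendations), a low-randomness configuration for tasks where reproducible answers matter more than variety might look like:

```python
from microdc import LLMChat

# Hypothetical example: near-deterministic decoding with a bounded,
# short response.
chat = LLMChat(
    model="gpt-4",
    temperature=0.0,   # minimize sampling randomness
    max_tokens=200,    # cap the response length
    stop=["\n\n"],     # stop at the first blank line
)
```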
## Custom Type Tracking
Route responses to different handlers using metadata:
```python
from microdc import Client, LLMChat

def callback(client: Client, job_id: str):
    details = client.get_job_details(job_id)
    job_type = details.metadata.get("type")
    if job_type == "summarization":
        handle_summarization(details)
    elif job_type == "translation":
        handle_translation(details)
    if details.is_successful():
        client.acknowledge_job(job_id)

client = Client(api_key="mDC_...")
client.set_callback(callback)

# Summarization job
job1 = LLMChat(model="llama3.3")
job1.set_system("You are a summarizer.")
job1.add_user_message("Summarize: ...")
job1.metadata = {"type": "summarization", "doc_id": "123"}
client.send_job(job1)

# Translation job
job2 = LLMChat(model="llama3.3")
job2.set_system("You are a translator.")
job2.add_user_message("Translate to Spanish: Hello world")
job2.metadata = {"type": "translation", "target_lang": "es"}
client.send_job(job2)

client.wait_for_all()
```
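As the number of job types grows, the if/elif chain in the callback can become unwieldy. One alternative is a handler registry keyed by the metadata `type`. The sketch below is plain Python; the `handlers` dict, `register` decorator, and `dispatch` helper are illustrative, not part of microdc:

```python
# Hypothetical pattern: map each metadata "type" to a handler function,
# so the callback reduces to a single dictionary lookup.
handlers = {}

def register(job_type):
    """Decorator that files a handler under the given job type."""
    def decorator(fn):
        handlers[job_type] = fn
        return fn
    return decorator

@register("summarization")
def handle_summarization(details):
    return ("summarized", details)

@register("translation")
def handle_translation(details):
    return ("translated", details)

def dispatch(job_type, details):
    # Look up the handler for this type; unknown types are ignored.
    handler = handlers.get(job_type)
    return handler(details) if handler else None
```

Inside the callback, `dispatch(details.metadata.get("type"), details)` then replaces the if/elif chain, and adding a new job type only requires registering one new function.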
## Multimodal Chat
`LLMChat` also supports multimodal input: