LLmHub API Temperature Parameter
Official API documentation for LLmHub API (api.llmhub.dev)
The default value of `temperature` is 1.0. We recommend setting `temperature` according to your use case, as listed below:
| USE CASE | TEMPERATURE |
|---|---|
| Coding / Math | 0.0 |
| Data Cleaning / Data Analysis | 1.0 |
| General Conversation | 1.3 |
| Translation | 1.3 |
| Creative Writing / Poetry | 1.5 |
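For convenience, the table above can be encoded as a small lookup helper. This is a hypothetical utility, not part of any LLmHub SDK; the use-case keys are our own naming:

```python
# Hypothetical helper (not an LLmHub SDK function): map a use case
# to the recommended temperature from the table above.
RECOMMENDED_TEMPERATURE = {
    "coding": 0.0,
    "math": 0.0,
    "data_cleaning": 1.0,
    "data_analysis": 1.0,
    "general_conversation": 1.3,
    "translation": 1.3,
    "creative_writing": 1.5,
    "poetry": 1.5,
}

def temperature_for(use_case: str, default: float = 1.0) -> float:
    """Return the recommended temperature, falling back to the API default (1.0)."""
    return RECOMMENDED_TEMPERATURE.get(use_case, default)
```

Unknown use cases fall back to the API default of 1.0.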
Examples
Python

```python
# Example of setting temperature for coding tasks.
# Assumes `client` is an OpenAI-compatible client configured for api.llmhub.dev.
response = client.chat.completions.create(
    model="automatic",
    messages=[{"role": "user", "content": "Write a function to sort an array"}],
    temperature=0.0,  # Lower temperature for deterministic code generation
)
```
Node.js

```javascript
// Example of setting temperature for creative writing.
// Assumes `openai` is an OpenAI-compatible client configured for api.llmhub.dev.
const completion = await openai.chat.completions.create({
  model: "automatic",
  messages: [{ role: "user", content: "Write a poem about autumn" }],
  temperature: 1.5, // Higher temperature for more creative outputs
});
```
Why Temperature Matters
Lower temperature values (0.0-0.7) produce more deterministic and focused responses, making them ideal for tasks requiring precision like coding and mathematical calculations.
Higher temperature values (1.0-2.0) introduce more randomness and creativity, making them better suited for creative writing, brainstorming, and conversational applications.
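The mechanism behind this can be sketched directly: temperature divides the model's logits before the softmax, so low values sharpen the distribution toward the most likely token while high values flatten it. A minimal illustration (plain Python, independent of the LLmHub API; the logit values are made up for demonstration):

```python
import math

def apply_temperature(logits, temperature):
    """Rescale logits by temperature, then softmax into probabilities."""
    if temperature == 0.0:
        # Temperature 0 is greedy decoding: all probability mass on the top logit.
        probs = [0.0] * len(logits)
        probs[logits.index(max(logits))] = 1.0
        return probs
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]
low = apply_temperature(logits, 0.2)   # sharply peaked: near-deterministic
high = apply_temperature(logits, 1.5)  # flatter: more randomness in sampling
```

With the same logits, the top token's probability is higher under `temperature=0.2` than under `temperature=1.5`, which is why low temperatures suit coding and math while high temperatures suit creative tasks.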