orbital-decay | 5 months ago | on: Defeating Nondeterminism in LLM Inference
By setting the temperature to 0 you get greedy decoding, which does a lot more than just making it predictable, and can degrade outputs. Random sampling exists for a reason! Gemini 2.5 Pro in particular doesn't like temp 0, for example.

Focus on correctness, not determinism.
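The distinction the comment draws can be sketched in a few lines. This is a toy illustration, not real inference code: the logits are made up, and the function just shows that temperature 0 collapses to argmax (greedy decoding) while any nonzero temperature samples from the softmax distribution.

```python
import math
import random

def sample_token(logits, temperature):
    """Greedy decoding at temperature 0, softmax sampling otherwise."""
    if temperature == 0:
        # Greedy: always pick the highest-logit token -- predictable,
        # but every call returns the same token, which is the behavior
        # the comment says can degrade outputs.
        return max(range(len(logits)), key=lambda i: logits[i])
    # Temperature scaling, then a numerically stable softmax.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return random.choices(range(len(logits)), weights=probs)[0]

logits = [2.0, 1.5, 0.5]          # hypothetical logits over 3 tokens
print(sample_token(logits, 0))    # always 0: the argmax
print(sample_token(logits, 0.8))  # varies from run to run
```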
empiko | 5 months ago
Determinism does not require temperature=0. You can have deterministic behavior even with >0 temperature as long as you fix your random seeds.
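A minimal sketch of this point, continuing the toy softmax example above: with a seeded RNG, a nonzero-temperature sampler reproduces the exact same token sequence on every run. (This illustrates the seed argument only; in a real serving stack, kernel nondeterminism can still break reproducibility even with fixed seeds.)

```python
import math
import random

def sample_sequence(logits, temperature, seed, n=5):
    """Draw n tokens from the temperature-scaled softmax with a seeded RNG."""
    rng = random.Random(seed)  # fixed seed -> identical draws every run
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return [rng.choices(range(len(logits)), weights=probs)[0] for _ in range(n)]

run1 = sample_sequence([2.0, 1.0, 0.5], temperature=0.8, seed=42)
run2 = sample_sequence([2.0, 1.0, 0.5], temperature=0.8, seed=42)
print(run1 == run2)  # True: nonzero temperature, yet fully deterministic
```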