
If you're building an app on top of LLMs and expect better than 99% correctness from them, you are bound to fail. Handling negative scenarios with workarounds and retries is mandatory.
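A minimal sketch of that retry-and-validate pattern, assuming a caller-supplied call_llm function and a validate check (both names are hypothetical, not any particular library's API):

    import time

    def call_with_retries(prompt, call_llm, validate, max_attempts=3, backoff_s=1.0):
        """Call an LLM and retry when the response fails validation."""
        for attempt in range(1, max_attempts + 1):
            try:
                response = call_llm(prompt)
                if validate(response):
                    return response
            except Exception:
                pass  # treat transport errors the same as bad output
            time.sleep(backoff_s * attempt)  # simple linear backoff between attempts
        # Negative scenario: every attempt failed, so fall back instead of trusting the output
        raise RuntimeError(f"LLM output failed validation after {max_attempts} attempts")

The point is that the validation step and the explicit failure path are first-class parts of the design, not an afterthought bolted onto a call you assume will succeed.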

