
It’s apt, because the only thing LLMs do is hallucinate; they have no grounding in reality. They take your input and hallucinate something “useful” from it.

