
How do you automatically verify that an LLM isn't hallucinating?


Idk, ask LLM researchers... they will figure it out sooner or later.


When people don't know how to do something but think it is easy, it can often take 50 years or more to actually happen. That has happened before in this field.


Your faith is touching.



