hermitcrab on Feb 18, 2024 | on: The AI bullshit singularity
How do you automatically verify that an LLM isn't hallucinating?
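For context beyond the thread: one commonly proposed heuristic (a research direction, not a solution) is self-consistency checking, where the same prompt is sampled several times and disagreement between samples is treated as a hallucination signal. Below is a minimal sketch assuming a hypothetical sample_fn standing in for any model call; the threshold and names are illustrative. Note that agreement does not prove correctness, since a model can be consistently wrong, which is part of why the question remains open.

    # Self-consistency sketch: sample a model N times and flag
    # answers where the samples disagree. `sample_fn` is a
    # hypothetical stand-in for any LLM call.
    from collections import Counter
    from typing import Callable

    def consistency_check(sample_fn: Callable[[str], str],
                          prompt: str,
                          n_samples: int = 5,
                          threshold: float = 0.8) -> bool:
        """True if the most common answer appears in at least
        `threshold` of `n_samples` independent samples."""
        answers = [sample_fn(prompt).strip().lower()
                   for _ in range(n_samples)]
        top_count = Counter(answers).most_common(1)[0][1]
        return top_count / n_samples >= threshold

    if __name__ == "__main__":
        # Stub in place of a real model call for demonstration.
        stub = lambda prompt: "paris"
        print(consistency_check(stub, "What is the capital of France?"))  # True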
mrkramer on Feb 18, 2024
Idk, ask LLM researchers... they will figure it out sooner or later.
Jensson on Feb 19, 2024
When people don't know how to do something but think it is easy, it can often take 50 years or more for it to actually happen. That has happened before in this field.
hermitcrab on Feb 18, 2024
Your faith is touching.