
Clear about what? Do you know the difference between an LLM built on transformer attention and a Monte Carlo tree search system like the one used for Go? You're misreading what they said. It was a fine-tuned model, just as DeepSeekMath is an LLM fine-tuned for math, which makes it a special-purpose model. Read OpenAI's IMO submissions on GitHub to see the proofs.
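To make the contrast concrete, here is a minimal sketch (my own illustration, not from either system) of the two mechanisms being compared: scaled dot-product attention, the core operation inside a transformer LLM, next to a bare-bones random-playout loop of the kind MCTS engines like AlphaGo's use. The function names and toy state encoding are assumptions for illustration only.

```python
import math
import random

def attention(Q, K, V):
    """Scaled dot-product attention for one query over a tiny context.

    Q: query vector; K: list of key vectors; V: list of value vectors.
    A transformer LLM applies this in parallel over every token position.
    """
    d = len(Q)
    scores = [sum(q * k for q, k in zip(Q, key)) / math.sqrt(d) for key in K]
    m = max(scores)                       # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]   # softmax over the keys
    return [sum(w * v[i] for w, v in zip(weights, V)) for i in range(len(V[0]))]

def mcts_rollout(state, step, is_terminal, reward, max_depth=50):
    """One random playout: the inner loop MCTS repeats thousands of times.

    Unlike attention, this explicitly simulates the environment forward,
    which is why the two architectures are not interchangeable.
    """
    for _ in range(max_depth):
        if is_terminal(state):
            break
        state = step(state, random.choice([0, 1]))  # pick a random action
    return reward(state)
```

Attention produces a weighted mixture of stored values in one pass; MCTS searches by simulating moves and backing up rewards, which is why a fine-tuned LLM and a Go engine solve problems in fundamentally different ways.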

