People love pointing out how bad GPT-3 is at math, but that's not remotely surprising. We all learned a long time ago that pattern matching alone isn't enough to solve math problems; at some point we need to actually engage our brains and run algorithms to get valid answers.
Even mathy people fall into this - we all know that some numbers 'look prime'. And up to 100 that instinct can work really well (except for 91, which is 7 × 13).
But we all apply a heuristic for when to stop guessing and start mathing.
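A minimal sketch of what "start mathing" means here (in Python, since the post names no language): trial division settles the question that the "looks prime" instinct gets wrong. The list of instinct-fooling composites beyond 91 is my own illustrative pick, not from the post.

```python
def is_prime(n: int) -> bool:
    """Trial division: the slow-but-certain algorithm, no guessing."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

# Composites that often 'look prime' at a glance (91 is the post's example;
# the others are commonly-cited lookalikes): 51=3*17, 57=3*19, 87=3*29, 91=7*13.
lookalikes = [n for n in (51, 57, 87, 91) if not is_prime(n)]
print(lookalikes)  # → [51, 57, 87, 91]
```

The point is exactly the heuristic above: instinct is fine until the stakes or the numbers get big enough that you switch to the algorithm.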
But guess what: GPT-3 can do that too!
https://twitter.com/goodside/status/1568448128495534081
This thread predates ChatGPT, but it really shows that GPT-3 can pretty reliably spot when winging it won't work. And in this case, it could effectively be given the ability to whip out a calculator to figure it out.
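A hedged sketch of the "whip out a calculator" pattern, not Goodside's exact prompt or format: the model is instructed to wrap any arithmetic it can't wing in a marker, and a host loop evaluates those spans and splices the results back in. The `CALC(...)` marker is hypothetical.

```python
import re

def run_with_calculator(model_output: str) -> str:
    """Replace each CALC(expr) span in the model's output with the
    evaluated result, so the model never has to guess at arithmetic."""
    def evaluate(match: re.Match) -> str:
        expr = match.group(1)
        # eval with no builtins; acceptable for this illustrative sketch,
        # a real system would use a proper expression parser.
        return str(eval(expr, {"__builtins__": {}}))
    return re.sub(r"CALC\(([^)]*)\)", evaluate, model_output)

# Pretend the model emitted this instead of hallucinating a product:
print(run_with_calculator("407 * 1913 = CALC(407 * 1913)"))
# → 407 * 1913 = 778591
```

The design point is that the model only needs to be reliable at *recognizing* when it should defer to the tool; the tool supplies the exactness.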
(Edited to add)
Found the follow up thread where Riley went into more detail and enhanced the results:
https://twitter.com/goodside/status/1581805503897735168?s=20...