Have you tried giving it basic logic that isn't in its training data?

I have. gpt-3.5-instruct required a lot of prompting to keep it on track. Sonnet 4 got it in one.

Terence Tao, the most prominent mathematician alive, says he's been getting LLM assistance with his research. I would need about a decade of training before I could spend a day doing math that Tao can do in his head in under five seconds.

LLMs suffer from terrible, uh, dementia-like distraction, but they can definitely do logic.


