
Humans learn math through natural language and symbols.

Is there any indication that this is a blocker for models learning math?

I don't necessarily think pumping more data into ChatGPT will make it understand math, but I do think it's possible to teach a model to do math through natural language.



Perhaps GPT-like models are already capable enough to do math, but they would need to store what we call mathematical reasoning as one of many distinct processing pathways and tap into it whenever the context is appropriate.
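
To make that concrete, here's a toy sketch of the routing idea, in the spirit of mixture-of-experts layers: a small gate reads the context and softly routes it through one of several specialised pathways. Everything here (the PathwayRouter class, the names, the sizes) is a hypothetical illustration, not how any actual GPT is wired.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class PathwayRouter(nn.Module):
        """Toy mixture-of-experts-style layer: a gate reads the context
        vector and softly routes it through several specialised pathways
        (e.g. a 'math' pathway alongside general language pathways)."""

        def __init__(self, d_model: int, n_pathways: int = 4):
            super().__init__()
            self.gate = nn.Linear(d_model, n_pathways)  # scores how well each pathway fits the context
            self.pathways = nn.ModuleList(
                nn.Sequential(nn.Linear(d_model, d_model), nn.GELU(),
                              nn.Linear(d_model, d_model))
                for _ in range(n_pathways)
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, d_model) context representation
            weights = F.softmax(self.gate(x), dim=-1)                      # (batch, n_pathways)
            outputs = torch.stack([p(x) for p in self.pathways], dim=1)    # (batch, n_pathways, d_model)
            return (weights.unsqueeze(-1) * outputs).sum(dim=1)            # weighted blend of pathways

    layer = PathwayRouter(d_model=64)
    context = torch.randn(2, 64)
    print(layer(context).shape)  # torch.Size([2, 64])

The gating weights are what would, in this picture, learn to send "math-flavoured" contexts through the pathway that encodes mathematical reasoning.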

That's easy to say, obviously, but there is some promising work in this direction [1].

[1] Tracr: Compiled Transformers as a Laboratory for Interpretability, https://arxiv.org/pdf/2301.05062.pdf
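
For what it's worth, Tracr ships as a Python library (google-deepmind/tracr) that compiles RASP programs into actual transformer weights. Here's a minimal sketch along the lines of the example in that repo; the exact module paths and call signatures follow its README and may differ across versions.

    from tracr.rasp import rasp
    from tracr.compiler import compiling

    # RASP program that computes the sequence length: attend everywhere,
    # then count how many positions were selected.
    def make_length():
        all_true = rasp.Select(rasp.tokens, rasp.tokens, rasp.Comparison.TRUE)
        return rasp.SelectorWidth(all_true)

    # Compile the program into a concrete transformer.
    model = compiling.compile_rasp_to_model(
        make_length(),
        vocab={1, 2, 3},
        max_seq_len=5,
        compiler_bos="BOS",
    )

    out = model.apply(["BOS", 1, 2, 3])
    print(out.decoded)  # the program's output at each position (here, the length)

The appeal for this discussion is that the compiled weights implement a known program, so you can inspect what a particular kind of "reasoning" circuit actually looks like inside a transformer, rather than guessing from behaviour alone.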



