
I think you can slice it whichever direction you prefer, e.g. OpenAI needs more than "we ran it on 10x as much hardware" to end up with a really useful AI model; it needs to get proportionally more efficient and smarter as it gets larger. As a side effect, the hardware size (and price) needed for a given model size and level of intelligence goes down too.

In the end, however you slice it, the goal has to be "make it do more with less, because we can't get infinitely more hardware," regardless of which "why" you give.


