
ML inference and training are not the same task.


This plot is about general GPU performance, not pure inference. https://www.apple.com/newsroom/2022/03/apple-unveils-m1-ultr...

Training a model includes inference: the forward pass is the same computation. So even then, for your comment to be relevant, you'd need a plot where Apple compares inference on quantized models against Nvidia, and no such plot exists.
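To make the forward-pass point concrete, here's a minimal PyTorch sketch (the model, shapes, and loss are toy values chosen for illustration, not anything from Apple's or Nvidia's benchmarks). A training step runs the exact same forward pass as inference, then adds a backward pass and a weight update on top:

    import torch
    import torch.nn as nn

    # Toy model and data, purely illustrative.
    model = nn.Linear(4, 2)
    loss_fn = nn.MSELoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

    x = torch.randn(8, 4)       # dummy inputs
    target = torch.randn(8, 2)  # dummy targets

    # Inference: just the forward pass, no gradient tracking.
    with torch.no_grad():
        preds = model(x)

    # Training: the same forward pass, plus loss, backward pass,
    # and a parameter update.
    optimizer.zero_grad()
    preds = model(x)              # forward pass, identical computation to inference
    loss = loss_fn(preds, target)
    loss.backward()               # backward pass computes gradients
    optimizer.step()              # weight update

So any hardware that's slow at the forward pass is slow at training too; the two workloads aren't independent.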


...and doing either of those things with CUDA is impossible on a Mac. Why? Because Apple burned its bridge with Nvidia and threw a temper tantrum, that's why. Now Nvidia can't support macOS even if they wanted to.

That's kinda the point of my original comment. Apple claims to know what's best, but contradicts itself through its own actions. We wouldn't be in awkward situations like this if Apple didn't staunchly box out competitors and force customers to either follow along or abandon the ecosystem. It's almost vindicating for people like me, who left macOS over these pointless decisions.



