Apple doesn’t even sell NVidia cards on their Mac Pros. Are they training it on Linux?
I think Apple would strive to be great at all computing related tasks. “Oh, Macs are not good for that, you should get a PC” should make them sad and worried.
AI/LLM is the new hot thing. If people are using Windows or Linux, you’re losing momentum, hearts and minds… and sales, obviously.
If a train with a GM diesel engine delivers raw materials to a Ford factory making F150s, would you conclude that consumers should start driving trains to work?
Not at all, just that the engineers at Ford would be more proud if the train used their own diesel engine. And that this kind of thing affects public perception. “Ford is not for heavy duty”
Apple doesn't even support NVidia cards on their Mac Pros. The technical details are above my head, but the way Apple M* chips handle PCIe makes them incompatible with GPUs and other accelerator cards, whether you use macOS or Linux.
AFAIK all their cloud services run on x86 hardware with Linux, including Xcode Cloud (which runs macOS in a QEMU VM, in a way that only Apple can technically and legally do).
But no one is training these kinds of models on their personal device. You need compute clusters for that. And they will probably run Linux. I'd be surprised if Microsoft trains their large models on anything other than Linux clusters.
> But no one is training these kinds of models on their personal device
On-device transfer learning/fine-tuning is definitely a thing, for privacy and data-federation reasons. It's part of the reason why model distillation was so hot a few years ago.
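To illustrate the idea (a hypothetical sketch, not how Apple actually does it): on-device fine-tuning typically means shipping a frozen pretrained feature extractor and training only a small head locally, so the user's raw data never leaves the device. All names and shapes below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend these weights shipped with the app: a frozen "backbone"
# (stand-in for a real pretrained feature extractor).
W_backbone = rng.normal(size=(4, 8))

def features(x):
    """Frozen feature extractor -- never updated on device."""
    return np.tanh(x @ W_backbone)

# Small on-device dataset (e.g. user-specific examples); toy labels here.
X = rng.normal(size=(64, 4))
y = (X[:, 0] > 0).astype(float)

# Only this small head is trained locally (logistic regression on
# the frozen features), via plain gradient descent.
w_head = np.zeros(8)
b_head = 0.0
lr = 0.5

for _ in range(200):
    z = features(X) @ w_head + b_head
    p = 1.0 / (1.0 + np.exp(-z))   # sigmoid
    grad = p - y                   # dL/dz for the logistic loss
    w_head -= lr * features(X).T @ grad / len(X)
    b_head -= lr * grad.mean()

acc = (((features(X) @ w_head + b_head) > 0).astype(float) == y).mean()
```

The point of the pattern is that the heavy pretraining happens once on a cluster, while the cheap head update runs on each device; distillation pushes the same idea further by compressing the big model itself.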
Apple would want to train models as fast as they could. Nvidia provides an off-the-shelf solution they can just buy, use, and later resell on the second-hand market, all at a very reasonable price.
If they wanted to use their own hardware they would either need more of it, which would cost a lot and divert production from sellable devices; or they would need to make special chips with much bigger neural engines, which would cost even more.
Also, Apple uses public clouds for much of its services infrastructure. They may not even own the training hardware, just rent it from AWS/Azure/GCP.
> I think Apple would strive to be great at all computing related tasks. “Oh, Macs are not good for that, you should get a PC” should make them sad and worried.
What percent of Apple's customers train models? Does it even crack 1%?
Apple already fails for many types of computing, e.g. any workflow that requires Windows or AAA gaming.
They used to sell the best developer machines. Before Docker, at least. Don’t developers need to train models? And if they’re doing that on Linux or Windows, wouldn’t it be easier for them to support the end product better, or exclusively, on the platform they’re more used to?
You want to be the platform where the excitement is happening.