I wonder if semi-reliable RAM could be made to work for training. After all gradient descent already works in a stochastic environment, so maybe the noise from a few flipped bits doesn't matter too much.
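Purely as a back-of-the-envelope check of that intuition, here is a rough Python/NumPy sketch (everything in it is made up for illustration, and the bit-flip rate is wildly exaggerated compared to real marginal RAM) that injects random bit flips into the gradients of a toy SGD linear regression. With a crude clipping/NaN guard, the fit usually still lands near the true weights, which is roughly the "SGD tolerates some noise" intuition; it says nothing about whether this holds at real training scale.

    import numpy as np

    rng = np.random.default_rng(0)

    def flip_random_bits(arr, flip_prob=1e-3):
        """Flip individual bits of a float32 array with probability flip_prob
        per bit, crudely emulating corruption from marginal, non-ECC RAM.
        (The rate is exaggerated here so the toy example actually sees flips.)"""
        raw = arr.astype(np.float32).view(np.uint32).copy()
        n_flips = rng.binomial(raw.size * 32, flip_prob)
        for _ in range(n_flips):
            idx = rng.integers(raw.size)
            raw[idx] ^= np.uint32(1) << rng.integers(32, dtype=np.uint32)
        return raw.view(np.float32).reshape(arr.shape)

    # Toy linear regression trained with mini-batch SGD on corrupted gradients.
    w_true = np.array([2.0, -3.0], dtype=np.float32)
    X = rng.normal(size=(1024, 2)).astype(np.float32)
    y = X @ w_true + 0.1 * rng.normal(size=1024).astype(np.float32)

    w = np.zeros(2, dtype=np.float32)
    for step in range(500):
        batch = rng.integers(0, len(X), size=32)
        grad = (2.0 / 32) * X[batch].T @ (X[batch] @ w - y[batch])
        grad = flip_random_bits(grad)                  # inject "bad RAM" errors
        grad = np.nan_to_num(np.clip(grad, -10, 10))   # guard: a flipped sign/exponent bit can give huge or NaN values
        w -= 0.05 * grad

    print("recovered weights:", w)  # typically still close to [2, -3]

The guard matters: most flips land in low mantissa bits and barely move the gradient, but an exponent or sign flip can produce an enormous update, so some form of gradient clipping or NaN filtering would presumably be needed in practice.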
Well, it kind of depends. With XMP (which is overclocking) I've found plenty of kits on Ryzen not passing memtest with the XMP settings. Different CPUs seem to be able to run their memory controller harder without error.
And then there are other factors, like more sticks of RAM stressing things further. I had to downclock to get memtest stable when running 4 sticks, even though each kit ran fine on its own. But that is expected, as 4 sticks stress the memory controller even more.
I confess I don't have any real recent experience with DDR5 though, mostly with DDR4 on Ryzen 1000-5000 series.
this reminds me of something i've observed. it seems like there is a general trend in software of doing things that look good (either in an ad or in a sprint review) rather than things that feel good to use. one example among many is nvidia's frame generation feature, which makes 60 fps look like 120 fps when you're watching somebody else play, but feel like 30 fps when you're the one playing.
Image, and the projection of that image, is very important to most humans.
You just need to look at how some people dress in order to "look good", even though it often requires them to make some ridiculous compromises on comfort.
They will not, because they need to preserve at least weak competition, or antitrust regulators will come down very hard on Nvidia.
That is the reason why Intel, over all the previous decades, preserved a tiny stripe of the market for competitors (AMD, of course, but also the likes of Cyrix or SiS), yet immediately hit the brakes whenever a competitor became too competitive - to show regulators that the market is still competitive and not just a monopoly.
The size of the stripe left for outsiders is not the important parameter here; what matters more is that the outsiders never ship standout products in the most important niches.
So the idea is that Arc will not die quickly, but it will constantly lag, only ever second or third.
How could one clip a GPU's wings? Well, first, delay the top products, for example by installing slow RAM and using overly conservative temperature margins, so the chip runs at a lower frequency than it could.
Second, from what I hear, Arc drivers are still not ideal and some games don't run smoothly.
Third, cut all long-term strategic initiatives, like WebGPU.
PS: For other examples, you may have seen the strange behavior of IBM or Commodore/Atari, where they avoided implementing some very obvious things. The reason: they were visited by regulators and warned that they were approaching the formal threshold, and after that visit they hit the brakes and limited their products to avoid becoming the next AT&T.
Which is arguably kind of weird because where is it actually competing with NVIDIA? A hypothetical future, I guess?
But also, does this amount of ownership even give them the ability to kill anything on Intel's roadmap without broad shareholder consensus (not that that's even how roadmaps are handled anyway)?