I guess my question about the current events is: Are people really locked out of LLMs due to price? It seems like everything already has AI in it and that virtually every end user hates it. I could see a shift to more local models or something, but not an increase. I feel like LLMs have largely been oversold as a solution in search of a problem.
Or is the idea that everyone and their mother is going to start training competing models to carve out their piece of the market?
These are the four main problems with LLMs (and related technologies) as I see them:
1. You can't tune them to your needs; they have restraining bolts and the training data is a generic corpus
2. You don't own your interactions with them; your data transits a network and is processed by third-party servers
3. They waste an immense amount of power relative to the usefulness of their output
4. Their responses tend toward uncanny simulacra and hallucination
Bringing the cost way down and making them trainable on consumer hardware solves or at least greatly alleviates problems 1-3. That just leaves problem 4, which might still be unsolvable and sink the whole endeavor, but at least can be focused on.
Absolutely! Even for inference! For nearly all commercial purposes, SOTA models need to run on a consumer's device.
Running Grok-2, DeepSeek, or even Llama 405B requires nearly 400-500 GB of memory.
Buying a tinybox with enough GPU memory costs $15k-25k, and building your own machine costs about the same.
A distributed Mac cluster costs about the same, if not more, if you're buying 2-3 M2 Ultras, each with 192 GB of memory.
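A quick back-of-envelope sketch of where those memory figures come from (weights only; this ignores KV cache and runtime overhead, and the parameter counts are approximate):

```python
# Rough weights-only memory footprint for LLM inference at various
# quantization levels. KV cache and activation overhead are ignored,
# so real requirements run somewhat higher.
def inference_memory_gb(params_billions: float, bits_per_weight: int) -> float:
    """Weights-only memory in GB (1 GB = 1e9 bytes)."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

for model, params in [("Llama 405B", 405), ("DeepSeek-V3 (671B total)", 671)]:
    for bits in (16, 8, 4):
        print(f"{model} @ {bits}-bit: ~{inference_memory_gb(params, bits):.0f} GB")
```

At 8-bit, a 405B-parameter model needs roughly 405 GB just for weights, which is why the 400-500 GB figure (and the multi-GPU or multi-Mac price tag) follows directly from the parameter count.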
So people are absolutely constrained by price and supply here. Every engineer, analyst, and scientist would be far less tethered by rules and regulations, or by the nitty-gritty of policies and terms of service, if they could trust that the LLM they use is completely local, free of telemetry and tracking, and licensed fairly for commercial use (which perhaps excludes Llama).
Not a lot of people can afford to spend $15k-30k on a computer that can run these SOTA LLMs. But you can bet a billion people will buy one when it's $1k.
Not to mention, the north star is hardware that can do training at home. We're a long way off, but once hardware stops being the bottleneck, ideally we'd build models that are continually being trained.
Are people (or companies) locked out of training LLMs due to price?
I don't know the immediate answer. In the long term I expect it to be a resounding "yes," because different LLMs should be good at different things. Either way, this is not about the people adding a cloud client to their software.
> Are people really locked out of LLMs due to price?
Yes. For example, Google just made Gemini a standard part of Workspace; before that it cost something like $36 per user per month. That was too much for many SMBs to experiment with (you and I understand that the potential efficiency gains are worth far more, but to an SMB, paying 4x more for your office suite sounds bad).
Consumers of AI are sensitive to pricing. Many OpenAI customers "ration" their usage. I imagine lower costs open up new demand among these people.
On the service side: it seems to have reduced the cost of operating a "commercially viable" LLM (something people will actually pay money for). But even beyond that, "self-hosted" models are also far more affordable now, which means models that target specific niches can be viable to build or buy.
However, this won't be an overnight phenomenon. In the short term it will seem like demand drops, but in the long term demand will go up. Big caveat: that demand may not be concentrated on the current incumbent players.
And finally, the elephant in the room: AI still needs to become more useful to reach its full potential. Easily a decade+ more needed here.
> Are people really locked out of LLMs due to price?
I think so. When you start using a good model, you soon learn that it can make a real difference. If you have domain knowledge, you can treat an LLM like a design partner. With DeepSeek, usage that would have cost $50 on Sonnet now costs $5. Only having to spend $5 a month can be a real game changer for many people.
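A rough sketch of how that 10x gap shakes out in practice. The per-million-token prices and the monthly workload below are illustrative ballpark assumptions, not official quotes; plug in the providers' current pricing to get real numbers:

```python
# Rough monthly API cost comparison between two per-token price tiers.
# All prices are illustrative assumptions, not official provider quotes.
def monthly_cost(in_tokens_m: float, out_tokens_m: float,
                 price_in: float, price_out: float) -> float:
    """Dollar cost for a month of usage; token counts in millions."""
    return in_tokens_m * price_in + out_tokens_m * price_out

# Hypothetical workload: 10M input tokens, 2M output tokens per month.
sonnet = monthly_cost(10, 2, 3.00, 15.00)   # ballpark Sonnet-class pricing
deepseek = monthly_cost(10, 2, 0.27, 1.10)  # ballpark DeepSeek-class pricing
print(f"Sonnet-class:   ${sonnet:.2f}")     # $60.00
print(f"DeepSeek-class: ${deepseek:.2f}")   # $4.90
```

Under these assumed prices the same workload lands at roughly $60 versus $5 a month, which is the order-of-magnitude difference the comment describes.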
> Or is the idea that everyone and their mother is going to start training competing models to carve out their piece of the market?