
Training a state-of-the-art LLM is currently at least in the $100ks. That stands to drop rapidly, but it's currently more along the lines of "the branches of one million willow trees".
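For a sense of where a figure like that comes from, here's a rough back-of-envelope in Python. Every number in it (parameter count, token count, per-GPU throughput, utilization, rental price) is an assumption for illustration, not a measurement:

    # Back-of-envelope training cost; all inputs are assumed for illustration.
    params  = 7e9               # assumed 7B-parameter model
    tokens  = 1e12              # assumed ~1T training tokens
    flops   = 6 * params * tokens   # ~6 FLOPs per parameter per token (common rule of thumb)

    gpu_flops   = 3e14          # assumed ~300 TFLOP/s effective per GPU
    utilization = 0.4           # assumed 40% utilization
    gpu_hours   = flops / (gpu_flops * utilization) / 3600

    dollars_per_gpu_hour = 2.0  # assumed rental price
    print(f"{gpu_hours:,.0f} GPU-hours, ~${gpu_hours * dollars_per_gpu_hour:,.0f}")

Those guesses land around $200k of GPU time; swap in cheaper hardware, a smaller model, or better training efficiency and it drops fast.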

So long as it's not something an individual can easily achieve, regulations can seriously hinder development. The FDA kept the COVID vaccine from general use for nearly a year because they have a regulatory apparatus that companies know better than to ignore. We had a baby formula shortage because the FDA said "no, you can't use EU-approved baby formula until we approve it." Now there's an Adderall shortage because the government said "make less of this" and everyone said "yes, sir, whatever you say, sir."

There's certainly a good deal of regulation violation and wrist slapping in our world, but regulations mostly get followed, especially when enforcement is severe enough.



If the $100k is just “GPU time”, it's certainly within the reach of many people - not even the super rich.

And maybe bitcoin miners could be repurposed for it or something.


This may be in the $10^7 category now, but is there any reason to believe it will never be $10^3?
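For scale, a toy projection of that drop; the halving period here is purely an assumption:

    import math

    current_cost   = 1e7   # the $10^7 figure above
    target_cost    = 1e3
    halving_months = 18    # assumed rate at which cost for fixed capability halves

    halvings = math.log2(current_cost / target_cost)   # ~13.3 halvings needed
    print(f"~{halvings * halving_months / 12:.0f} years at that rate")

That prints roughly 20 years; double or halve the assumed rate and the answer changes a lot, which is really the whole question.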

Oddly, the most pressing concern is “increased productivity”.


Unless it has something like the intentional self-ratcheting of Bitcoin mining's difficulty adjustment, I do not see how the price wouldn't rapidly drop.
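For anyone unfamiliar, that ratchet is the difficulty retarget: every 2016 blocks the target is rescaled so blocks keep taking ~10 minutes no matter how much hashpower shows up, which is what keeps per-block mining cost from collapsing as hardware improves. A minimal sketch of the rule (not consensus-exact, edge cases omitted):

    EXPECTED_TIMESPAN = 2016 * 600  # two weeks in seconds

    def retarget(old_target: int, actual_timespan: int) -> int:
        # Bitcoin clamps the adjustment to a factor of 4 per retarget.
        actual_timespan = max(EXPECTED_TIMESPAN // 4,
                              min(actual_timespan, EXPECTED_TIMESPAN * 4))
        # Blocks came too fast -> smaller target (higher difficulty), and vice versa.
        return old_target * actual_timespan // EXPECTED_TIMESPAN

Training has no analogous feedback loop: nothing gets harder just because more compute shows up.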

And if the models can be built once and then distributed, they will certainly leak at some point, even if only because a hostile actor releases them intentionally.


I can’t see how it could be conceived to work like Bitcoin. The only reason that works is that a majority of humans agree to use the same code.



