Hacker News

And thus eliminate the reason many (if not most) of these books and webpages existed in the first place.

This may be fair in the short term, but on a longer timescale, it leaves us worse off unless LLMs master reasoning about and being creative with things not in their training set. Possible, but a crapshoot right now.



Keep in mind that success at this endeavor means that UBI becomes possible. With UBI we can expect to see more human creativity, not less, as humans are no longer forced to do a robot's job.

Copyright law cannot be allowed to stand in the way of this. If it does, break it. It's that important.


In absolutely no way, no way at all, does any of this imply that UBI will exist. A society structurally opposed to the idea that suffering should be avoided when it can be avoided is no more likely to adopt UBI because of large language models than for any other reason. Penury (for others) is a feature here, not a bug.

Couple "the cruelty is the point" with figuring out exactly who will be giving up resources for UBI, and you have a very long row to hoe to make that a reality.


> Keep in mind that success at this endeavor means that UBI becomes possible

UBI is already technologically possible. Yes, as automation becomes even more powerful and abundant, the feasible payouts from UBI will increase. That's never been the problem. The problem is that hundreds of millions of Americans (and a significant portion of everyone else in the world) would rather live in poverty as indentured servants to their rich masters than ever see those people benefit even a tiny bit from any government policy. Cruelty and suffering are literally what they want in the world; they fight tooth and nail to prevent anything that might uplift all of humanity (themselves included). They are deeply disturbed, horrible, irredeemable pieces of shit, and they're the majority.

And the very tiny minority of people who could ever do anything about this are exactly the ones who are poised to take all the benefits from automation and AI. I'm not scared of AI. I'm scared of humans. And very soon, the humans who already own everything are going to have no need for the rest of us.


> Keep in mind that success at this endeavor means that UBI becomes possible.

UBI is always possible. This might raise the achievable minimum support level, but if it arrives without UBI existing first, it will also raise the concentration of economic and political power in the hands of entrenched interests who would want to prevent UBI (since UBI would redistribute resources outward from them).


If copyright were at least set back to 14 years, people would still write books.

After that we could explore other economic models that make sense in a world of internet and AI, instead of blindly sticking with the system we designed for the world of the printing press.



