Hacker News

> The faster we build the future, the better.

Famous last words.

It's not the fall that kills you, it's the sudden stop at the end. Change, even massive change, is perfectly survivable when it's spread over a long enough period of time. 100m of sea level rise would be survivable over the course of ten millennia. It would end human civilization if it happened tomorrow morning.

Society is already struggling to adapt to the rate of technological change. This could easily be the tipping point into collapse and regression.



False equivalence. Sea level rise is unequivocally harmful.

While everyone getting an Einstein in their pocket is damn awesome and incredibly useful.

How can this be bad?


Because there’s a high likelihood that that’s not at all how this technology is going to be spread among the population or across countries, and this technology is going to be way more than an Einstein in a pocket. How do you even structure society around that? And what about all the malicious people in the world? Now they have Einsteins too. Great, nothing can go wrong there.


>What about all the malicious people in the world. Now they have Einsteins.

Luckily, so do you.


I’m thinking of AI trained to coordinate cybersecurity attacks. If the solution is to deeply integrate AI into your networks and give it access to all of your accounts to perform real-time counter operations, well, that makes me pretty skittish about the future.


How would that help? Countering madmen with overly powerful weapons is difficult and often leads to war. Classic wars, or software/DDoS/virus wars, or robot wars, or whatever.


You can use AI to fact-check and filter malicious content. (Which would lead to another problem, which is... who fact-checks the AI?)


This is where it all comes back to the old "good guy with a gun" argument.


There’s a great skit on “The Fake News With Ted Helms” where they’re debating gun control and Ted shoots one of the debaters and says something to the effect of “Now a good guy with a gun might be able to stop me but wouldn’t have prevented that from happening”.


There is a very, very big difference between "tool with all of human knowledge that you can ask anything to" and "tool you can kill people with".


The risk is there. But it's worth the risk. Humans are curious creatures, you can't just shut something like this in a box. Even if it is dangerous, even if it has potential to destroy humanity. It's our nature to explore, it's our nature to advance at all costs. Bring it on!


> How can this be bad?

Guys, how can asbestos be bad, it's just a stringy rock, ehe.

Bros, leaded paint? Bad? Really? What, do you think kids will eat the flakes because they're sweet? Aha, so funny.

Come on, freon can't be that bad, we just put a little bit in the system, it's like nothing happened.

What do you mean we shouldn't spray whole beaches and school classes with DDT? It just kills insects, obviously it's safe for human organs.


We thought the same 25 years ago, when the whole internet thing started for the broader audience. And now, here we are, with spam, hackers, and scammers on every corner, and social media driving people into depression and suicide and breaking society slowly but surely.

In the first hype-phase, everything is always rosy and shiny, the harsh reality comes later.


The world is way better WITH the internet than it would have been without it. Hackers and scammers are just the price we pay.


The point is not whether it's better or worse, but the price we paid and the sacrifices we made along the way, because things were moving too fast and with too little control.


For example, imagine AI outpacing humans at most economically viable activities. The faster it happens, the less able we are to handle the disruption.


The only people complaining are a section of comfortable office workers who can probably see their jobs possibly being made irrelevant.

The vast majority don't care, and that loud crowd needs to swallow its pride and adapt like every other sector has done throughout history, instead of inventing these insane boogeyman predictions.


We don't even know what kind of society we could have if the value of 99.9% of people's labor (mental or physical) dropped to basically zero. Our human existence has so far been predicated and built on this core concept. This is the ultimate goal of AI, and yeah, as a stepping stone it acts to augment our value, but the end goal does not look so pretty.

Reminds me of a quote from Alpha Centauri (minus the religious connotation):

"Beware, you who seek first and final principles, for you are trampling the garden of an angry God and he awaits you just beyond the last theorem."


We’re all going to be made irrelevant, and it will be harder to adapt if things change too quickly. It may not even be us that needs to adapt but society itself. Really curious where you got the idea that this is just a vocal minority of office workers concerned about the future. It seems like a lot of the ones not concerned are super confident software engineer types, which isn’t a large sample of the population.



