
For nuclear weapons, even if plans were publicly available, getting the right kind of uranium would still be a major hurdle for 99.9% of the people.


But what if they weren't? What if one could create something with similar destructive force with materials easily accessible, and the only hurdle is knowledge of how to do so?

It's honestly an interesting moral question.


This is explored further in Nick Bostrom's Vulnerable World Hypothesis. He believes that not even a total global surveillance system would be able to stop a single teenager with the recipe for a homemade pathogen. Nothing changes about the laws of the universe in that case, only the fact that the information was not in our minds before and then it is, and we collectively cannot forget or uninvent it.

We as a species have survived only ~75 years since we became aware of nuclear weapons, a small fraction of our overall history.

This has taught me to be mindful of the state of being unaware, because once you are aware of something it is impossible to reverse unless you're an amnesiac. Collectively forgetting something as a society then becomes infeasible without the technology and the willingness to revert our knowledge to a safer state, and a single outlier can ruin the entire system. When Jobs held up the first smartphone and we all started cheering and fantasizing, did any of us think it would lead to this?

(I also think that a pill to forget the experience of [addictive thing] would be revolutionary, but sadly our ingrained curiosity might undo the effect shortly afterward. A lot of tech takes advantage of our curiosity.)

This makes me wonder: what compels us to keep researching AI despite us also being able to hypothesize all these game-over scenarios? Why does it feel like we have no choice but to be crushed by unbounded progress?

Perhaps 50 years from now we may find that halting all research done in the name of curiosity was the only option that could have preserved our culture/existence/values for another few decades.


> When Jobs held up the first smartphone and we all started cheering and fantasizing, did any of us think it would lead to this?

Jobs didn’t hold up the first smartphone and smartphone proliferation has little to do with the AI research that led to deepfakes.


I was more trying to point out that what we hope comes out of new technology in the future may not necessarily come true, or the benefits might come with as-yet-unknown downsides that only become visible once the technology has proliferated extensively (and irreversibly).


You keep saying “we” and “us”. Who the fuck are you talking about? Speak for yourself.


Over time the number of single humans that have the power to wipe out a sizeable portion of life on Earth (which is almost certainly non-zero already) will almost certainly increase due to advances in technology and dissemination of information.

Do we aim for security through obscurity, or work out how to deal with that increasingly terrifying reality? I think at some point we will be forced to do the latter.


This trend is clearly real, but what exactly do you mean “work out how to deal with that?” One of the obvious ways to deal with it, albeit imperfectly, is in fact to keep these technologies as obscure as possible for as long as possible.

Maybe an interesting norm to try to generate in this community would be, “if you plan on publishing attack X, you should publish immediately alongside it your best guess as to the effective counter to X.”


Keeping these technologies in the hands of a few has its own set of severe negative consequences.

In the case of nuclear weapons, one of those consequences is that poor countries are denied the ability to defend themselves.

This led to the bullying and invasion of countries without nuclear weapons by countries with nuclear weapons (most recently Russia invading Ukraine, but China and the US are equally guilty).

I'm reminded of a line from the TV show The West Wing, something like: "India must and will be a nuclear power. That way, we'll never be dictated to again."


Even an extreme example: the technology to build an automated drone with a camera for targeting and a gun on a gimbal, and deploy it to carry out a mass shooting, all exists today. All it would take is for some madman to pick the pieces up and assemble them. In fact, I'd be willing to bet $100 that we'll actually see some kind of tragedy to that effect in the USA within the next 5 years. The technology cats are very quickly getting out of the bags.


Many airsoft players are mounting fully automatic airsoft rifles on drones, and some Russian dude already put a Glock on a 1st-gen DJI Phantom years ago. My guess is a big thing stopping such an attack from happening is that the people doing these attacks are usually not well educated, so they don't have the arduino-starter-kit level of electronics knowledge required to pull it off.


Mental instability happens at all points on the education and intelligence spectrum though, so if that's what we're banking on, we're going to have a very rude awakening.


Your example is not extreme. Off-the-shelf "drone bombers" have already been used in the Middle East.


So then what about releasing data on how to enrich uranium at home or selling it on <web store>? I know usually there are multi-million dollar facilities required for enriched uranium production but why shouldn't the free market allow people to buy enriched uranium - it isn't our place to morally prejudge what people may use it for and the bad people will have access to it anyways... so why can't I buy a kilo of U-235 on Amazon?

I'm sure there are some very worthy technical efforts to improve mankind that are being hampered by the lack of easily accessible U-235 so why are we restricting this material from Chemists that want to Chem and Physicists that want to Fizz?


But what are the consequences of the .1% of people who get over that hurdle?


They end up with something that can still be detected by most of the planet, and that makes everyone very angry.



