
What do you propose are the biggest filters ahead? And what do you think are the biggest ones in the past that we overcame?


The usual, in both cases.

The biggest of the past: a relatively thin crust that allows non-catastrophic volcanism; a dual-composition crust that gives us both oceanic and continental plates; the right amount of water for massive oceans without becoming a water world; surviving the oxygenation catastrophe; the evolution of multicellular life and the Cambrian explosion; inorganic carbon sinks and sources balanced well enough to sustain a billion-year evolutionary history; and being cooperative enough to work together while competitive enough to develop new tech for “our side” (too little competition and we’d have stopped at c. 1860 tech and communism; too much and we’d be at each other’s throats, unable to sustain the global supply chains needed for modern computers).

Arguably the Moon is a big part of many of these things.

My best guess for the future: as we get more tech, it gets easier for individual insane people to blow up important things and/or kill lots of people. AI (never mind AGI/ASI) is just one of many such technologies that make this risk bigger, but even just sufficiently cheap electricity and manufacturing makes it (relatively) simple to use a cyclotron to enrich uranium.

The only way I can see past that is a ridiculous and unpleasant level of surveillance, which you’d need some kind of AI to achieve in the first place, with all the downsides that surveillance brings; and that’s still the case even if you’re “only using the surveillance AI to find and section dangerously insane people, honest”.


> My best guess for the future: as we get more tech, it gets easier for individual insane people to blow up important things and/or kill lots of people.

I've been saying this recently. I think EM or someone similar could potentially take and defend a large area with the massive number of drones he could buy, or build companies to produce. And that's without inventing any new tech, just mass-producing specialized drones.

That's scary enough, but then imagine some small nation-state you've never thought much about, that has all that manpower and collective wealth, and maybe some ambition...

Tech can concentrate an enormous amount of power into a very small number of hands.


Sounds like we’re on the same page. As I’ve thought about this, I can’t escape the disorienting feeling that many more filters lie in the past than in the future (and the ones with the worst odds are in the past too). Do you perceive the same? And does the cumulative probability of future filters seem smaller than that of past filters?


Given all the exoplanets with no signs of life, either abiogenesis is very hard or life is very fragile (~1e-3 or fewer planets both develop life and retain it long enough to pass through an oxygenation catastrophe, though that squishes several filters together).

It's difficult to do more than guess at past filters beyond oxygenation, given a sample size of n=1.
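The "squished filters" point is really just multiplication: if each filter is (roughly) independent, the cumulative pass probability is the product of the per-filter probabilities. A toy sketch in Python, where every per-filter number is a made-up illustrative guess, not an estimate from this thread:

```python
# Toy model: independent "filters", each with a pass probability.
# All numbers below are hypothetical placeholders for illustration.
past_filters = {
    "abiogenesis": 1e-2,
    "surviving oxygenation": 0.3,
    "multicellular life": 0.1,
}

# Cumulative probability of passing every filter is the product.
p_past = 1.0
for p in past_filters.values():
    p_past *= p

print(f"cumulative past-filter probability ~ {p_past:.1e}")
# prints: cumulative past-filter probability ~ 3.0e-04
```

The takeaway is only structural: even a handful of individually plausible filters compounds to a very small cumulative probability, which is why a single aggregate number like ~1e-3 can hide several separate hurdles.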

Future? All unknown-unknowns. Even if you rule out paperclip-style AI-gone-wrong scenarios by assuming only weak, narrow AI slightly less capable than what we have today, the mere ability to get a million colonists to Mars, even with just SpaceX's Starship, requires enough space industry to be a direct military threat to Earth.



