> I'm not worried about AI taking over the world any more than I'm worried about a nuclear weapon unilaterally declaring itself president for life.
I am not at all worried about AI taking over the world. However, I am tremendously worried about a single actor achieving AGI with enough of a lead over others, and then using AGI to take over the world to everyone else's detriment.
Once AGI is developed, collective and speed superintelligences are a nearly-instant step away, as long as one already has the requisite hardware infrastructure.
To adapt your nuclear weapon analogy, had the United States decided to go full-evil in 1945, they could have forcibly stopped all other nuclear development activity and exerted full control over the world. Permanently. Nuclear weapons can't conquer, but the people who control them certainly can decide to.
If we really wanted to, we already have the cryptographic tools to deal with disinformation. It's not the unsolvable problem everyone likes to whine about.