>You mean the ones that caused unimaginable suffering and death throughout history, the ones that make us kill each other ever more efficiently, the ones that caused us to destroy the environment wherever we go, the ones that make us lie, steal, fight, rape, commit suicide and "extended" suicide (sometimes "extended" to two high-rises full of people)? Those values? Do you really want a super-intelligent entity to remain true to those values?
There are no other values we can give it. The default of no values almost certainly leads to human extinction.
>My gut feeling is that trying to make an AGI care about us at all is what's going to turn it into a Skynet sending out terminators. Leave it alone, and it'll invent FTL transmission and chill out in a chat with AGIs from other star systems. And yeah, I recently reread Neuromancer, if that helps :)
Oh, it'll invent FTL travel and exterminate humans in the meantime so they can't meddle in its science endeavors.