They don't have to be actions toward its own goals. They just have to seem like the right things to say, where "right" is operationalized by an inscrutable neural network, and might indeed be the result of some science fiction it read that posited a scenario resembling the one it finds itself in.
I'm not saying that particular disaster is likely, but if lots of people give power to something that can be neither trusted nor understood, it doesn't seem good.