The Zygon
2018-08-10 01:32:00 UTC
I think that it would be a bad idea for human beings to develop robots to the point of human-equivalent intelligence and self-awareness. First, I don't see why, once they had got that far, they would stop there. They would be bound to surpass us, possibly to a level we are unable even to understand. And they may not be benevolent.
But even if they are benevolent, what could we be to such superior beings except pets? It seems to me that the best we can hope for in such a case is that they decide to be uninvolved in our affairs. And if they are uninvolved, why build them (the early versions, that is) in the first place?
In the Iain M. Banks Culture universe, it seems to me that it would not be wholly inaccurate to consider the humanoids in Culture society to be the pets of the Minds. But even if so, the Minds seem to be highly indulgent masters.