I think you are confused about what an AI is.
AI doesn't imply self-awareness or consciousness.
You can make an AI that just regulates stuff without ever asking questions.
Yeah I mentioned that earlier. You need intelligence so that your machines can be autonomous and creative in their decision-making.
But you want to avoid self-awareness because you want to make sure they will obey.
Consciousness is only desirable if you want to create a computational replica of your mind. If you want immortality or something.
Well, I don't think you want creativity in this case.
The whole idea of RBE is that the decisions the AI takes are more scientifically sound than what humans could oversee. So the idea is that it needs to be based on facts, not creativity.
Autonomy is not a problem per se. Your computer does lots and lots of autonomous things.
The problem is maybe that we would not like the cold, hard decisions such a system would make without our personal consent, and with no human emotions to fall back on.
That's exactly why I would prefer the new intelligence to be sentient and/or endowed with consciousness...
But wait... What do we mean when we say endowed with consciousness? Do we mean merely self-aware, or also emotionally aware of other living things?
The latter is what I am calling for.
Yeah, well, there is a problem with that.
Emotions are what make humans unpredictable.
Emotions are what make humans evil.
Greed is an emotion.
And that's aside from the fact that emotion is even more specific to humans than intelligence is.
So any emotion we build into an AI will be fake. If you want real emotion, you would need to evolve it, and that means presenting the developing mechanism with the same kind of environment that made us develop these qualia we call emotions.
In other words, we currently have no practical way of creating emotional machines.