Creating an AI version of ourselves would mean sitting and talking with the software every day, around the clock, for months, because if you want it to learn you, it has to ask you millions of questions. Letting an AI learn you means fully handing over your privacy to it. Does anyone really want that?
I guess they could also try to get a copy of your memory instead of making you teach the AI about yourself. I wonder what it would even look like to try to pass your consciousness into an AI. It sounds like trash to me, though. When we pass on, it's better to leave your inheritance to someone you trust, like a family member, instead of relying on some AI that can be controlled by a central authority. The dead should be allowed to rest, not cloned into some shitty AI or robot to carry on their existence.