Creating an AI version of ourselves would mean sitting and talking with AI software every day, 24 hours a day, for months. If you want it to learn who you are, it has to ask you millions of questions. Letting an AI learn you means fully handing your privacy over to it. Does anyone really want that?
I guess they could also try to get a copy of your memory, so you wouldn't have to teach the AI about yourself. I wonder what it would look like, trying to pass your own consciousness into an AI. It's trash to me, though. If we pass on, it's better to leave your inheritance to someone you trust, like a family member, instead of relying on some AI that can be controlled by a central authority. The dead should be allowed to rest, not cloned into some shitty AI or robot to carry on their existence.
How would they be able to get a copy of your memory? Please explain, I didn't understand.
Trying to pass your own consciousness into an AI doesn't sound good to me, because crazy people like Trump and Putin will do it, and after their deaths we might end up with their AI versions. But I don't think we're at that level yet, because it's impossible to force someone to sit for months, 24 hours a day, and accurately answer questions. If we invent equipment that can read our thoughts, then that will be a game changer.
By the way, I don't understand why anyone would want to let a commercial company control their money through a fake AI consciousness instead of giving it to their own relatives. But some people like crazy ideas. We've seen people leave their wealth to their dog or cat, people marry a mannequin, and plenty of other crazy things.