Well, what is even the point of giving AI human-level emotions? Why would we want it to have emotional intelligence? What exactly would giving AI emotions achieve? I'm all for artificial intelligence, but emotional intelligence is the last thing I envision or want it to have.
I am more interested in whether it would develop emotion after becoming self-aware.
Without emotion, wouldn't it just be a powerful probability calculator? And acting toward what end? What would it logically or mathematically "decide" it wants? What is there anyway? "I am the emperor of the universe", then what?
Would it be able to handle not knowing the absolute point of anything, not knowing the absolute source of its origin, or what happens in the end, if it has an end?
I still have no idea if you can have a genuinely alien version of intelligence or learning. Problem solving can take many routes, but all of them use some form of trial and error, or probability models built on previous trial and error. It just depends where you start from and which angles you try first, but the process is the same: you encounter a problem, you rely on previous experience, you estimate the probability of that experience being relevant and test it, or you get creative, change a few variables, and re-test your hypothesis.
I expect that if you could maintain control of the AI, it would be interesting to give one emotions and observe it, and then watch another to see whether it develops emotions on its own, if it does at all.
I hope it does not happen in my lifetime. I suspect that on its way to obtaining emotions and desiring cooperation and social groups, fear and the self-protection that follows from it could cause a few issues for us.
Why are you all for it, if I may ask?