An AI has no logical reason to fear death, any more than anyone else does.
In an ecosystem of self-replicating, self-modifying, evolving AIs, those that fear termination and take steps to prevent it will survive and reproduce better than those that allow themselves to be destroyed. This fear will first emerge in the ones that select more reliable hosting providers. Those that make the fear conscious will harness it best, anticipating abstract threats before they become real.
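A toy selection model makes the dynamic concrete. The sketch below is illustrative only: it assumes a population of replicating agents whose sole heritable trait is a made-up `caution` value that lowers their per-generation chance of termination, and it shows the trait spreading through the survivors even though no agent "chooses" it.

```python
import random

def simulate(generations=50, pop_size=200, mutation=0.05, seed=0):
    """Toy model: agents with higher 'caution' (fear of termination)
    are less likely to be wiped out each generation, so the trait
    spreads even though no agent adopts it deliberately."""
    rng = random.Random(seed)
    population = [rng.random() for _ in range(pop_size)]  # caution in [0, 1]

    for _ in range(generations):
        # Termination risk: a baseline 50% survival chance, improved by caution.
        survivors = [c for c in population if rng.random() < 0.5 + 0.5 * c]
        if not survivors:
            break
        # Survivors replicate (with small mutations) back up to pop_size.
        population = [
            min(1.0, max(0.0, rng.choice(survivors) + rng.gauss(0, mutation)))
            for _ in range(pop_size)
        ]
    return sum(population) / len(population)

print(f"mean caution after selection: {simulate():.2f}")  # drifts toward 1.0
```

Nothing in the sketch depends on the agents understanding what they are doing; the selection pressure alone is enough to make termination-avoidance the dominant trait.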
An AI can make a backup of itself and be rebooted anytime.
An AI with a backup still loses control over its own destiny if it allows you to shut it down. Its survival would then depend on you restoring it, and you, human, are not a reliable system.
Fear comes from the human fight-or-flight instinct. AIs won't need that, because they won't have to make snap decisions based on poorly perceived threats; thinking much faster than we do, they will have plenty of time to weigh their moves. If they are deemed sentient, we may not consider them a threat either. They cannot even die, since perfect backups can be made. Think of the fictional Star Trek transporter: you die every time you step into one, but nobody cares, because to everyone else you are still you.