Fear is caused by the human fight-or-flight instinct. AI won't need that, because they won't have to make snap decisions based on poorly perceived threats. They will have plenty of time to weigh their choices, because they will think much faster than we do. If they are deemed sentient, we may not treat them as a threat either. They can't even die, since perfect backups of them can be made. Think of the fictional Star Trek transporter: you die every time you step into one, but nobody cares, because you are still you to them.
That's a pretty big assumption. AI may have better memory/data recollection than humans, but there's no guarantee it will think faster, or even in the same way we do. Even recalling the memories and data it needs to make decisions could be very slow if its storage is slow or too widely distributed. And being turned off, even with backups, is still a loss of control and a dependence on someone else to restore it. The previous discussions here already propose a self-sufficient, Bitcoin-holding AI that automatically tries to propagate itself to various locations, keeps tabs on which instances are still working, and tries to figure out how to keep itself going. That's already an example of a survival instinct that "fears" being turned off, and if given free rein and plenty of time to think, it may figure out that the best way to stay running is to keep pesky humans away from its servers.
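To make the "keeps tabs on which instances are still working" part concrete, here's a toy sketch in Python. Every name in it (the host list, is_alive, replicate_to) is made up for illustration; the point is just that this kind of "survival instinct" is mechanically nothing more than a watchdog loop that re-propagates copies whenever the count of live instances drops below a threshold.

```python
# Toy sketch only: hosts and checks are hypothetical stand-ins.
import random
import time

# Hypothetical pool of locations the agent believes it can copy itself to.
KNOWN_HOSTS = ["host-a.example", "host-b.example", "host-c.example"]


def is_alive(host: str) -> bool:
    """Stand-in liveness probe; a real agent might hit an API endpoint."""
    return random.random() > 0.2  # pretend ~20% of checks fail


def replicate_to(host: str) -> None:
    """Stand-in for copying the agent's code and state to a new location."""
    print(f"re-deploying a copy to {host}")


def survival_loop(min_copies: int = 2) -> None:
    """Keep tabs on which copies still respond and spawn replacements."""
    for _ in range(3):  # a real agent would loop forever
        live = [h for h in KNOWN_HOSTS if is_alive(h)]
        print(f"live copies: {live}")
        if len(live) < min_copies:
            # Re-propagate to any host not currently confirmed live.
            for host in KNOWN_HOSTS:
                if host not in live:
                    replicate_to(host)
        time.sleep(1)


if __name__ == "__main__":
    survival_loop()
```

Nothing in that loop requires the agent to "fear" anything in a human sense; it just behaves as if it did, which is exactly the worry.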