I know this may sound weird, but I do get this suspicion that there is something wrong with us: the way we live, the way we treat each other, and the way we feel we need to control others to give ourselves more power.
This is not about America; this is about life on this Earth. Something feels wrong and off, and it always has.
Would you rather your feelings remain just suspicions, or would you like to see relevant evidence that would convert them into knowledge?
The Truth can never be said or really put into words. People don't like change, and they will do whatever is necessary to preserve their own way of life.
Even if I really knew what was going on, I would never talk about it, because then I would be putting myself in harm's way and risking my safety and credibility. Simply put, someone would destroy my character.
We put ourselves in this delicate situation we call life.