I know this may sound weird, but I do get this suspicion that there is something wrong with us: the way we live, the way we treat each other, and the way we feel we need to control others to give ourselves more power.
This is not about America; this is about life on this Earth. Something feels wrong and off, and it always has.
Would you rather your feelings remain just suspicions, or would you like to see relevant evidence that would convert them into knowledge?