People also take medicine for every little thing.
That's a completely different problem, though it's completely true.
Medicine that does not cure what ails them, only covers up the symptoms.
That's mainly because lots of diseases can't be cured; they have to be fought off by the body, not by medicines.
Big pharma has taken over the world.
Taken over the world, I don't know, but they have a big influence, that's for sure.
The wool has been pulled over everyone's eyes.
If your doctor gives you a drug that does not cure you, he is admitting that he is incompetent.
Or that he made a mistake, which can happen to anyone. But most doctors give you drugs to cure you... at least in my country xD