The first thing that comes to my mind whenever I hear about the media writing negative articles is one reason only: "they were paid to do so."
We can't really trust the news we hear, read, and see because politics has reached journalism. Journalists write what they are paid to write, and only a handful of them truly report the truth and practice real journalism. Most of the time, they report whatever is current and trending without further investigation, not knowing the impact it can cause.