People often talk about how the USA was the good side and Japan the evil side in World War 2, but when you look at the history of every country involved, you start to find it hard to take sides in any conflict. Even though I hate Nazis, I can't see the Russians, British, or French as the good side either; those were all imperialistic countries that killed and enslaved millions of people before Hitler, and kept doing so after he was gone. I believe one of the biggest reasons for Hitler's rise to power was precisely Great Britain's and France's foreign policy.
With regard to the Second World War and almost every other war, the aggressor is the one who started the war and is therefore the bad side. The Japanese launched a preemptive strike on the USA even though the US was not yet officially in the war.
That does not make sense at all. So every independent country in the New World and Africa was the bad guy for declaring war on its colonizers? Also, in the Second World War the United States was already more than involved, controlling trade routes and aiding England in the war effort. The Japanese saw no other way to win the conflict but to attack first; the way things were playing out, they had no other choice. It was attack or be suffocated by the Allies' policy.
Their other choice was to stop invading other countries as part of their aggressive expansion. Japan's actions were condemned by the League of Nations, so they withdrew from the organization and continued their efforts. The US thought it was still negotiating in good faith with Japan when the attack on Pearl Harbor happened. I get that, from a strategic standpoint, the only way to continue their conquest was to control the Pacific, and to do that they had to take out the US Pacific Fleet, so their only chance was the element of surprise. But saying they had no other choice isn't really accurate, IMHO. They were pretty clearly the aggressors in the conflict, so they had plenty of chances to end things peacefully.