Battenbergcake said:
Now it's a common fact that the American army came to aid the allied forces engaged across Europe.
However Americans believed this meant they single handily turned the tide of the war and "won it".
Man, what massive sweeping generalizations.
"Now it's a common fact that" - yeah, right, you wish. Humankind is so rife with conflicting opinions that there exists no such thing as a common fact. In America, we've got a few people who aren't completely convinced the Civil War is over, who believe the moon landing was faked, and you actually think there's going to be a common agreement over here that we only "came to the aid of allied forces"? About the only thing we agree about in America is to disagree about everything.
"the American army came to the aid of allied forces" - Is this supposed to be a major revelation? It's not like we came into World War II and said, "okay, all the rest of you stop fighting, we'll do everything now." Initially, we actually were going to completely butt out of this war, but Pearl Harbor [http://en.wikipedia.org/wiki/Attack_on_Pearl_Harbor] rather forced us to participate.
"However Americans believed this meant they single handily turned the tide of the war" - Sounds to me like what you need to prove here is that, if you removed America from the equation of World War II, the allies would have won anyway. Good luck with that; this is something a historian wouldn't claim unless they were high as a kite, as it involves massive reimaginings of history. If you want to speculate that hard, let's say that if the allies did win without America, Stalin would have decided the opportunity was ripe to take over the entire land mass.
"and 'won it'" - Exactly how does one "win" a war? It's a massive clusterfuck of death and destruction brought about by failed diplomacy. I will say, however, that it sure did wonders for the American economy. WWII dragged us up out of the Great Depression into a massive economic boom. Pity we pissed it all away over the next 50 years.
Like it or not, whether or not America wanted to be part of the war, or was even the largest influence, we did finish the war... with the nuclear weaponry we invented for the occasion. And we've been living with the pall of the potential annihilation of the entire human race hanging over our heads ever since. Still want to underplay American involvement?