I see a lot of comments about World War 2 "not happening till America said it did." Well, our textbooks usually treat U.S. involvement as the main topic and the rest of you as a subtopic. Most Americans know the war technically started in 1939, but it wasn't a world war until the U.S. and Japan got seriously involved...