Americans didn't win the world war on their own
I may sound very nationalist, but I have to say this.
Why do all Americans think they won WWII on their own?
The British and the French started the fight against the Nazis. Then France fell, and Britain was on its own for a whole year; America refused to fight with us! They only fought when they were attacked, whereas we fought because the Nazis were bullying Europe's small countries.