A History Question
I am not very well up on WWI history, except through a few movies like "Mrs. Dalloway" and my own family history; my husband's grandfather did not enter WWI because he was fighting the rampant influenza of the time, and his grandmother lost her older brother in France, where he was fighting in WWI. She never got over the loss.
It seems to me that there is a huge difference between WWI and WWII. WWII was fought with pride and honor, and with the sense that "there was a reason to be fighting." (Someone else, the Japanese, had initiated the war.) Though I have read history books about WWI, I have never quite understood it. Did America join the fighting in Europe just because her allies were fighting?
I understand that WWI was the time when "battle fatigue" (then called "shell shock") first started being talked about as a result of war, and when modern combat came to include fighting by airplane.
But there has always seemed (to me at least) to be a sort of "hush" about WWI.
Was it considered "an unpopular war" privately, but no one wanted to appear "unpatriotic" by refusing to fight? Were mothers especially loath to send their sons to a war that was being fought on foreign soil and really had nothing to do with America? *Why* do we hear so much about WWII, while the only thing I know about WWI is the harm it brought American soldiers in the form of devastating emotional and physical wounds? It's almost like WWI was "America's dirty little secret."
I would love to hear from some WWI history buffs out there.
Flanagan