So I’m watching Captain America: The First Avenger for the first time in a while, and I realized: WWII is basically part of our mythology. It’s not that it isn’t real; it’s that it’s historical and also more than historical. We feel like our role in the war says something deeper and truer about America than the bare historical facts can convey.
The thing is… does it?