So I understand that, for the US, WW2 was considered a liberal war with liberal goals, such as the Four Freedoms and internationalism. And it was a victory for the US, followed by years of economic prosperity. But if that's the case, then why did American politics soon swing so sharply to the right? The conservatives took over in the 1952 elections. What happened? It seems like such a contradiction.