Naked Capitalism has the article "Mussolini-Style Corporatism, aka Fascism, on the Rise in the US." This article introduces, and then republishes, the Thom Hartmann article on Alternet, "Tea Party and the Right: The Sad Truth of Our Politics: It's Basically Turned into a Competition Among Oligarchs to Own Everything: It could still happen here."
In all the years I have known about fascism, I had never seen a definition that clarifies what the word really means until I read this article.
As the 1983 American Heritage Dictionary noted, fascism is "A system of government that exercises a dictatorship of the extreme right, typically through the merging of state and business leadership, together with belligerent nationalism."
All I ever really knew was that fascism is what Mussolini did, but I never knew what exactly he did that was called fascism. Are there other people whose education in history was as faulty as mine? Do you suppose there is a reason why people in the 1950s, the 1960s, and beyond were never taught the true meaning?