The word "West" used to have a meaning. It described common goals and values, the dignity of democracy and justice over tyranny and despotism. Now it seems to be a thing of the past. There is no longer a West, and those who would like to use the word -- along with Europe and the United States in the same sentence -- should just hold their breath. By any definition, America is no longer a Western nation.The point is that only Europe aka Deutschland is West and America is a mulatto Third World "banana republic", firmly ruled by an "elite". The article resonates something in my memory.
Friday, August 05, 2011
What Germans Think
Der Spiegel publishes, in English, an unusually anti-American article. It says: