Sunday, 22 February 2009

The End of the West

Back in the early twentieth century, Oswald Spengler published a landmark book called "The Decline of the West" (1918). It was something of a best seller in its era, a shocker and an eye-opener, so to speak.

But the West did not die as he predicted. Spengler, a German, had seen the First World War and the disaster it brought. Yet he did not foresee the rise of the USA as the standard-bearer of Western values. Old Europe was aging and losing its creative drive, but the US was ready to make the West live on. After World War II, it emerged as the quintessential Western power, representing Western values throughout the world.

By 2000, roughly a thousand years had passed since the West began its climb toward global domination. Around 1100, the newly consolidated Western states launched their first expansionary wars in the East: the Crusades. The unsuccessful Crusades were later followed by a more methodical conquest of the Americas, the Indies and Africa by the rising Western nations. First came the Spanish, the Portuguese and the Dutch, then the English and the French. In the process, the Westerners destroyed or transformed the empires of the Aztecs, the Indies and Japan. They irrevocably changed the cultures and lives of billions of people.

Yet something unexpected, and almost unnoticed, happened after the West established its colonies around the world. The colonies began to alter the West as well. Like it or not, we live in the shadow of nineteenth-century colonization. We see it everywhere today: the rise of the East (Japan, China and India), the growing immigration from outside Europe, the shifting demographics of America and Europe, the unrest in the Arab countries, and so on.

This may sound a bit, you know, "geopolitical", but it affects our everyday lives. We now know that economic problems in China can affect the West. We know that communities are becoming increasingly ethnically and racially diverse. We see it in the movies, on TV and in popular culture: being a white European no longer automatically means exclusive access to fame, power or fortune.

Of course, the election of Barack Obama as President of the USA has been one of the starkest images of how the West is receding in influence. Here he is, the son of a black Kenyan man, President of the most powerful nation on Earth, the nation that used to embody the quintessential West. Just seeing this charismatic half-black, half-white man speaking for the entire U.S. is an amazing and telling sight. The West is slowly but surely disappearing. In its stead, there is a growing mixed, "mutt" culture that is moving from the outskirts of the cities toward their once-exclusive centers.

Should we mourn the West's passing? Back in Spengler's day, his book sounded like doomsday was near: the end of the West was the end of everything. But the death of the West is not a bang but a whimper: a slow transformation into something else. The West is disappearing, but perhaps this is not a disaster. Surely the West has brought a lot of good things, but I'm afraid a lot of bad things as well: the polarization of the world, antagonism and war. Perhaps in the future there will be no West or East, but one diverse planet.
