Last week, my column presented arguments for why the U.S. might not want to enter Uganda amidst the Kony 2012 craze. Since I do not title my own columns, The Knox Student decided to call it “Keeping Western Hands off of Africa.” This led me to wonder: what does “Western Hands” mean? Africa is a continent, I understand that (though I was only talking about Uganda and the Democratic Republic of Congo), but who are “Western Hands”?
I used the “find” function in Microsoft Word and looked for “west,” “western” and “hands” in my original article. No matches. I thought the search must not be working, but when I entered “U.S.,” 17 matches came up. So, when discussing the U.S. intervening in one nation on the continent of Africa, why was the title framed in the context of “West” and “Africa”?
If you talk to me, I’ll probably slip and use the term “western” or “the west.” Luckily, it has only cropped up in one of my columns thus far, when I said “Islam is a convenient political tool for Western Democracies as much as it is to Terrorist Organizations.” But what does “Western” really mean?
Most other labels are easy to identify. The “Islamic World” doesn’t really exist (all the Islamic countries in the world, or even those with sizable Muslim populations, do not get together to hash out all things Muslim), but the label still makes a little sense: we are identifying countries that claim to be Islamic or have Islam as the state religion. “Africa” is a continent. The “Communist Bloc” made sense at the time; we were identifying places with a communist system (or, more likely, their interpretation of a communist system). But how do you define “The West”?
We could use democracy, but that would expand the term to include countries like Iran, Nicaragua, Egypt, Japan and Azerbaijan. So democracy is out. We could use capitalism, but then most of the world would be Western at this point, with the exceptions of China (kind of), Cuba and a few other nations.
In my very in-depth research, consisting of a Google search and a Wikipedia article, I did not find a definition of the “Western World.” Instead, Wikipedia defines “Western Culture” as “a term used very broadly to refer to a heritage of social norms, ethical values, traditional customs, religious beliefs, political systems and specific artifacts and technologies. The term has come to apply to countries whose history is strongly marked by European immigration or settlement, such as the Americas and Australasia, and is not restricted to Western Europe.”
I would argue that most of the world has a history “strongly marked by European immigration or settlement,” especially if we include the strong mark of imperialism and colonization. So if Britain ruled three-quarters of the world, wouldn’t that mean that three-quarters of the world is western? Also, just a side note, at the time Europe saw itself as the center of the world, so shouldn’t the term be “cent-ern” or something? I digress.
From that wiki definition, though, I’d still like to know: why are some other places not Western, just “westernized”?
And then an idea leaped into existence! Since the dawn of man (or at least recorded history), we’ve seen how some groups and nation-states in times of war talk about the beastliness of the enemy: “They are nothing better than uncivilized animals, and we the (insert name here) are so much more cultured and (insert positive attribute here) than the rest.” And in times when we like some other nation or culture (very rare, but it happens), we say, “Oh, look how much we’ve (circle one: inherited from/been inspired by) them; they are (insert positive attribute here) just like us.”
So who makes up the “Western World,” the “Islamic World,” the “Communists” or “Africa”? Here, at least, the list goes: “awesome people,” “evil people,” “stupid people” and “people who will always need charity from the awesome ones.” A very simple, boxy world to fit our simple, boxy labels.