As the result of an argument in one of my classes, I'm curious what people here think about cultural imperialism (Western imperialism). I don't just mean McDonald's and Pokemon; I mean the suppression of native cultures in order to supplant them with Western values. We did it to aboriginal peoples basically everywhere. The question is: is this a bad thing?
If Western culture brings with it technology, advancement, better medicine, improved crops, and a higher quality of life, and the cost is the loss of provincial and parochial local customs, what's the issue? I may be an apologist for Western civilization, but the reason we've been running the world for a large part of human history (especially if I get to claim Rome and the Hellenic states as "Western") is our cultural traditions. It feels like Kipling was right: we should expect the blame of those we better, and the hate of those we guard.
So, what's your thinking?