This is something that's been bothering me for a pretty long time, about the time that I became aware that there's much more to the world than the little northeastern corner of the US that I live in. There's a TL;DR version at the bottom of this post.
Why does America continue to disappoint me? I grew up with my American history classes, the ones on the things we've invented, the global issues we've solved, the wars we've won, all that stuff. What happened to all of that? This really isn't an "AMERICA F*** YEAH!!" rant; it's much the opposite, actually. I'm quite disappointed in what I see in society today. We brought the world the telegraph, the wrench, the revolver, the first assembly line, and countless other inventions. We played a big part in winning WWII (I'm saying that as delicately as possible to avoid all of my fellow European, Canadian, and beyond Escapees flaming me to pieces) and all that great stuff.
What happened to all of this, and what do we have to show as the pride of our society today? We don't have any major exports aside from tourists who are apparently really easy to spot, and we're seen as fat, whining, pushy loudmouths so used to instant gratification that we don't know the value of hard work (I was called just this by my grandfather years ago over something I don't remember, minus the fat part; he was right, though). All in all, it's just embarrassing to be American sometimes. From what I've gathered, we have a horrible image in the world today, partly of our own doing, and partly because we're currently partaking in a war without any clear justification for why we're there.
Between the whole anti-Obama faction decrying everything the man does (Fox News and the shouting town hall meetings and such) and a very publicized broken governmental system, it's not easy to find things to be happy about. We constantly have Senators and Representatives becoming involved in scandals. These are our highest elected officials, and to say that they're human and make mistakes is one thing; to say that flying to Argentina to see a mistress on the state taxpayer's dime falls under the purview of being human(?) is another entirely. Maybe this whole issue I'm having isn't really with the American people; maybe it's with the government that has enabled the people to become this way. Or at least to let me perceive them this way.
I'm not embarrassed to be American. I love my country, a bit short of jingoism, but I still love it. I'm just embarrassed by what we've become today. I'm incredibly proud of what we've managed to achieve in less than 200 years, but it seems like we're now chewing the fat instead of innovating for the world. Sure, we've got the absurdly fast-moving Silicon Valley projects making computers faster and smarter, but I can't find much else to be happy about with us right now. What do you guys think? Am I right at all? Being melodramatic? Or totally wrong?
I'm having trouble figuring out what I should be proud of as an American anymore. It's easy to fall back on the history books to look for justification, but I want to find something recent. Maybe it's just
I feel that America really has no image in the world today, and that it doesn't do much to deserve a good image anyways. What do you guys think?