Alright, I know plenty of people may not agree with what I'm about to type here, and if so, this topic isn't for you. What I'm talking about is how a large number of people seem to believe that every person who lives in South Carolina, Texas, Georgia, or any of the other southernmost states (especially on the East Coast) is a complete and total dumb-fuck who likes to have sex with their cousins/sisters/brothers/etc.
Now, to be fair, I'm not without my own prejudices against the Northern states and even some of the Southern ones (Alabama and Mississippi come to mind), but I mean really. How offensive are the Northern stereotypes? That you wear suits all the time? That you make a ball-fuck ton of money and don't know what to do with it? That your women are all ninety-five percent silicone and syphilis? Granted, that last one was a little insulting, but it doesn't really matter, does it? Because people don't take those seriously. The Southern image, however, is so firmly engraved in the minds of most Americans that most of what people "know" is made up or wildly exaggerated.
For a couple of examples: I've actually been asked by someone from Rhode Island if we had roads down here. Seriously? What kind of fucking backward-ass, caveman Neanderthal sons of bitches do you think live down here? We've had the art of road paving for almost as long as the North, so where do you get that information? Another one I hear a lot is that we're filled to the brim with members of the KKK, Neo-Nazis, and the like. I've lived in Georgia for almost all of my life, and I can honestly say I've never met anyone involved in either.
So can we just cut the stupid shit and drop the stereotypes? That would make me feel a lot less like a dick. Why? Because I wouldn't have to correct every single person who thinks I have sex with cows every Saturday before killing them with nothing but a rake, a straw hat, and a screwdriver.