So while bored at work today I came across a site that collected travel guides from a bunch of different countries about what to expect when traveling in the United States. Something odd I noticed in a couple of them (the Japanese one particularly) was that they didn't seem to be too - fond, I want to say? - of women in/from the United States.
This made me think back a little and realize... I don't think I've ever seen or really heard of American women being popular in other countries, or in their media. American men, yes - they're depicted and liked around the world to varying degrees.
But women?
Rarely if ever do they seem to be depicted, and when they are, it's not flattering - in fact, it seems to be a bit of a running joke on the Internet to compare American women unfavorably to women from other countries. As far as media is concerned, women who are French, German, Russian, Swedish, Japanese, Canadian, whatever - they're all depicted and seemingly liked just fine. But American women rarely ever appear, and they aren't terribly popular when they do.
And it brought to mind a video I saw in my sociology class when I was a junior in college; it was about feminism/women's rights around the world, and it had a bunch of interviews with women in France, Japan, Britain, Russia, etc. We didn't see the whole video in class, but the part I remember best was toward the end, when they went back to interview a French businesswoman (I think she was a CFO or something) and she was very cutting. I don't remember her exact words, but it was along the lines of "People don't like American women/dealing with American women because they've forgotten how to be women/don't teach their daughters how to behave." (Hey, it was almost 5 years ago now. I don't remember the wording exactly, only the impression.)
I remember thinking it a bit odd at the time... but does the rest of the world really find women from the US so distasteful?