I live in Australia and have never been to America. I don't really like American shows, but I've always wanted to travel there. Everything always seems to happen in America.
I know that whenever we see American people doing something stupid on the news or on the internet, someone will always say "only in America".
I think the people around me tend to think of Americans as stupid or eccentric, but I blame the media for that.