Okay, so my friend and I had an argument about the relevance of westerns in modern cinema.
His argument is that westerns are irrelevant in cinema and not worth the effort. I'm going to quote him to show you what I mean:
"i hate horses, people get killed in loads of films and in more realistic ways, they do have terrible special effect (people being pulled on a wire through a window from a single gun shot)"
"westerns are boring and one of the most famous cowboy films is brokeback mountain"
"I saw some documentary last week; part of it was about why westerns aren't a large part of the film industry anymore, the conclusion being that not nearly as many people are interested in them these days"
Though the first two come down to him being a homophobic, America-hating "insert bad word here", the last point got me thinking: do you think westerns are no longer relevant?
P.S. This discussion started after he said that it's too bad Cowboys & Aliens will have cowboys in it (just for those who were wondering how such a discussion could arise).