Now this could just be me and a misconception about the entertainment industry in general, but I feel like lately all the good fiction has been dumped in favor of more 'grounded' fiction, and by that I mean more down to Earth... maybe that still doesn't make sense... hmmm...
I guess what I'm really talking about is the degree of 'fiction'.
Take shows like Stargate and Sons of Anarchy. Stargate is HEAVILY reliant on its sci-fi aspect to maintain its fictionality, while Sons of Anarchy relies on over-the-top action and gang drama, which, while still fictional, is more feasible than sci-fi technology taking you to new planets lightyears away from us to battle super-intelligent parasites and gods. Both are 'fictional', but I feel like Stargate is MORE fictional, and I feel like today's trend in the entertainment industry is to move fiction away from its fiction...
I don't know if I can describe it any better than that, but I feel like it's not just in shows, but also in books and movies.
I can count on two hands the worthwhile fictional shows and movies that came out this year that went really out there with the fiction (this may just be a matter of personal opinion though), and novels seem to be favoring angsty teenage romance drama and puberty trials over developing a fictional world, with the plot and characters as a part of all that.
Again though, it may just be me, but I'd at least like to know other people's opinions on this.
Do you think I have any basis for this or am I just spewing shit?