YouTube Bringing 60 FPS Playback Support


Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
Six Ways said:
If it's all down to motion blur, why does a 180 degree shutter (50% motion blur) look more "cinematic" than 360 degree shutter (100% motion blur) which looks more cheap and "real" (i.e. sets look fake)? Why does a shot with a static background not look obviously fake in all cases regardless of the lack of any blur? Again, you're oversimplifying. Suspension of disbelief is a complex thing, and unless you're a neuroscientist I doubt you've got any solid basis on which to disregard this.
It doesn't. Most props built nowadays are made on the cheap, counting on motion blur to hide most of the flaws; good props won't look fake without motion blur, though. "More cinematic" is a catchphrase that means nothing other than "looks as blurry as a movie". As another poster pointed out, motion blur and low framerate are actually detrimental to movie immersion and suspension of disbelief, because you are constantly reminded you're watching a fast slideshow.

Well, if you accuse me of not being a neuroscientist, may I ask whether you are one?

There is an interesting read where several actual scientific theories are discussed:
http://www.tested.com/art/movies/452387-48-fps-and-beyond-how-high-frame-rates-affect-perception/

Rozalia1 said:
That one...you spin faster at 30 frames? What is it supposed to be telling me?
Both the 30 FPS and 60 FPS versions spin at exactly the same speed. At 30 FPS you see jumps because of the missing frames.
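The point about the spinning comparison comes down to the angular step per frame: at the same rotation speed, 30 FPS shows steps twice as large as 60 FPS. A minimal sketch (the rotation speed and duration are arbitrary assumed values, not from the video in question):

```python
def frame_angles(fps, degrees_per_second=360.0, duration_s=0.1):
    """Angles (in degrees) a spinning object shows on successive frames."""
    step = degrees_per_second / fps
    n = int(duration_s * fps)
    return [round(i * step, 1) for i in range(n + 1)]

print(frame_angles(60))  # → [0.0, 6.0, 12.0, 18.0, 24.0, 30.0, 36.0]
print(frame_angles(30))  # → [0.0, 12.0, 24.0, 36.0]  same speed, bigger jumps
```

Both lists end at the same angle after the same elapsed time, so the rotation speed is identical; the 30 FPS version just covers it in half as many, twice-as-large jumps.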

Neronium said:
That's pretty much where I'm at. It already takes forever just to render and upload, and I don't need that time to be increased more. If only my CPU was the amazing one my friend has. For me, rendering a 30 minute episode of BioShock 2 at 720p and around 30 FPS takes about an hour and a half to 2 hours. Don't need that time increasing at all that's for sure. :p
Another argument I keep hearing lately is that SSDs aren't big enough for RAW storage, but I guess that will get mended with time.
You take that long to render, though? I use my 5-year-old laptop as a render station when I need to re-encode videos, and 720p at 30 FPS is faster than real time on a 2×2.4 GHz Core 2 Duo. Sounds like your encoder isn't utilizing your CPU properly.

Jadwick said:
ALSO: Can't we just get a user settable frame-limiter on new games, huh? I think the real problem I have is the stuttering.
We do. Many games allow v-sync, which syncs your framerate to your monitor's refresh rate. If that's not acceptable (some say it introduces input lag), you can always set custom frame limits via your driver settings. For Nvidia users, Nvidia Inspector is a great program to do that easily; I'm not sure about AMD's GPU settings. There are also plenty of programs that do it regardless of GPU, such as MSI Afterburner.
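At its core, a user-settable frame limiter is just a loop paced against a time budget. A minimal sketch in Python (`render_frame` is a hypothetical stand-in for a game's draw call; real limiters live in the driver or in the tools named above):

```python
import time

def run_with_frame_limit(render_frame, target_fps=60, frames=120):
    """Call render_frame repeatedly, sleeping so iterations
    never start faster than target_fps allows."""
    frame_budget = 1.0 / target_fps
    next_deadline = time.perf_counter()
    for _ in range(frames):
        render_frame()
        next_deadline += frame_budget
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)
        else:
            # Missed the deadline (slow frame); resync rather than
            # trying to "catch up" with a burst of fast frames.
            next_deadline = time.perf_counter()
```

Using an absolute deadline instead of sleeping a fixed amount per frame keeps small timing errors from accumulating, which is the steadiness the posts above care about.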
 

Roxas1359

Burn, Burn it All!
Aug 8, 2009
33,758
1
0
Strazdas said:
Another argument I keep hearing lately is that SSDs aren't big enough for RAW storage, but I guess that will get mended with time.
You take that long to render, though? I use my 5-year-old laptop as a render station when I need to re-encode videos, and 720p at 30 FPS is faster than real time on a 2×2.4 GHz Core 2 Duo. Sounds like your encoder isn't utilizing your CPU properly.
No it's not that. It depends on which game is being rendered, and how much footage is there. Not to mention the bit rate of the file causes the render to take longer, and I add a lot of things in rendering, along with upscaling. That's why it takes so long. Hell, I used to render on my laptop before I got my desktop, and it would take between 3 to 5 hours sometimes. Sometimes when I'm also rendering multiple videos at once it'll raise the time it takes.
 

Strazdas

Neronium said:
Strazdas said:
Another argument I keep hearing lately is that SSDs aren't big enough for RAW storage, but I guess that will get mended with time.
You take that long to render, though? I use my 5-year-old laptop as a render station when I need to re-encode videos, and 720p at 30 FPS is faster than real time on a 2×2.4 GHz Core 2 Duo. Sounds like your encoder isn't utilizing your CPU properly.
No it's not that. It depends on which game is being rendered, and how much footage is there. Not to mention the bit rate of the file causes the render to take longer, and I add a lot of things in rendering, along with upscaling. That's why it takes so long. Hell, I used to render on my laptop before I got my desktop, and it would take between 3 to 5 hours sometimes. Sometimes when I'm also rendering multiple videos at once it'll raise the time it takes.
Ah, I see. My rendering changes are pretty simple, mostly jump cuts, no upscaling. I'm not sure what bitrate you use; I admit mine was relatively low, but YouTube compresses it even harder anyway, and the other places where I use the files usually don't need high bitrates to begin with (watching a video on a phone with a 640p screen won't benefit much from a high-bitrate 720p render compared to a low-bitrate one). Multiple videos at once are of course going to share the time between them.
 

Roxas1359

Strazdas said:
Ah, I see. My rendering changes are pretty simple, mostly jump cuts, no upscaling. I'm not sure what bitrate you use; I admit mine was relatively low, but YouTube compresses it even harder anyway, and the other places where I use the files usually don't need high bitrates to begin with (watching a video on a phone with a 640p screen won't benefit much from a high-bitrate 720p render compared to a low-bitrate one). Multiple videos at once are of course going to share the time between them.
Bit rate fluctuates depending on what my capture card is set to. For example, on the PS3 setting the minimum bit rate is 6 Mbps, while the max is 26 Mbps. But for something like, say, the PS2, the maximum drops to about 20 Mbps and the minimum to 4 Mbps.
For reference, I've been doing BioShock 2 on the PS3, and I just finished recording the base game. For that project, the bit rate started at 21 Mbps (by accident) but for a majority of it I've been recording it at 16 Mbps instead.
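For a rough sense of what those bitrates mean on disk, the arithmetic is simple (a sketch; the 30-minute episode length comes from earlier in the thread, and this assumes constant bitrate):

```python
def recording_size_gb(bitrate_mbps, minutes):
    """Approximate size of a constant-bitrate recording.
    Bitrate in megabits per second; result in gigabytes (10^9 bytes)."""
    bits = bitrate_mbps * 1_000_000 * minutes * 60
    return bits / 8 / 1_000_000_000

# A 30-minute episode captured at 16 Mbps:
print(round(recording_size_gb(16, 30), 1))  # → 3.6
```

At the card's 26 Mbps maximum, the same episode would come out closer to 5.9 GB, which is why SSD capacity for raw captures keeps coming up in the thread.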
 

Six Ways

New member
Apr 16, 2013
80
0
0
Strazdas said:
It doesn't. Most props built nowadays are made on the cheap, counting on motion blur to hide most of the flaws; good props won't look fake without motion blur, though.
Film-makers don't make sets already knowing which will be in static shots and which will be in moving shots. To make a set which only looks good in moving shots would be incredibly foolish, since the last thing you ever want to do as a film-maker is limit your options once you actually start shooting. Some props are made lower quality for wide shots (lots of lower quality copies in case they get damaged, thrown around etc) with one highly detailed "hero" version for close-ups, but any time you'd see the detail they do use the hero version. The same can't be said for sets, obviously.

"more cinmatic" is a catchphrase that does not mean anything other than "looks as blurry as a movie".
You seem determined to oversimplify things. What about colour grading? What about lighting? What about aspect ratio? What about film grain? What about highlight rolloff? What about depth of field? What about focus pulls? What about steadicam? What about... you get the point. On a slightly different note - what about movies which shoot with a high shutter speed, like Saving Private Ryan? Looks very cinematic to me, despite the almost total lack of motion blur.

As another poster pointed out, motion blur and low framerate are actually detrimental to movie immersion and suspension of disbelief, because you are constantly reminded you're watching a fast slideshow.
... assuming the uncanny valley effect is incorrect. Which I don't think it is.

Well, if you accuse me of not being a neuroscientist, may I ask whether you are one?
All I'm saying is human perception is complex. You're saying it's not. You don't have to be a neuroscientist to appreciate that there are probably a lot of factors involved.

There is an interesting read where several actual scientific theories are discussed:
http://www.tested.com/art/movies/452387-48-fps-and-beyond-how-high-frame-rates-affect-perception/
The academic they're talking to says some strange things, or perhaps the article has misinterpreted what he means. The article says he refutes the difference in frame rate making it seem more real. He says "the light reflected from a real-visual scene hits our retinas as a continuous stream", suggesting we can't tell the difference. But, obviously, we can, or we wouldn't be having this discussion. So I'm not sure what he's getting at.

One final point. Normally, films are shot at 24fps, with 1/50 exposure time (i.e. every frame is exposed for half the time it appears on screen). The Hobbit was shot at 48fps, but each frame was exposed for 3/4 of the time it's on screen. This means 0.015s rather than 0.02s. So the actual motion blur is only 25% less, not 50% less. Can that 25% really account for all the effects you're talking about?
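The arithmetic behind that comparison is easy to check. Note that with the 1/50 s exposure quoted above the reduction actually comes out nearer 22%; it is exactly 25% if the 24 fps baseline is a true 180° shutter (1/48 s):

```python
def exposure_time(fps, shutter_fraction):
    # shutter_fraction is the fraction of the frame interval the
    # shutter is open: 180 degrees = 0.5, 270 degrees = 0.75.
    return shutter_fraction / fps

film_180 = exposure_time(24, 0.5)   # 1/48 s, the classic 180-degree shutter
hobbit = exposure_time(48, 0.75)    # 270-degree shutter at 48 fps, ~0.0156 s

print(f"{1 - hobbit / film_180:.0%}")  # → 25%
print(f"{1 - hobbit / 0.02:.0%}")      # vs the 1/50 s figure quoted above
```

Either way, the per-frame blur shrinks by roughly a quarter, not by half, which is the substance of the point being made.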
 

Strazdas

Six Ways said:
Film-makers don't make sets already knowing which will be in static shots and which will be in moving shots. To make a set which only looks good in moving shots would be incredibly foolish, since the last thing you ever want to do as a film-maker is limit your options once you actually start shooting. Some props are made lower quality for wide shots (lots of lower quality copies in case they get damaged, thrown around etc) with one highly detailed "hero" version for close-ups, but any time you'd see the detail they do use the hero version. The same can't be said for sets, obviously.
No, make sets that will look good in both, instead of blurring the image and lowering quality to the point where your bad sets won't matter. But of course that is too much to ask from those $200 million budgets, right?

You seem determined to oversimplify things. What about colour grading? What about lighting? What about aspect ratio? What about film grain? What about highlight rolloff? What about depth of field? What about focus pulls? What about steadicam? What about... you get the point. On a slightly different note - what about movies which shoot with a high shutter speed, like Saving Private Ryan? Looks very cinematic to me, despite the almost total lack of motion blur.
Well, you can talk about those, but don't just go around proclaiming that there is some "cinematic" way to do things or that it's somehow "better". All of the things you list are very subjective: some people like film grain, others hate it, etc. I know a few people who will refuse to watch movies with shoulder-cam, for example.

High shutter speed is an improvement, but you would also need a high frame rate to complement it. Like I said, it's impossible to tell what you mean by cinematic, since "cinematic" is just a buzzword.

... assuming the uncanny valley effect is incorrect. Which I don't think it is.
Wait, so you don't think it is correct? Or you think it is correct? Because it isn't correct; Oculus's research into VR has proven that already. You need a high and, more importantly, steady framerate for immersion.

All I'm saying is human perception is complex. You're saying it's not. You don't have to be a neuroscientist to appreciate that there are probably a lot of factors involved.
No, you're saying it's simple and we can trick it; I'm saying that you need a far more realistic environment to trick it (higher framerate, since, you know, real life runs at billions of frames per second, one frame per Planck time [http://en.wikipedia.org/wiki/Planck_time]).

The academic they're talking to says some strange things, or perhaps the article has misinterpreted what he means. The article says he refutes the difference in frame-rate making is seem more real. He says "the light reflected from a real-visual scene hits our retinas as a continuous stream", suggesting we can't tell the difference. But, obviously, we can or we wouldn't be having this discussion. So I'm not sure what he's getting at.
I think it's just that you don't understand what he's saying. The quote you cite states quite the opposite of your interpretation: since real light is a continuous stream and not stuttering at a low framerate, we can see any disruption from that continuum, i.e. we can differentiate frame rates.

Can that 25% really account for all the effects you're talking about?
Why not? The film industry has been perfecting ways to make props "just barely good enough" cheaply, and once you remove a quarter of the blur hiding poor props, people start noticing them, especially in a movie as heavy on props as The Hobbit.
 

Six Ways

Strazdas said:
No, make sets that will look good in both, instead of blurring the image and lowering quality to the point where your bad sets won't matter. But of course that is too much to ask from those $200 million budgets, right?
I don't follow your point. I'm saying set design doesn't change depending on whether shots will be static or moving, yet static shots don't look terrible (so therefore set detail is not necessarily the major part of the problem).

Well, you can talk about those, but don't just go around proclaiming that there is some "cinematic" way to do things or that it's somehow "better". All of the things you list are very subjective: some people like film grain, others hate it, etc. I know a few people who will refuse to watch movies with shoulder-cam, for example.
I never said anything was 'better'; all I've been saying is these things are subjective and higher frame rate is not objectively better. Some prefer it, some don't.

High shutter speed is an improvement, but you would also need a high frame rate to complement it.
But high shutter speed has no motion blur. Less, in fact, than the Hobbit. So if you claim motion blur is everything, why do you still need higher frame rate?

it isn't correct; Oculus's research into VR has proven that already. You need a high and, more importantly, steady framerate for immersion.
Immersion in an interactive experience; an experience that we're actively trying to convince the brain is real. Movies aren't doing that - they're maintaining a 'veil' between the audience and the movie. The psychology of VR and movies cannot be compared - the goals are different, and movies have none of the tricks that VR does. True stereoscopy, wide FOV and head tracking. Without any one of those, you don't have "immersion" in the VR sense. Your brain always knows it's watching a screen. In VR it doesn't.

No, you're saying it's simple and we can trick it; I'm saying that you need a far more realistic environment to trick it
How is that what I'm saying? I'm saying 24fps has one effect, 48 has another. And the reasons are complex. You're saying the single, simple reason is bad sets.

since real light is a continuous stream and not stuttering at a low framerate, we can see any disruption from that continuum, i.e. we can differentiate frame rates.
Re-reading it, I think you're right. However, in that case it doesn't have much bearing one way or the other, surely? I mean, clearly there is still a practical 'maximum' frame rate the eye can see (although it varies), since some people can see fluorescent lights flicker and some can't.

Why not? The film industry has been perfecting ways to make props "just barely good enough" cheaply, and once you remove a quarter of the blur hiding poor props, people start noticing them, especially in a movie as heavy on props as The Hobbit.
I don't really know where you're getting this from. On what basis are you claiming that the film industry makes sets 'just barely good enough'? Besides which, I simply don't believe that effectively increasing resolution by 25% (in moving shots only) is enough to shatter suspension of disbelief for so many people. 50%, maybe. SD to HD (an increase of about 400%!), probably. But not 25%.
 

Strazdas

Six Ways said:
I don't follow your point. I'm saying set design doesn't change depending on whether shots will be static or moving, yet static shots don't look terrible (so therefore set detail is not necessarily the major part of the problem).
Some static sets look terrible. For example, in the latest Game of Thrones the overhead shot of King's Landing (I believe episode 4x02) looked like a cheap play model. Also, movement lets us see props from multiple angles, and we pay more attention to movement. It's not that they look terrible; it's that we notice more.
I never said anything was 'better'; all I've been saying is these things are subjective and higher frame rate is not objectively better. Some prefer it, some don't.
What you listed is subjective. Higher framerate is objective, because it is not a filming technique; it is simply more picture, more fluid motion. Seriously, are you mixing up things like steadicam and framerate as the same thing?

But high shutter speed has no motion blur. Less, in fact, than the Hobbit. So if you claim motion blur is everything, why do you still need higher frame rate?
No, motion blur is not everything. Motion blur helps to hide the low-framerate problem. The low framerate itself is the problem; motion blur is a symptom.

Immersion in an interactive experience; an experience that we're actively trying to convince the brain is real. Movies aren't doing that - they're maintaining a 'veil' between the audience and the movie. The psychology of VR and movies cannot be compared - the goals are different, and movies have none of the tricks that VR does. True stereoscopy, wide FOV and head tracking. Without any one of those, you don't have "immersion" in the VR sense. Your brain always knows it's watching a screen. In VR it doesn't.
Movies don't try to be immersive? Maybe a few select cases and fourth-wall-breaking ones. What's next, you're going to tell me movies aren't made to make money?

How is that what I'm saying? I'm saying 24fps has one effect, 48 has another. And the reasons are complex. You're saying the single, simple reason is bad sets.
Wow, what a way to misinterpret my post. I listed bad sets being more visible as one of the downsides of 48 fps, not the singular reason it looks different.

Re-reading it, I think you're right. However, in that case it doesn't have much bearing one way or the other, surely? I mean, clearly there is still a practical 'maximum' frame rate the eye can see (although it varies), since some people can see fluorescent lights flicker and some can't.
A practical maximum, yes. What that maximum is has yet to be determined, because every test we've done has ended with us not having a screen with a refresh rate high enough that humans stop noticing. I believe the record was 210 frames per second still being noticeable.

I don't really know where you're getting this from. On what basis are you claiming that the film industry makes sets 'just barely good enough'? Besides which, I simply don't believe that effectively increasing resolution by 25% (in moving shots only) is enough to shatter suspension of disbelief for so many people. 50%, maybe. SD to HD (an increase of about 400%!), probably. But not 25%.
On the basis of economic logic: why spend more if you can spend less for the same results? People whose main purpose is to make a profit know this principle very well, and the industry has had well over 100 years to perfect it.

25% is quite a lot if we're looking at large raw numbers, and people do notice resolution changes; they noticed 792p vs 720p for Ryse.
 

Six Ways

Strazdas said:
Some static sets look terrible. For example, in the latest Game of Thrones the overhead shot of King's Landing (I believe episode 4x02) looked like a cheap play model. Also, movement lets us see props from multiple angles, and we pay more attention to movement. It's not that they look terrible; it's that we notice more.
Certainly some sets look bad. Most don't, at least on high end productions like GoT or summer blockbuster movies. And we don't pay more attention to movement of background; we pay attention to movement of the subject. If you're claiming that people notice defects more in a shot where the background is moving, compared to one where it's static and they can see it for a while in perfect clarity, I can't agree with you.

What you listed is subjective. Higher framerate is objective, because it is not a filming technique; it is simply more picture, more fluid motion. Seriously, are you mixing up things like steadicam and framerate as the same thing?
Equally, I could claim steadycam is 'objectively' better because it's smoother than handheld. I could claim that brightly lit scenes are 'objectively' better than dark, noir-ish scenes because there's 'more picture'. I could claim modern AAA games 'objectively' look better than pixel art because the resolution is higher and it's more realistic. There is no 'objective' when it comes to the appraisal of art. Not for frame rate or anything else.

No, motion blur is not everything. Motion blur helps to hide the low-framerate problem. The low framerate itself is the problem; motion blur is a symptom.
Well, that exactly contradicts everything you've said. Your claim is that things look bad because you can see they're fake, and the reason is lack of motion blur. You said:
we see things as obviously fake when they are obviously fake, regardless of framerate. It's just that motion blur sometimes allows things to be blurred out enough for us not to be able to see
And if framerate is the 'cause', then just moving to a higher framerate should fix it. Which clearly it doesn't, because as we've established the Hobbit looks fake to some people.

Movies don't try to be immersive? Maybe a few select cases and fourth-wall-breaking ones. What's next, you're going to tell me movies aren't made to make money?
They don't try to be "VR immersive". They can't be. Like I say, you can't take results from VR and apply them verbatim to other media, it's completely different.

Wow, what a way to misinterpret my post. I listed bad sets being more visible as one of the downsides of 48 fps, not the singular reason it looks different.
Again, the quote I've taken above makes it pretty clear that you think motion blur is the single major factor.

On the basis of economic logic: why spend more if you can spend less for the same results? People whose main purpose is to make a profit know this principle very well, and the industry has had well over 100 years to perfect it.
So, a priori. Have you taken into account that the people making sets are craftsmen who will often be invested in their work? Budget does not map 1:1 with quality of set. You don't get 10% less quality for 10% less budget. I think you're making a very blanket statement about something which in reality is very nuanced and will change hugely from production to production. Furthermore, if there's one film where I would expect absolutely every single person involved to be hugely invested, it's the Hobbit. So I would say that if they can't make a set look good under 48p, no-one can.

25% is quite a lot if we're looking at large raw numbers, and people do notice resolution changes; they noticed 792p vs 720p for Ryse.
I didn't say they won't notice. I said it's not enough to have such a binary effect (real vs fake).
 

Strazdas

Six Ways said:
Certainly some sets look bad. Most don't, at least on high end productions like GoT or summer blockbuster movies. And we don't pay more attention to movement of background; we pay attention to movement of the subject. If you're claiming that people notice defects more in a shot where the background is moving, compared to one where it's static and they can see it for a while in perfect clarity, I can't agree with you.
I claimed that we pay more attention to props that move, and by paying more attention we spot flaws more easily.

Equally, I could claim steadycam is 'objectively' better because it's smoother than handheld. I could claim that brightly lit scenes are 'objectively' better than dark, noir-ish scenes because there's 'more picture'. I could claim modern AAA games 'objectively' look better than pixel art because the resolution is higher and it's more realistic. There is no 'objective' when it comes to the appraisal of art. Not for frame rate or anything else.
No, you could not. Because what you're talking about are filming techniques and styles, not the quality of the image. Although most would agree that well-lit scenes are objectively better than a black screen with only sound.

Art is subjective; however, resolution and framerate are not part of that. They are a different thing.

Well, that exactly contradicts everything you've said. Your claim is that things look bad because you can see they're fake, and the reason is lack of motion blur. You said:
All it contradicts is your misinterpretation. Higher framerate allows us to see more clearly; by seeing more clearly we can spot flaws more easily. Motion blur hides those flaws, thus less motion blur will hide them less and we will see them better. The fault is with bad props here. The solution is to make better props.

So, a priori. Have you taken into account that the people making sets are craftsmen who will often be invested in their work? Budget does not map 1:1 with quality of set. You don't get 10% less quality for 10% less budget. I think you're making a very blanket statement about something which in reality is very nuanced and will change hugely from production to production. Furthermore, if there's one film where I would expect absolutely every single person involved to be hugely invested, it's the Hobbit. So I would say that if they can't make a set look good under 48p, no-one can.
Yes, differences in craftsmen's ability will create variation, but on the scale of all movies put together the difference is insignificant.

Also, you're assuming The Hobbit isn't just a cash grab?

I didn't say they won't notice. I said it's not enough to have such a binary effect (real vs fake).
And I never claimed it was.
 

Six Ways

Strazdas said:
I claimed that we pay more attention to props that move, and by paying more attention we spot flaws more easily.
And that people don't notice problems in static backgrounds that stay there for a while in total clarity. Which I can't possibly agree with. People do look at wide shots, you know.

Because what you're talking about are filming techniques and styles, not the quality of the image.
What about film grain? An 'image quality' factor used artistically. And in what sense does a shaky camera not impact image quality?

Art is subjective; however, resolution and framerate are not part of that. They are a different thing.
Of course they're not. Film-makers often use half-framerate slow motion for a particular effect, as an artistic choice. Retro games use lower resolution artistically. Musicians use bitcrushers and sample-rate reducers as effects.

Look, if you're going to keep claiming that there is no single possible situation in which a person could subjectively prefer the aesthetic of lower fidelity, there's no point arguing with you on that point. You're arguing for the sake of it.

Motion blur hides those flaws, thus less motion blur will hide them less and we will see them better. The fault is with bad props here. The solution is to make better props.
Then you'll have to answer, again, why Saving Private Ryan doesn't look fake.

Yes, differences in craftsmen's ability will create variation, but on the scale of all movies put together the difference is insignificant.
We're not talking about "the scale of all movies put together". We're talking about the fact that the Hobbit looks fake, despite having the best set-designers on the planet.

I didn't say they won't notice. I said it's not enough to have such a binary effect (real vs fake).
And I never claimed it was.
Uh, yes, you did. You claimed that lack of motion blur is why people think the Hobbit looks fake. The difference in motion blur is 25%. Therefore you claimed that 25% less motion blur is enough to go from "real" to "fake".
 

Strazdas

Six Ways said:
And that people don't notice problems in static backgrounds that stay there for a while in total clarity. Which I can't possibly agree with. People do look at wide shots, you know.
So you claim that people are as likely to notice flaws in props in the background as in moving objects in the foreground?

What about film grain? An 'image quality' factor used artistically. And in what sense does a shaky camera not impact image quality?
You mean artificial film grain created in post-processing? That's just lowering your quality for no reason. It's not used "artistically"; it's used because of some viewers' nostalgia, or to fake the effect of "seeing a video within a movie".
A shoulder camera does not impact image quality. By shaky camera I guess you mean those techniques used in found-footage films? They do not get lower quality because of it; it does look worse because it's easier to notice problems when things are in motion, true. They do that to attempt to provide immersion, trying to trick you into thinking it was actually filmed on a bad handheld camera. It even worked for a few movies (such as The Blair Witch Project).

Look, if you're going to keep claiming that there is no single possible situation in which a person could subjectively prefer the aesthetic of lower fidelity, there's no point arguing with you on that point.
Subjectively prefer bad quality? Yes, such people may exist. Objectively, though, no.

Then you'll have to answer, again, why Saving Private Ryan doesn't look fake.
What makes you think it doesn't?

We're not talking about "the scale of all movies put together". We're talking about the fact that the Hobbit looks fake, despite having the best set-designers on the planet.
The best set designers on the planet? Yeah, I'm going to need you to prove that. And even if it were true, that changes nothing: you can take the best metalworker on the planet, but if all he has to work with is plastic, he won't make you a good item.

Uh, yes, you did. You claimed that lack of motion blur is why people think the Hobbit looks fake. The difference in motion blur is 25%. Therefore you claimed that 25% less motion blur is enough to go from "real" to "fake".
No. Cite me.
 

Six Ways

Strazdas said:
So you claim that people are as likely to notice flaws in props in the background as in moving objects in the foreground?
I claim that in static wide shots, if the set looks fake, people notice.

You mean artificial film grain created in post-processing?
Or real film grain from real film stocks, chosen for their specific grain structure...
That's just lowering your quality for no reason. It's not used "artistically"; it's used because of some viewers' nostalgia.
Or because, you know, some people like the look of it.
They do that to attempt to provide immersion, trying to trick you into thinking it was actually filmed on a bad handheld camera. It even worked for a few movies (such as The Blair Witch Project).
So you're saying that (the low quality used in the Blair Witch Project) is not an 'artistic choice'? If so, then what on earth qualifies as an 'artistic choice' to you? You've ruled out craftsmanship (aiding immersion, implying a bad handheld camera), evoking an atmosphere or emotion ('nostalgia'), and no functional purpose ('no reason'). So what's left that would be 'artistic' exactly? You're just changing your definitions to support your argument that it can't possibly be an 'artistic choice'.

Subjectively prefer bad quality? Yes, such people may exist. Objectively, though, no.
My god. I just gave you loads of examples of 'objectively worse quality' things that people like. Retro games. Lo-fi music. 'Such people may exist'? I can't believe you're still defending this point.

Then you'll have to answer, again, why Saving Private Ryan doesn't look fake.
What makes you think it doesn't?
My personal experience, and the widespread opinion that it's an incredibly good movie.

Best set designers on the planet? Yeah, I'm going to need you to prove that. And even if that were true, it changes nothing. You can take the best metalworker on the planet, but if all he has to work with is plastic, he won't make you a good item.
Ok, fair enough, to a point. But your claim has exactly the same footing as mine - you think set design is bad (because you've decided that's why 48fps look bad) and I don't. I don't see how either of us can 'prove' that.

Uh, yes, you did. You claimed that lack of motion blur is why people think the Hobbit looks fake. The difference in motion blur is 25%. Therefore you claimed that 25% less motion blur is enough to go from "real" to "fake".
No. Cite me.
Dude. That's your whole argument.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
Six Ways said:
Strazdas said:
So you claim that people are as likely to notice flaws with props in the background as with moving objects in the foreground?
I claim that in static wide shots, if the set looks fake, people notice.
And I don't disagree; I merely claim that they notice more often if the object is moving and in the foreground.

Or real film grain from real film stocks, chosen for their specific grain structure...
So people chose worse-quality film for no reason (such as it being cheaper, for example) other than film grain? Why intentionally lower the quality of your image? Can you name the people who do this, so I can avoid ever giving them any of my money?

So you're saying that (the low quality used in the Blair Witch Project) is not an 'artistic choice'? If so, then what on earth qualifies as an 'artistic choice' to you? You've ruled out craftsmanship (aiding immersion, implying a bad handheld camera), evoking an atmosphere or emotion ('nostalgia'), and no functional purpose ('no reason'). So what's left that would be 'artistic' exactly? You're just changing your definitions to support your argument that it can't possibly be an 'artistic choice'.
Low quality is not an artistic choice, with Blair Witch being the exception, as the low quality there was meant for an immersion effect. You have already named many artistic choices, such as lighting, shot angles, etc.

My god. I just gave you loads of examples of 'objectively worse quality' things that people like. Retro games. Lo-fi music. 'Such people may exist'? I cant believe you're still defending this point.
And these people like it for subjective reasons.

My personal experience, and the widespread opinion that it's an incredibly good movie.
Which is irrelevant to what you were asking. A good movie =/= a movie that looks non-fake.

Ok, fair enough, to a point. But your claim has exactly the same footing as mine - you think set design is bad (because you've decided that's why 48fps look bad) and I don't. I don't see how either of us can 'prove' that.
It wasn't me who decided that. It was Jackson who said their sets weren't made to be used at 48 fps, which they fixed in the second movie. Obviously, I can't bring you a set from The Hobbit to prove it; you'd have to ask Jackson for that, though I doubt he would bother.

Dude. That's your whole argument.
No. My argument was that a higher framerate allows us to see more and thus results in better quality/experience. Motion blur is an effect we use to mask the bad quality of 24 frames per second. You were the one who claimed the 25% motion blur change; in fact, I didn't even know the shutter time before you mentioned it here.
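For what it's worth, the 25% figure in this exchange falls out of simple shutter-angle arithmetic. A minimal sketch, assuming the commonly reported setups of 24 fps with a 180° shutter for standard cinema and 48 fps with a 270° shutter for The Hobbit:

```python
# Per-frame motion blur is set by the exposure time:
#   exposure = (shutter_angle / 360) / fps
def exposure_seconds(fps, shutter_angle_deg):
    return (shutter_angle_deg / 360.0) / fps

standard = exposure_seconds(24, 180)  # standard cinema: 1/48 s per frame
hobbit = exposure_seconds(48, 270)    # reported Hobbit setup: 1/64 s per frame

# 1/64 s is 75% of 1/48 s, i.e. 25% less blur captured in each frame
reduction = 1 - hobbit / standard

print(f"24fps/180deg: {standard * 1000:.2f} ms per frame")
print(f"48fps/270deg: {hobbit * 1000:.2f} ms per frame")
print(f"reduction: {reduction:.0%}")
```

So each 48 fps frame does carry about 25% less blur than a standard 24 fps frame; whether that difference, or the doubled frame rate itself, drives the "fake" impression is exactly what the two posters are disputing.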
 

Six Ways

New member
Apr 16, 2013
80
0
0
Right, look. All the goalposts are getting moved here. Here are my points.

People can notice problems with sets in static shots. But, unlike the Hobbit, 24fps films don't cause widespread reports of being yanked out of the experience and it feeling (not looking) fake. That's true of awful films and great films alike. I can watch a film with awful sets and yeah, it looks awful - but I don't get the same jarring feeling the Hobbit gives.

You claim Saving Private Ryan looks (or may look) fake. But show me the eruption of reviews where reviewers and audiences were yanked out of the experience by a psychologically jarring effect. I can show you them for the Hobbit, and they did not happen for Saving Private Ryan, despite SPR having less motion blur than the Hobbit. Ergo, motion blur is not the cause, and higher frame-rate likely is.

Finally: you agree that people can like lower-quality things subjectively, then deny that lower quality can be used artistically. That is a contradiction.

If people like lower quality, you can tell them they should like 'objectively' higher quality til you're blue in the face, but they still won't. Go and tell all the retro game enthusiasts that they're idiots for liking Game Boys when they could have CoD, and see how that goes.