Sleekit said:
Areloch said:
Or do you have some special copy of Star Wars that looks crystal clear on your 4k TV? If you do, you could probably sell it back to Lucasfilm for a pretty penny.
i don't have to have some copy of Star Wars that looks crystal clear on a 4k TV because Lucasfilm/Disney already do...and it'll be stored on FILM awaiting transfer.
but i can see you didn't bother to heed my request...at all.
Well, they actually don't. It's kind of a known deal that they lost/destroyed the original negatives (I think they were turned into the special editions). All we'll ever have are the reproductions. That's one area where digital does far better than film: you can make backups of backups of backups and never lose anything.
That's what I was referring to about selling back to Lucasfilm.
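On the 'backups of backups' point, here's a throwaway sketch (my own illustration, nothing from the thread; the file names and sizes are made up) showing that digital duplication stays bit-exact no matter how many generations deep you go:

```python
# Digital copies of copies are bit-for-bit identical to the original,
# unlike analog duplication where every generation degrades.
# Illustrative only: temp files stand in for actual footage.
import hashlib, os, shutil, tempfile

def sha256(path):
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

tmpdir = tempfile.mkdtemp()
original = os.path.join(tmpdir, "gen0.bin")
with open(original, "wb") as f:
    f.write(os.urandom(1 << 20))          # 1 MiB of stand-in "footage"

current = original
for gen in range(1, 11):                   # ten generations of copies
    nxt = os.path.join(tmpdir, f"gen{gen}.bin")
    shutil.copyfile(current, nxt)
    current = nxt

print(sha256(original) == sha256(current))  # True: no generational loss
```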
Also, what request? To think about what you said? I like how you say 'don't come up with a counterpoint for your side'. That was neat. And I did cover that. Yes, a 480p video will always be 480p, and that does suck; I concur entirely. But as cameras get better, the pixel density is going to reach the point where it doesn't really matter. Unless you get Bigotron-sized displays, your average high-resolution LCD TV will eventually be surpassed by the recorded resolution, and the issue is nullified.
We're not there yet, but that's the direction we're headed. A 4k display is something like 8-9 megapixels. If you can record at 32 MP or higher, you're officially operating well past sub-pixel resolution relative to your consumer display, which means you're achieving the same effect as film. Again, we're definitely not there yet, but we're moving towards it.
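To put rough numbers on that (my own back-of-the-envelope sketch; the 32 MP figure is just the assumption from above, not anything official):

```python
# Rough pixel-count comparison: consumer display vs. a hypothetical
# high-resolution capture. Numbers are illustrative, not authoritative.

def megapixels(width, height):
    """Total pixels, in millions."""
    return width * height / 1_000_000

# UHD "4k" consumer display
display_mp = megapixels(3840, 2160)          # ~8.3 MP

# Hypothetical capture/scan resolution well above the display
capture_mp = 32.0                            # assumed 32 MP recording

print(f"4k display: {display_mp:.1f} MP")
print(f"Capture:    {capture_mp:.1f} MP")
print(f"Capture pixels per display pixel: {capture_mp / display_mp:.1f}")
# With roughly 4 captured pixels behind every displayed pixel, the source
# is effectively sub-pixel relative to the screen -- the same practical
# effect as having film scanned at higher resolution than you can show.
```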
Sleekit said:
as for "why you dont just turn down the res on the software/your OS"...flat panel displays only look properly sharp if they are running at 1 for 1 in their native PHYSICAL resolution (or a in a quarterly subdivision of it where 4 physical pixels are used as 1)...whereas on a CRT you could change the resolution of the image projected by the CRT and if you don't understand that...well there's really not much point discussing it further: you don't understand the tech and you don't understand what i'm talking about.
Oh. You're doing THAT.
"Ooh, I made a technical point that really doesn't pragmatically matter, but you just are too dumb to understand the technology"
Don't do that.
Yes, CRTs got away with resolution changes made in software/the OS, but that's not because they were magic. It's because everything was pre-emptively blurry by virtue of the projection. As you say, there's no additional data being generated; CRTs don't spontaneously create a better image from lower-resolution information fed from your computer. It's just that the image is already blurry, so the drop isn't quite so noticeable. Even CRT displays will show up as pixelated garbage if you crank the resolution down past where the native blur compensates.
The fact is, if you're lowering the resolution because your system can't handle it, it doesn't have anything to do with the display anymore. Yes, a notch or so down in resolution may be less noticeable on a CRT than on an LCD, but only for a few steps. And if some pixelation is that much of a downer, if the slight pixel-density difference between the active resolution and the monitor's native one kills your experience, you'd cough up for a machine that could run at native. While technically a problem, it's only a problem in the most specific of cases.
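To make the 'native or a clean subdivision' point concrete, here's a small sketch (my own illustration, assuming a 1920x1080 panel) of which render resolutions map cleanly onto physical pixels and which force fractional scaling:

```python
# Which render resolutions map cleanly onto an LCD's fixed pixel grid?
# Assumed panel: 1920x1080. Illustrative sketch only.

NATIVE_W, NATIVE_H = 1920, 1080

def scaling(width, height):
    """Return horizontal/vertical scale factors and whether the mapping
    is clean (1:1, or an integer block of physical pixels per pixel)."""
    sx, sy = NATIVE_W / width, NATIVE_H / height
    clean = sx == sy and sx.is_integer()
    return sx, sy, clean

for res in [(1920, 1080), (960, 540), (1280, 720), (1600, 900)]:
    sx, sy, clean = scaling(*res)
    verdict = "clean (integer scale)" if clean else "fractional -- soft/blocky"
    print(f"{res[0]}x{res[1]}: {sx:.2f}x scale, {verdict}")

# 1920x1080 maps 1:1 and 960x540 maps 4 physical pixels per logical pixel,
# so both stay sharp. 1280x720 and 1600x900 need 1.5x / 1.2x scaling, so
# each logical pixel smears across a fraction of a physical pixel.
```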
Most people who have to run below native either don't care that much, or work to fix it. Having a CRT isn't going to automagically fix how the image looks.