Sleekit said:
atm you can take any film (on film) from any decade of the 20th century and convert it to any format you wish, including any future digital media format that features a higher display resolution, because film doesn't have "a resolution".
This isn't true. Film does have a maximum resolution, as defined by the film grain. It varies with the manufacturing process and ISO sensitivity (and frame size, of course), but it most certainly is finite. It's just not a grid like with pixels: blow up a film image too much and it starts to break down into a collage of discrete chunks of color, just like a digital image does. It looks like sand under a magnifying glass, or the pattern on a piece of galvanized metal, rather than the precise grid of a raster-based digital image.
IIRC your average 35mm film frame has a resolution (in terms of grain size relative to frame size) roughly equivalent to something in the mid 20s in megapixel terms. If you're shooting or displaying digital in 8K, you're getting a resolution more or less analogous to 35mm film. Right now that's expensive, and fairly demanding in terms of processing, write speed, and storage when you're shooting uncompressed footage for a movie (though storage is still much cheaper, space-wise, than an equivalent amount of 35mm film), but in another ten years, or probably even just five, that won't be the case. Digital filmmakers are already shooting in 8K whenever/wherever they can afford to, which means that in the near future you'll have to move the goalposts up to IMAX to retain film supremacy on base resolution.
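If you want to sanity-check that equivalence, the arithmetic is simple enough. Quick sketch below; note the mid-20s-megapixel figure for 35mm is just my ballpark estimate from above, not a measured number:

```python
# Back-of-envelope: common raster sizes vs. the ~mid-20s-megapixel
# figure for a 35mm frame (that figure is the assumption here).
resolutions = {
    "1080p":  (1920, 1080),
    "4K UHD": (3840, 2160),
    "8K UHD": (7680, 4320),
    "16K":    (15360, 8640),
}
FILM_35MM_MP = 25  # assumed effective resolution of a 35mm frame, in megapixels

for name, (w, h) in resolutions.items():
    mp = w * h / 1e6
    print(f"{name:6s} {w:>5d}x{h:<4d} = {mp:5.1f} MP ({mp / FILM_35MM_MP:.1f}x 35mm)")
```

8K UHD works out to about 33 MP, a bit above a 25 MP 35mm frame, which is where the "more or less analogous" comes from.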
In projection, the irregularity of film grain creates a more abstract resolution that feels more "natural", and thus more easily accepted by our animal brains, than a precise pixel grid. And with movies, the fact that each frame has its own random grain structure allows sequential frames to dither into each other more fluidly inside our brains, whereas a pixel grid is consistent across frames and therefore doesn't get "lost" to persistence of vision the way grain does.
It's probably possible to replicate these effects with something like an exotic new sensor technology that captures and/or encodes pictures in a non-raster format, but it's more practically likely that ordinary raster resolutions will simply be pushed so high over time that you won't be able to project anything large enough to make a difference (provided you're not just cropping to magnify a small portion of the image) without exceeding the human eye's field of view. Probably just doubling up to 16K would be more than enough to wipe out the difference for 35mm. Dunno what it would take for IMAX.
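For a rough sense of why 16K should be enough: 20/20 vision resolves something like 60 pixels per degree, and even a generous horizontal field of view is around 150 degrees. Both of those are textbook ballparks I'm assuming here, not anything rigorous:

```python
# Rough check: at what point does a raster out-resolve the eye even on
# a screen filling the whole visual field? Both constants below are
# assumed ballpark values, not measurements.
ACUITY_PPD = 60   # ~pixels per degree resolvable with 20/20 vision
FOV_DEG = 150     # generous horizontal field of view, in degrees

for name, width in [("8K", 7680), ("16K", 15360)]:
    ppd = width / FOV_DEG
    verdict = "exceeds" if ppd > ACUITY_PPD else "falls short of"
    print(f"{name}: {ppd:.0f} px/deg over {FOV_DEG} deg, {verdict} ~{ACUITY_PPD} px/deg acuity")
```

By that math, 8K still falls a little short of acuity on a screen filling your entire field of view, while 16K comfortably exceeds it, and in any realistic theater the screen covers far less than your full field of view, so the margin is even bigger.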
...But that's assuming no generational degradation when going from the original negatives to the edit print to the distribution print and so on, which is not the case (extra fun if the film was digitally scanned, then re-printed from processed digital files). The truth is that even at an early festival showing, you're not seeing a film print at its full resolution and clarity, which is why, for most ordinary cinemas, digital projection tends to look better than film regardless.
Honestly, resolution isn't a real problem. Compression is. If you have even a 1080p TV at home, and you're not sitting with your face less than a meter from the screen, any image crappiness you see is probably either bad/flawed compression or bad/flawed upscaling. Shoddy compression is rampant in the home video world, both online and on disc, and it's only gonna get more glaring as resolution goes up. Most stuff online, both video and still images, has been aggressively compressed and/or downsampled to save bandwidth (including stuff like Netflix streaming), so what you see on your computer/TV is 99% of the time not even close to the quality that either your monitor/TV or the underlying image/video formats are actually capable of.
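To put numbers on how aggressive that compression is: raw 1080p at 24fps with ordinary 8-bit 4:2:0 chroma subsampling runs about 600 Mbps. The Blu-ray and streaming bitrates in the sketch below are assumed ballpark figures, not specs I'm quoting from anywhere:

```python
# How aggressive is typical compression? The raw-rate math is standard;
# the Blu-ray and streaming bitrates are assumed ballpark figures.
W, H, FPS = 1920, 1080, 24
BITS_PER_PX = 12  # 8-bit 4:2:0 chroma subsampling

raw_mbps = W * H * BITS_PER_PX * FPS / 1e6
print(f"raw 1080p24: ~{raw_mbps:.0f} Mbps")
for name, mbps in [("Blu-ray", 30), ("typical 1080p stream", 5)]:
    print(f"{name} @ ~{mbps} Mbps: ~{raw_mbps / mbps:.0f}:1 compression")
```

So even a good disc encode throws away something like 95% of the raw signal, and a typical stream is compressing on the order of 100:1. That, far more than pixel count, is what you're actually seeing on screen.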