aba1 said:
Rawne1980 said:
"Where is the flame" said my 20 year old step daughter when I turned on the electric oven.
I could see someone being confused if they have never heard of an electric stove before. I mean, I have never used an electric stove yet and I am 23.
Just to clarify the above: in some places, e.g. older houses in the UK, ovens are in the majority of cases fuelled by natural gas, which for all its dangers is a lot more efficient. Generally the grill / broiler as well.
It's still pretty crazy that someone wouldn't be aware of the existence of electric ovens at that age, but if it's all they've known, and you've just introduced her to a new cooker without pointing out that it's a different type, there's plenty of scope for a forgivable brainfart as she first notices the lack of an auto-ignite spark button, and then nowhere to manually light it after opening the door...
ANYWAY
Let's get back to what's becoming the main thread here.
Lots of people arguing over high-refresh LCDs whilst clearly not having the first fucking clue how a typical computer video system works, either in how the signal is generated and transmitted, or in how it's displayed on the screen.
First up, shall we attempt an empirical demonstration? A lot of digital cameras these days offer high-framerate "slow motion" modes, typically at least up to 240fps if not further.
Using your favourite video editing software, make a 60fps video (as low or high rez as you like, it's unimportant), which is a simple alternating-frame of full white and full black (obviously, don't do this if you have photosensitive convulsive epilepsy), or better yet complementary colours like green and magenta, red and cyan, blue and yellow...
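If you'd rather script it than faff about in an editor, here's a minimal sketch of the idea in Python - assuming you have numpy installed and an ffmpeg binary on the PATH; the resolution, duration and output filename are just placeholders:

```python
import subprocess
import numpy as np

# Arbitrary test parameters - adjust to taste.
WIDTH, HEIGHT, FPS, SECONDS = 640, 480, 60, 10

# Two complementary solid-colour frames: green and magenta.
green = np.zeros((HEIGHT, WIDTH, 3), dtype=np.uint8)
green[:, :, 1] = 255
magenta = np.zeros((HEIGHT, WIDTH, 3), dtype=np.uint8)
magenta[:, :, 0] = 255
magenta[:, :, 2] = 255

# Pipe raw RGB frames into ffmpeg, alternating colour on every single frame.
proc = subprocess.Popen(
    ["ffmpeg", "-y",
     "-f", "rawvideo", "-pix_fmt", "rgb24",
     "-s", f"{WIDTH}x{HEIGHT}", "-r", str(FPS), "-i", "-",
     "-c:v", "libx264", "-pix_fmt", "yuv420p", "flicker_60fps.mp4"],
    stdin=subprocess.PIPE)

for i in range(FPS * SECONDS):
    proc.stdin.write((green if i % 2 == 0 else magenta).tobytes())

proc.stdin.close()
proc.wait()
```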
Set it to run full screen on yer 60Hz and 120Hz LCD monitors, and also CRTs if you have 'em. Record each display with the highest available framerate on the camera. Put THOSE videos into the editor and play them back frame-by-frame. Can you see a difference between the two? You'll likely see a sort-of tearing effect rolling down the 60Hz ones quite obviously; on the 120Hz ones, depending on the camera's actual recording rate, you'll see it either not at all, or rolling faster and in a more limited fashion.
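And for stepping through the captures without an editor, something like this dumps every recorded frame to a numbered PNG (again assuming ffmpeg is installed; the capture filename is hypothetical - use whatever your camera spat out):

```python
import subprocess
from pathlib import Path

# Hypothetical capture filename from the slow-mo camera.
capture = "monitor_240fps_capture.mp4"
Path("frames").mkdir(exist_ok=True)

# -vsync 0 keeps every decoded frame rather than resampling to a fixed rate.
subprocess.run(
    ["ffmpeg", "-i", capture, "-vsync", "0", "frames/frame_%05d.png"],
    check=True)
```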
Because of the nature of video image transmission - analogue or digital, VGA, DVI, DisplayPort or HDMI - and of how LCD and LED matrices are addressed, an image going between your video card's VRAM and your eyeballs IS STILL "SCANNED" in some way whilst being rendered. Yes, the LCD monitor has a framebuffer, and yes, the thin-film transistors tied to each picture cell are able to hold their state for a few fractions of a second without flickering or needing to be actively updated, but how does the data get from card to buffer, and from buffer to transistors?
A: It's scanned. Rastered. Pumped pixel by pixel, line by line, and eventually frame by frame (after enough lines of pixels have been sent), down a serial interface between card and buffer, and down a serial-to-semiparallel interface from buffer to transistors. The difference is that an LCD will typically render a whole line at a time, because of the actual electronic organisation of the display hardware, rather than pixel-at-a-time as on a CRT, and it's got at least a one-line delay between the video card starting to send data and it appearing on screen (because the buffer has to fill that line before showing it) - which for XGA at 60Hz is all of about 1/50,000th of a second, or 0.02ms - instead of being near-instantaneous and completely pixel-synched...
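Back-of-envelope for that one-line figure, using the standard VESA XGA timing purely for illustration:

```python
# XGA @ 60Hz: 1024x768 visible, but the standard VESA timing is 1344x806 including blanking.
TOTAL_LINES_PER_FRAME = 806
REFRESH_HZ = 60

line_time = 1 / (TOTAL_LINES_PER_FRAME * REFRESH_HZ)   # seconds per scanned line
print(f"one line takes roughly {line_time * 1e6:.1f} microseconds")
# -> about 20.7us, i.e. around 1/48,000th of a second, or ~0.02ms
```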
...but it certainly doesn't render an entire frame all at once, with basically zero latency between frames other than the time taken for the liquid crystals to twist. The framebuffer chip doesn't have infinite bandwidth. It will in fact have just-enough bandwidth. Which, for say a basic 60Hz, XGA-resolution LCD, is about 48MHz as a minimum, or 144MB/sec for 24-bit true colour. (Actually, if the same data line is responsible for each of the red, green and blue instead of them being split to separate chips, 144MHz is your baseline.)
For a really high-end screen, like an Apple Cinema Display juiced up to run at 120Hz, it'd need about a 500MHz (or 1.5GHz, for combined RGB scanning) framebuffer and 1.5GB/sec memory transfer speed, assuming it's not "super high colour". OK, that's maybe no great shakes in the wider computing picture, but in terms of a nowadays relatively cheap peripheral like a monitor, where the price of each component can be crucial, such things class as premium hardware.
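The arithmetic is nothing fancier than pixels x lines x refresh rate; a quick sketch, counting visible pixels only (blanking intervals push the real pixel clocks a fair bit higher, and I'm assuming the 2560x1600 30-inch panel for the Cinema Display):

```python
def min_bandwidth(width, height, refresh_hz, bytes_per_pixel=3):
    # Lower bound counting visible pixels only; real pixel clocks are higher due to blanking.
    pixel_rate = width * height * refresh_hz            # pixels per second
    return pixel_rate, pixel_rate * bytes_per_pixel     # Hz, bytes per second

for name, w, h, hz in [("XGA LCD @ 60Hz", 1024, 768, 60),
                       ("2560x1600 panel @ 120Hz", 2560, 1600, 120)]:
    clock, byte_rate = min_bandwidth(w, h, hz)
    print(f"{name}: ~{clock / 1e6:.0f} MHz pixel rate, "
          f"~{byte_rate / 1e6:.0f} MB/s at 24-bit colour")

# XGA @ 60Hz:         ~47 MHz,  ~142 MB/s  (the '48MHz / 144MB/sec' ballpark above)
# 2560x1600 @ 120Hz:  ~492 MHz, ~1475 MB/s (the '~500MHz / 1.5GB/sec' ballpark above)
```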
Incidentally, you're pushing the limits of HDMI with high-rate rendering as well. It can just about manage 1080p at 120Hz in extended colour depth with the latest versions and specifically designed cables AFAIK, but it was originally only ever specced for up to 1080p at 60Hz... It doesn't transmit the whole image at once, but just about squeezes it into one-sixtieth or one-hundred-and-twentieth of a second... by sending the data pixel by pixel, line by line, at relatively very high bandwidth (for a cheap consumer cable carrying a handful of serial data pairs).
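To put rough numbers on that (caveat: the 340MHz TMDS clock and 8b/10b coding figures are from memory of the HDMI 1.3/1.4 specs, so treat them as approximate):

```python
# Raw pixel payload for 1080p at 120Hz, 24-bit colour, visible pixels only.
width, height, refresh_hz, bits_per_pixel = 1920, 1080, 120, 24
payload_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"1080p120 needs ~{payload_gbps:.1f} Gbit/s of actual pixel data")   # ~6.0 Gbit/s

# HDMI 1.3/1.4 tops out around a 340MHz TMDS clock; with 8b/10b coding on three
# data channels, that's roughly 8.2 Gbit/s of usable video bandwidth - so 1080p120
# only just fits, and only on the later revisions.
usable_gbps = 340e6 * 3 * 8 / 1e9
print(f"HDMI 1.3/1.4 usable video bandwidth: ~{usable_gbps:.1f} Gbit/s")
```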
Apparently the SXGA monitor on my desk can do 75Hz as well as 60Hz. I think I'll have to get hold of a slo-mo camera and see whether it's actually doing it for real - changing the rate at which it updates the panel - or just up- or down-converting somehow. I figure it's probably using the chip from a larger panel, which can handle higher rates, and always outputs at an effective 75Hz speed... but takes a rest for the equivalent of 1/5th of a frame before starting over when receiving data at 60Hz...
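For what it's worth, the arithmetic behind that "1/5th of a frame" guess:

```python
# Guess: the panel always scans out at 75Hz speed, but frames only arrive at 60Hz,
# so after each scan the electronics sit idle until the next frame turns up.
scan_time_at_75 = 1 / 75      # ~13.3ms to paint one frame at 75Hz panel speed
frame_period_at_60 = 1 / 60   # ~16.7ms between incoming 60Hz frames

idle = frame_period_at_60 - scan_time_at_75
print(f"idle time per frame: {idle * 1000:.1f}ms, "
      f"or {idle / frame_period_at_60:.0%} of each 60Hz frame period")
# -> ~3.3ms idle, i.e. about 1/5th of the frame period
```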
TL;DR version: Don't confuse a flicker-free panel with one that updates rapidly. CRT updates are necessarily 1:1, as they are unbuffered and scan rate = update rate (apart from some very exotic things like the Commodore 1024-line monitor for the Amiga, which had a buffer and received actual image data at about 15Hz down a standard-def cable). LCDs have a buffer, and don't rely on a flickery scanning electron beam in order to actually render the image, so they give a much more stable-seeming image than a CRT. This does not, however, mean that they UPDATE the image any faster, that they're able to, or even that they don't update it in a "scanning" fashion...