Before I was able to purchase any of the new (HD) consoles, I found myself fantasizing about how graphically impressive games on the 360 and PS3 would look.
When I finally got to try out both systems, I was impressed.
HOWEVER!
Call me old-fashioned, but I was under the impression that, graphically, the systems would depend on Component AV and HDMI to display on 1080i/p TVs and such BECAUSE the graphical power of the consoles made such a connection a necessity.
However, after finally purchasing one of the consoles (a 360), buying a TV with 1080i and an HDMI cable, playing the games, and then switching to a Component AV cable...
Stop right there.
I noticed that the GRAPHICS on screen didn't change. No. The brightness and color depth of the screen changed. Pixelation was still there (in some spots), and there was no change in the game engine's display (that I could see).
That made me wonder: was it really necessary to play the game in 1080i HD? It LOOKED THE SAME IN 480i COMPONENT AV! Just...less bright?
Confused, I tested this on my PS2.
For those who don't know, Gran Turismo 4 (NTSC) has the option to output in 1080i using COMPONENT AV. I thought, "OK, let's try this."
I switch it and start to play. What's the first thing I notice?
The game looks EXACTLY the same, EXCEPT for a change in BRIGHTNESS and color depth. Everything looks more VIBRANT. Nothing changed graphically!
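For anyone curious, here's a rough back-of-the-envelope comparison of the raw pixel counts (my own numbers, assuming 480i is roughly 640x480 and 1080i is 1920x1080), sketched as a tiny Python snippet:

    # Rough pixel-count comparison (assumed resolutions, not official specs)
    sd = 640 * 480       # 480i: 307,200 pixels
    hd = 1920 * 1080     # 1080i: 2,073,600 pixels
    print(sd, hd, round(hd / sd, 2))  # about 6.75x more pixels at 1080i

My guess is that if the game is still rendering internally at a lower resolution and the console just scales the output up to 1080i, most of that extra screen resolution isn't carrying any new detail, which would line up with only the brightness and color depth looking different.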
-----------------------------------------------------------------------------------------------
That said, why don't video game companies just focus on graphical improvement and nothing else? I mean, theoretically, if we could go from 8-bit to 16-bit to 32-bit to 64-bit to 128-bit to...where are we at now, 256-bit?...then why not just keep doing that?
It looks the same. Honestly. I can see how HD works for TV and movies, but video gaming? Really?