Next gen game graphics are hugely unimpressive


Dirty Hipsters

This is how we praise the sun!
Legacy
Feb 7, 2011
8,802
3,383
118
Country
'Merica
Gender
3 children in a trench coat
So there has been a trend on The Escapist, and some other websites, where many detractors of the next-gen consoles (the Xbox One and PS4) keep pointing out how similar the next-generation games look to current-gen games.

To those people, the point of this argument is to say something along the lines of "it doesn't matter how powerful the Xbox One or the PS4 are; just look at the games, the graphics don't look any different from current-gen stuff, so there's no point in spending money on a new console when we're not even seeing any significant upgrades."

This thread is me telling those people that they're right, to an extent, but the fact that they're right doesn't matter.

See, I agree that many of the next-gen console games don't look hugely different or better graphically than the current generation of console games. Titanfall's graphics don't look hugely superior to Killzone 3's, inFamous Second Son's graphics don't look hugely superior to inFamous 2's. Thing is, there's a reason for that: these are LAUNCH TITLES.

Launch titles for new consoles don't tend to show hugely visible differences from the previous console generation, because the developers of those launch titles haven't had very long to build their games and learn to program for the new architecture of the next-gen consoles. They don't know how to squeeze every drop of power out of the RAM yet, they don't know the limitations of the graphics hardware, and they don't know how to optimize their games.

The launch titles for the previous generation of consoles, the Xbox 360 and the PS3, didn't look vastly different from Xbox and PS2 titles either. Take a look at something like Perfect Dark Zero, for example. While Perfect Dark Zero has new lighting, shadows, and particle effects that weren't possible on the original Xbox, these little improvements are barely noticeable. All in all, Perfect Dark Zero doesn't look much better than something like Halo 2. Now compare Perfect Dark Zero to something that's come out in the last year or two: the difference between that Xbox 360 launch title and a recent Xbox 360 title is greater than the difference between an original Xbox title and the Xbox 360 launch title.

That's how it goes: launch titles don't show off the full capabilities of the hardware, they aren't indicative of the games we'll be getting even a year into the hardware cycle, and they definitely aren't showing the limitations, or even perceived limitations, of the hardware. So if you aren't impressed by the games launching on the Xbox One and PS4, that doesn't mean there's no point in buying either console, because there will be huge improvements: better textures, better lighting and shadows, better AI and a greater number of AI characters on screen at once, greater depth of field, more detailed animations, etc. You just have to wait a little for developers to catch up to the potential.
 

scorptatious

The Resident Team ICO Fanboy
May 14, 2009
7,405
0
0
I can agree with this.

Looking back, a lot of the PS2 launch titles didn't look all that different from late PS1 titles.

Same sorta went with games like Oblivion when it first came out. Compared to Skyrim, Oblivion looks pretty dated IMO.

But yeah, I am probably going to wait until more games I actually want come out before I purchase a PS4.
 

FootloosePhoenix

New member
Dec 23, 2010
313
0
0
I wouldn't say HUGELY unimpressive. Is it a massive leap forward? No. But I can certainly tell the difference between current-gen graphics and next-gen. Even games that are pushing current-gen consoles to their absolute limits right now, like BioShock Infinite and The Last of Us, don't look as gorgeous as Second Son or Watch_Dogs, for instance. I didn't think I'd be impressed at all by them at first, but the amount of detail and realism I've been seeing so far is incredible. Of course, I'll always consider art direction to be way more important than raw graphical power; Deep Down doesn't look too nice to me, but it's still obviously doing its best to utilize the new hardware's capabilities at this point.

But overall your point is sound and I agree for the most part. It takes time for developers to uncover what a console can do.
 

Sniper Team 4

New member
Apr 28, 2010
5,433
0
0
Graphics rarely impress me for more than a few minutes. After that, my brain adjusts and I start focusing on the things in the game that are important to me. Mainly the story. I've never cared that I can see a single dribble of sweat running down a man's cheek for two seconds of gameplay, because odds are during those two seconds I'm going to be busy playing the game, not admiring the scenery. Graphics are pretty, but they've never been a selling point on consoles for me. Although I do enjoy going back and playing old games and going, "Wow, to think I used to think this was state-of-the-art." Final Fantasy VIII's intro is still amazing to me, but I swear it looked a lot better when I was a kid.
 

Dragonbums

Indulge in it's whiffy sensation
May 9, 2013
3,307
0
0
I would say you are half right in this.
Yes, graphics will eventually get better later in the generation.
However, the differences will hardly be stark.
I mean, when I saw the first Assassin's Creed game (without knowing what it was at the time), I was under the impression that my friend was watching a movie. That was on the Xbox 360. What more can you possibly get out of graphical fidelity at this point? There will come a point where graphics become moot for people buying a console, and they'll look for other things instead. And honestly, saying "smarter AI" isn't really going to cut it for a lot of people who don't care all that much, and it only applies to games that actually benefit from having intelligent AI in the first place.
 

Dirty Hipsters

This is how we praise the sun!
Legacy
Feb 7, 2011
8,802
3,383
118
Country
'Merica
Gender
3 children in a trench coat
Dragonbums said:
I would say you are half right in this.
Yes, graphics will eventually get better later in the generation.
However, the differences will hardly be stark.
I mean, when I saw the first Assassin's Creed game (without knowing what it was at the time), I was under the impression that my friend was watching a movie. That was on the Xbox 360. What more can you possibly get out of graphical fidelity at this point? There will come a point where graphics become moot for people buying a console, and they'll look for other things instead. And honestly, saying "smarter AI" isn't really going to cut it for a lot of people who don't care all that much, and it only applies to games that actually benefit from having intelligent AI in the first place.
While Assassin's Creed 1 was a great-looking game, there is no way I can believe that you thought it was a movie, not unless you have really poor eyesight. If you get in close to pretty much any texture in Assassin's Creed, it looks pretty bad, just like the textures in any big open-world game.

Also, exactly what kind of game wouldn't be improved by having better AI? Shooters? Improved by better AI. Stealth games? Improved by better AI. Survival horror games? Improved by better AI. Sports games? Improved by better AI.

The only games that wouldn't be improved by having smarter AI are games where there are no enemies.
 

Hawkeye 131

New member
Jun 2, 2012
142
0
0
I too concur, good sir! So far I'm really impressed with the visuals of The Witcher 3: Wild Hunt, Killzone Shadow Fall and The Division. I think some people feel that the current launch titles only look marginally better than this generation's high-end titles, and maybe that is compounded by the fact that the majority of these next-gen titles are running natively at 1080p at a solid 30 FPS instead of 30 FPS at 720p.

The next-gen consoles do have a lot of power compared to their predecessors, so with that power these new games are getting all the cool features that have become more and more commonplace among today's AAA PC games. Things like native 1080p at 60 FPS, high-resolution textures, particle effects, high dynamic range lighting, better anti-aliasing techniques, post-processing, texture filtering, tessellation, PhysX, etc. People who for the past 8 years have grown used to seeing "high-end" console games that barely hit 30 FPS at 720p (The Last of Us, Borderlands 2, and Halo 4, for example) are now seeing new titles like Battlefield 4 or Killzone on PS4 looking and sporting features similar or very close to their PC counterparts, and they're either blown away, mildly impressed, or, like many others, simply don't care about graphics.
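To put that resolution and framerate jump into rough numbers, here's a quick back-of-the-envelope sketch (raw pixel counts only; real GPU cost also depends on shading, anti-aliasing, post-processing and so on):

```python
# Pixels shaded per second: "high-end" last-gen target vs. the new native targets.
last_gen = 1280 * 720 * 30       # 720p at 30 FPS
next_gen_30 = 1920 * 1080 * 30   # native 1080p at 30 FPS
next_gen_60 = 1920 * 1080 * 60   # native 1080p at 60 FPS

print(next_gen_30 / last_gen)    # 2.25x the raw pixel throughput
print(next_gen_60 / last_gen)    # 4.5x the raw pixel throughput
```

So even before any fancier effects, hitting native 1080p at 60 FPS means pushing roughly four and a half times the pixels per second of a 720p/30 game.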

I agree with the OP, though: high-end games will probably only get better and better looking as this console generation moves forward.

-Hawk
 

Foolery

No.
Jun 5, 2013
1,714
0
0
Diminishing returns anyway. Graphics should be the last thing to focus on when it comes to game design. I'm more interested in what kind of new open worlds or AI can be created on next-gen hardware.
 

Dragonbums

Indulge in it's whiffy sensation
May 9, 2013
3,307
0
0
Dirty Hipsters said:
Dragonbums said:
I would say you are half right in this.
Yes, graphics will eventually get better later in the generation.
However, the differences will hardly be stark.
I mean, when I saw the first Assassin's Creed game (without knowing what it was at the time), I was under the impression that my friend was watching a movie. That was on the Xbox 360. What more can you possibly get out of graphical fidelity at this point? There will come a point where graphics become moot for people buying a console, and they'll look for other things instead. And honestly, saying "smarter AI" isn't really going to cut it for a lot of people who don't care all that much, and it only applies to games that actually benefit from having intelligent AI in the first place.
While Assassin's Creed 1 was a great-looking game, there is no way I can believe that you thought it was a movie, not unless you have really poor eyesight. If you get in close to pretty much any texture in Assassin's Creed, it looks pretty bad, just like the textures in any big open-world game.

Also, exactly what kind of game wouldn't be improved by having better AI? Shooters? Improved by better AI. Stealth games? Improved by better AI. Survival horror games? Improved by better AI. Sports games? Improved by better AI.

The only games that wouldn't be improved by having smarter AI are games where there are no enemies.
It's not like I shoved my face into the screen. I just happened to pass by at the time and glance at the screen before heading off with my friends. Not before saying "what kind of movie is that?", at which point he promptly told me it was a video game.

On that note, I feel that a lot of AI is pretty good as it is now, unless the devs just didn't give much of a crap about AI smarts. Even today I can find games where the AI gets smarter the higher the difficulty is.

I mean, when I play the Battle Tower in Pokemon, the longer my win streak gets, the smarter the AI becomes. To the point where you actually have to know what you're up against, which Pokemon to send out, how your Pokemon's stats compare to the opponent's, etc. There is a reason why most people can't get the 100-win streak. The AI gets so smart in those battles it's insane.
We already have the tech for superior AI. However, game devs don't put that many resources into smarter AI. They simply make it passable and focus on something else.
 

Dirty Hipsters

This is how we praise the sun!
Legacy
Feb 7, 2011
8,802
3,383
118
Country
'Merica
Gender
3 children in a trench coat
Dragonbums said:
Dirty Hipsters said:
Dragonbums said:
I would say you are half right in this.
Yes, graphics will eventually get better later in the generation.
However, the differences will hardly be stark.
I mean, when I saw the first Assassin's Creed game (without knowing what it was at the time), I was under the impression that my friend was watching a movie. That was on the Xbox 360. What more can you possibly get out of graphical fidelity at this point? There will come a point where graphics become moot for people buying a console, and they'll look for other things instead. And honestly, saying "smarter AI" isn't really going to cut it for a lot of people who don't care all that much, and it only applies to games that actually benefit from having intelligent AI in the first place.
While Assassin's Creed 1 was a great-looking game, there is no way I can believe that you thought it was a movie, not unless you have really poor eyesight. If you get in close to pretty much any texture in Assassin's Creed, it looks pretty bad, just like the textures in any big open-world game.

Also, exactly what kind of game wouldn't be improved by having better AI? Shooters? Improved by better AI. Stealth games? Improved by better AI. Survival horror games? Improved by better AI. Sports games? Improved by better AI.

The only games that wouldn't be improved by having smarter AI are games where there are no enemies.
It's not like I shoved my face into the screen. I just happened to pass by at the time and glance at the screen before heading off with my friends. Not before saying "what kind of movie is that?", at which point he promptly told me it was a video game.

On that note, I feel that a lot of AI is pretty good as it is now, unless the devs just didn't give much of a crap about AI smarts. Even today I can find games where the AI gets smarter the higher the difficulty is.

I mean, when I play the Battle Tower in Pokemon, the longer my win streak gets, the smarter the AI becomes. To the point where you actually have to know what you're up against, which Pokemon to send out, how your Pokemon's stats compare to the opponent's, etc. There is a reason why most people can't get the 100-win streak. The AI gets so smart in those battles it's insane.
We already have the tech for superior AI. However, game devs don't put that many resources into smarter AI. They simply make it passable and focus on something else.
Wow, they managed to make somewhat smart AI in a one-on-one, turn-based strategy game? Tell me more about how advanced that is. That's not particularly impressive. It's not much different from programming a computer to play chess, something that's taught in pretty much any advanced programming course.

What I'm talking about is AI that reacts in real time, and not just one, but groups of them. Imagine a shooter where, instead of every level being a shooting gallery where you kill hundreds of faceless mooks who run at you with no regard for their safety, you instead have to face hundreds of enemy soldiers who are smart, who adopt different strategies, flank you as a group, react to the different weapons you're using, anticipate how you're going to use cover, make use of covering fire, etc. Current-generation hardware tends to chug when you have more than 15 or 20 characters fighting on screen at once; now imagine increasing that number by a factor of 10 and having smart, independent AI for each one. It has the potential to be amazing.
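For what it's worth, the turn-based, chess-style AI being dismissed here usually comes down to a minimax search over possible moves. A minimal sketch of the idea in Python, assuming a hypothetical game-state object with legal_moves(), apply(), is_terminal() and evaluate() methods (none of this comes from a real engine):

```python
# Minimal sketch of turn-based, chess-style AI: search ahead a few moves and
# pick the one that leads to the best reachable position.
# The GameState methods used here are hypothetical stand-ins.

def minimax(state, depth, maximizing):
    """Score a position by searching `depth` moves ahead."""
    if depth == 0 or state.is_terminal():
        return state.evaluate()                      # static score of this position
    scores = [minimax(state.apply(move), depth - 1, not maximizing)
              for move in state.legal_moves()]
    return max(scores) if maximizing else min(scores)

def best_move(state, depth=3):
    """Pick the move whose resulting position minimax rates highest."""
    return max(state.legal_moves(),
               key=lambda move: minimax(state.apply(move), depth - 1, False))
```

The real-time squad behaviour described above is a harder problem precisely because there is no turn structure: every agent has to re-evaluate its situation every frame, within a fixed time budget, alongside rendering and physics.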
 

WouldYouKindly

New member
Apr 17, 2011
1,431
0
0
See, now if consoles had been working with standard PC architecture for the past generation, I think you'd have seen a fairly noticeable leap in graphical quality.
Dead Century said:
Diminishing returns anyway. Graphics should be the last thing to focus on when it comes to game design. I'm more interested in what kind of new open worlds or AI can be created on next-gen hardware.
So you'd be fine with things being text adventures or looking like Dwarf Fortress?

No, graphics have a place, like advertising has a place, much to my chagrin. The presentation of your work can vastly affect its initial reception. After all, go find a short gameplay video of, say, BioShock Infinite. Then take a look at the most recent uberturd, Ride to Hell: Retribution, and tell me you'd buy the second one based on a first impression of how the graphics look.

It's like why a dealership polishes all of its cars. It's not necessary for their function, but it can really help the first impression they make. Good first impressions can improve sales vastly.

That being said, you don't stop buying new cars or hiring the right employees in order to pay for car polish, but it serves a purpose. Do it when it's in the budget. If it's not, take another look at the budget and see whether it's even possible to allocate the necessary resources, or whether you can still make something good with less impressive graphics.
 

Dragonbums

Indulge in it's whiffy sensation
May 9, 2013
3,307
0
0
Dirty Hipsters said:
Dragonbums said:
Dirty Hipsters said:
Dragonbums said:
I would say you are half right in this.
Yes, graphics will eventually get better later in the generation.
However, the differences will hardly be stark.
I mean, when I saw the first Assassin's Creed game (without knowing what it was at the time), I was under the impression that my friend was watching a movie. That was on the Xbox 360. What more can you possibly get out of graphical fidelity at this point? There will come a point where graphics become moot for people buying a console, and they'll look for other things instead. And honestly, saying "smarter AI" isn't really going to cut it for a lot of people who don't care all that much, and it only applies to games that actually benefit from having intelligent AI in the first place.
While Assassin's Creed 1 was a great-looking game, there is no way I can believe that you thought it was a movie, not unless you have really poor eyesight. If you get in close to pretty much any texture in Assassin's Creed, it looks pretty bad, just like the textures in any big open-world game.

Also, exactly what kind of game wouldn't be improved by having better AI? Shooters? Improved by better AI. Stealth games? Improved by better AI. Survival horror games? Improved by better AI. Sports games? Improved by better AI.

The only games that wouldn't be improved by having smarter AI are games where there are no enemies.
It's not like I shoved my face into the screen. I just happened to pass by at the time and glance at the screen before heading off with my friends. Not before saying "what kind of movie is that?", at which point he promptly told me it was a video game.

On that note, I feel that a lot of AI is pretty good as it is now, unless the devs just didn't give much of a crap about AI smarts. Even today I can find games where the AI gets smarter the higher the difficulty is.

I mean, when I play the Battle Tower in Pokemon, the longer my win streak gets, the smarter the AI becomes. To the point where you actually have to know what you're up against, which Pokemon to send out, how your Pokemon's stats compare to the opponent's, etc. There is a reason why most people can't get the 100-win streak. The AI gets so smart in those battles it's insane.
We already have the tech for superior AI. However, game devs don't put that many resources into smarter AI. They simply make it passable and focus on something else.
Wow, they managed to make somewhat smart AI in a one-on-one, turn-based strategy game? Tell me more about how advanced that is. That's not particularly impressive. It's not much different from programming a computer to play chess, something that's taught in pretty much any advanced programming course.

What I'm talking about is AI that reacts in real time, and not just one, but groups of them. Imagine a shooter where, instead of every level being a shooting gallery where you kill hundreds of faceless mooks who run at you with no regard for their safety, you instead have to face hundreds of enemy soldiers who are smart, who adopt different strategies, flank you as a group, react to the different weapons you're using, anticipate how you're going to use cover, make use of covering fire, etc. Current-generation hardware tends to chug when you have more than 15 or 20 characters fighting on screen at once; now imagine increasing that number by a factor of 10 and having smart, independent AI for each one. It has the potential to be amazing.
Again, that is perfectly possible with today's technology. However, devs do not put that many resources into AI.
If they can develop AI that can find the flaw in a move you made in a turn-based strategy game (which is no less a form of AI intelligence, though I appreciate the snarky attitude), then they can certainly do so with 20 soldiers reacting in real time. It is all a matter of how much of a priority that is for game devs in the first place, and how much money they would have to cut from other parts of the game to make such a thing possible.
 

Dragonbums

Indulge in it's whiffy sensation
May 9, 2013
3,307
0
0
WouldYouKindly said:
See, now if consoles had been working with standard PC architecture for the past generation, I think you'd have seen a fairly noticeable leap in graphical quality.
Dead Century said:
Diminishing returns anyway. Graphics should be the last thing to focus on when it comes to game design. I'm more interested in what kind of new open worlds or AI can be created on next-gen hardware.
If consoles worked to PC-level standards (which PC doesn't really have, since so many people have computers that vary in age and capability), then consoles would be too expensive for people to buy, or would have zero appeal because people would simply buy a computer.
 

Dirty Hipsters

This is how we praise the sun!
Legacy
Feb 7, 2011
8,802
3,383
118
Country
'Merica
Gender
3 children in a trench coat
Dragonbums said:
Dirty Hipsters said:
Dragonbums said:
Dirty Hipsters said:
Dragonbums said:
I would say you are half right in this.
Yes, graphics will eventually get better later in the generation.
However, the differences will hardly be stark.
I mean, when I saw the first Assassin's Creed game (without knowing what it was at the time), I was under the impression that my friend was watching a movie. That was on the Xbox 360. What more can you possibly get out of graphical fidelity at this point? There will come a point where graphics become moot for people buying a console, and they'll look for other things instead. And honestly, saying "smarter AI" isn't really going to cut it for a lot of people who don't care all that much, and it only applies to games that actually benefit from having intelligent AI in the first place.
While Assassin's Creed 1 was a great-looking game, there is no way I can believe that you thought it was a movie, not unless you have really poor eyesight. If you get in close to pretty much any texture in Assassin's Creed, it looks pretty bad, just like the textures in any big open-world game.

Also, exactly what kind of game wouldn't be improved by having better AI? Shooters? Improved by better AI. Stealth games? Improved by better AI. Survival horror games? Improved by better AI. Sports games? Improved by better AI.

The only games that wouldn't be improved by having smarter AI are games where there are no enemies.
It's not like I shoved my face into the screen. I just happened to pass by at the time and glance at the screen before heading off with my friends. Not before saying "what kind of movie is that?", at which point he promptly told me it was a video game.

On that note, I feel that a lot of AI is pretty good as it is now, unless the devs just didn't give much of a crap about AI smarts. Even today I can find games where the AI gets smarter the higher the difficulty is.

I mean, when I play the Battle Tower in Pokemon, the longer my win streak gets, the smarter the AI becomes. To the point where you actually have to know what you're up against, which Pokemon to send out, how your Pokemon's stats compare to the opponent's, etc. There is a reason why most people can't get the 100-win streak. The AI gets so smart in those battles it's insane.
We already have the tech for superior AI. However, game devs don't put that many resources into smarter AI. They simply make it passable and focus on something else.
Wow, they managed to make somewhat smart AI in a one-on-one, turn-based strategy game? Tell me more about how advanced that is. That's not particularly impressive. It's not much different from programming a computer to play chess, something that's taught in pretty much any advanced programming course.

What I'm talking about is AI that reacts in real time, and not just one, but groups of them. Imagine a shooter where, instead of every level being a shooting gallery where you kill hundreds of faceless mooks who run at you with no regard for their safety, you instead have to face hundreds of enemy soldiers who are smart, who adopt different strategies, flank you as a group, react to the different weapons you're using, anticipate how you're going to use cover, make use of covering fire, etc. Current-generation hardware tends to chug when you have more than 15 or 20 characters fighting on screen at once; now imagine increasing that number by a factor of 10 and having smart, independent AI for each one. It has the potential to be amazing.
Again, that is perfectly possible with today's technology. However, devs do not put that many resources into AI.
If they can develop AI that can find the flaw in a move you made in a turn-based strategy game (which is no less a form of AI intelligence, though I appreciate the snarky attitude), then they can certainly do so with 20 soldiers reacting in real time. It is all a matter of how much of a priority that is for game devs in the first place, and how much money they would have to cut from other parts of the game to make such a thing possible.
Really? You think what I said is possible with 512 MB of RAM? Maybe you should be programming games then, since you can apparently make miracles happen with badly underpowered hardware.

The current-gen consoles can barely even render 200 AI characters on screen, much less 200 intelligent AIs that can work independently of each other.
 

Phrozenflame500

New member
Dec 26, 2012
1,080
0
0
To be frank, I'd prefer it if next-gen games' graphics were unimpressive and the devs used the extra power to render larger levels and build better AI.

I do agree, though, that the launch titles aren't indicative of the entire upcoming generation, and we should wait a bit before making a decisive judgement.
 

Something Amyss

Aswyng and Amyss
Dec 3, 2008
24,759
0
0
Dragonbums said:
I would say you are half right in this.
Yes, graphics will eventually get better later in the generation.
However, the differences will hardly be stark.
I mean, when I saw the first Assassin's Creed game (without knowing what it was at the time), I was under the impression that my friend was watching a movie. That was on the Xbox 360. What more can you possibly get out of graphical fidelity at this point? There will come a point where graphics become moot for people buying a console, and they'll look for other things instead. And honestly, saying "smarter AI" isn't really going to cut it for a lot of people who don't care all that much, and it only applies to games that actually benefit from having intelligent AI in the first place.
I can't wait until you need a 40' (not 40") screen to tell the difference between graphics settings.

PC gamers will still insist they can tell and complain if a game only has 200XMSAA instead of 202X.

Dirty Hipsters said:
While Assassin's Creed 1 was a great-looking game, there is no way I can believe that you thought it was a movie, not unless you have really poor eyesight. If you get in close to pretty much any texture in Assassin's Creed, it looks pretty bad, just like the textures in any big open-world game.
You know, when they first started showing these things called "moving pictures," an image of a train coming at the screen was enough to freak people out because they thought it was going to come off the screen. If those primitive move-ies could impact people, I can't believe you are so jaded as to think Assassin's Creed couldn't. Especially when you consider that the "textures" aren't necessarily good in CG films, either. Hell, there was a time when people were blown away by Christopher Reeve looking like he was "really flying." If I remember right, the same was true of George Reeves back when he wore his underwear outside of his tights, too.

That's a loooooooooooooooooooot of people with bad vision.

Or, alternatively, it's not about vision but perception.

Would we even need better graphics at this point if we weren't told so? Would people be disappointed if the expectation hadn't been set?
 

Amir Kondori

New member
Apr 11, 2013
932
0
0
While the games will certainly look better as time goes on, there is a concept called diminishing returns that kicks in heavily with things like polygon-based graphics.

Basically, the better they get, the more polygons you need to get a noticeable improvement in image quality.

If you design a 3D face using 1,000 polygons and then get the power to design that same face using 2,000 polygons, you'll see a very large difference in quality between the two.

On the other hand, if you are rendering a face using 100,000 polygons and double that to 200,000, the effect will be much less pronounced. This is because the face made with 100,000 polygons already looked pretty good.
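To put rough numbers on that, here's a toy calculation. It assumes (purely for illustration, not a measurement of real game assets) that a curved silhouette behaves like a circle approximated by a regular N-gon, so the leftover geometric error shrinks roughly with the square of the polygon count:

```python
import math

def silhouette_error(n_polygons):
    """Relative area error when a circle is approximated by a regular n-gon.
    A stand-in for how closely a curved surface is matched as polygon count grows."""
    return 1 - (n_polygons / (2 * math.pi)) * math.sin(2 * math.pi / n_polygons)

for lo, hi in [(1_000, 2_000), (100_000, 200_000)]:
    gained = silhouette_error(lo) - silhouette_error(hi)   # error removed by doubling
    print(f"{lo:>7} -> {hi:>7} polygons: error drops by {gained:.2e}")
```

In this toy model, the 1,000 to 2,000 doubling removes roughly ten thousand times more error than the 100,000 to 200,000 doubling does, which is one way of quantifying why the later jumps are so much harder to see.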

The second thing that compounds this issue is that fidelity costs money. Game development costs and developer team sizes haven't risen for nothing. They have risen so much because it takes a lot of artists a lot of time to make such high-quality assets, to animate them, etc.

So I think we will definitely see a lot of people upset that the graphics are not as huge a leap forward as in previous console generations. I think we do have some things to look forward to, though.

1. Dynamic lighting is a big part of how realistic a scene looks. People often don't understand just how much verisimilitude accurate, real-time lighting adds until they see a game that implements it well. Real-time shadows, light sources that can move, multiple reflections (see the sketch after this list).

2. More physics simulation. With the increased power of the new consoles I think we will see more games doing interesting things with fluid simulation and deformable terrain and other things we haven't thought of yet.
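As a rough illustration of point 1, the simplest building block of dynamic lighting is the Lambertian diffuse term, re-evaluated every frame so the light source is free to move. This is only a minimal sketch in plain Python; real engines compute the equivalent per pixel on the GPU, and this leaves out shadows and reflections entirely:

```python
import math

def diffuse(normal, surface_pos, light_pos, light_color):
    """Lambertian diffuse term: brightness scales with the cosine of the angle
    between the surface normal and the direction to the light (clamped at 0)."""
    to_light = [l - s for l, s in zip(light_pos, surface_pos)]
    length = math.sqrt(sum(c * c for c in to_light)) or 1.0
    to_light = [c / length for c in to_light]
    n_dot_l = max(0.0, sum(n * l for n, l in zip(normal, to_light)))
    return [c * n_dot_l for c in light_color]

# Because the formula is re-evaluated every frame, the light can move freely,
# which is what "dynamic" buys you over lighting baked into the textures.
for frame in range(3):
    angle = frame * 0.5
    light = (5 * math.cos(angle), 4.0, 5 * math.sin(angle))
    print(diffuse((0.0, 1.0, 0.0), (0.0, 0.0, 0.0), light, (1.0, 1.0, 1.0)))
```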

There is also progress being made on decreasing the cost of development and building better development tools, which may make it easier for developers to up the fidelity of their games. Still, no one should expect a huge leap in graphical fidelity, especially in the first one or two years of these new consoles.
 

Dragonbums

Indulge in it's whiffy sensation
May 9, 2013
3,307
0
0
Dirty Hipsters said:
Dragonbums said:
Dirty Hipsters said:
Dragonbums said:
Dirty Hipsters said:
Dragonbums said:
I would say you are half right in this.
Yes, graphics will eventually get better later in the generation.
However, the differences will hardly be stark.
I mean, when I saw the first Assassin's Creed game (without knowing what it was at the time), I was under the impression that my friend was watching a movie. That was on the Xbox 360. What more can you possibly get out of graphical fidelity at this point? There will come a point where graphics become moot for people buying a console, and they'll look for other things instead. And honestly, saying "smarter AI" isn't really going to cut it for a lot of people who don't care all that much, and it only applies to games that actually benefit from having intelligent AI in the first place.
While Assassin's Creed 1 was a great-looking game, there is no way I can believe that you thought it was a movie, not unless you have really poor eyesight. If you get in close to pretty much any texture in Assassin's Creed, it looks pretty bad, just like the textures in any big open-world game.

Also, exactly what kind of game wouldn't be improved by having better AI? Shooters? Improved by better AI. Stealth games? Improved by better AI. Survival horror games? Improved by better AI. Sports games? Improved by better AI.

The only games that wouldn't be improved by having smarter AI are games where there are no enemies.
It's not like I shoved my face into the screen. I just happened to pass by at the time and glance at the screen before heading off with my friends. Not before saying "what kind of movie is that?", at which point he promptly told me it was a video game.

On that note, I feel that a lot of AI is pretty good as it is now, unless the devs just didn't give much of a crap about AI smarts. Even today I can find games where the AI gets smarter the higher the difficulty is.

I mean, when I play the Battle Tower in Pokemon, the longer my win streak gets, the smarter the AI becomes. To the point where you actually have to know what you're up against, which Pokemon to send out, how your Pokemon's stats compare to the opponent's, etc. There is a reason why most people can't get the 100-win streak. The AI gets so smart in those battles it's insane.
We already have the tech for superior AI. However, game devs don't put that many resources into smarter AI. They simply make it passable and focus on something else.
Wow, they managed to make somewhat smart AI in a one-on-one, turn-based strategy game? Tell me more about how advanced that is. That's not particularly impressive. It's not much different from programming a computer to play chess, something that's taught in pretty much any advanced programming course.

What I'm talking about is AI that reacts in real time, and not just one, but groups of them. Imagine a shooter where, instead of every level being a shooting gallery where you kill hundreds of faceless mooks who run at you with no regard for their safety, you instead have to face hundreds of enemy soldiers who are smart, who adopt different strategies, flank you as a group, react to the different weapons you're using, anticipate how you're going to use cover, make use of covering fire, etc. Current-generation hardware tends to chug when you have more than 15 or 20 characters fighting on screen at once; now imagine increasing that number by a factor of 10 and having smart, independent AI for each one. It has the potential to be amazing.
Again, that is perfectly possible with today's technology. However, devs do not put that many resources into AI.
If they can develop AI that can find the flaw in a move you made in a turn-based strategy game (which is no less a form of AI intelligence, though I appreciate the snarky attitude), then they can certainly do so with 20 soldiers reacting in real time. It is all a matter of how much of a priority that is for game devs in the first place, and how much money they would have to cut from other parts of the game to make such a thing possible.
Really? You think what I said is possible with 512 MB of RAM? Maybe you should be programming games then, since you can apparently make miracles happen with badly underpowered hardware.

The current-gen consoles can barely even render 200 AI characters on screen, much less 200 intelligent AIs that can work independently of each other.
So when someone disagrees with you, you go the "why don't you make your own games" route?
I guess you missed the part where I said that in order for them to achieve what you are looking for, they would have to take stuff out of other aspects of the game.
Also, how do you know, precisely, that the horribly limiting hardware is what's preventing them from making better AI?
How do you know that devs simply don't care all that much about smart AI and would rather put more resources into other things?
You getting all fired up because I'm not agreeing with you about games being held back by AI intelligence is ridiculous.
I politely gave you an alternative take on the AI situation, simply saying that they could accomplish it at the cost of sacrificing other things, and you took it upon yourself to act unnecessarily snarky towards me and are now telling me to "make my own games."
It's not like I said it's easy work.
 

Headdrivehardscrew

New member
Aug 22, 2011
1,660
0
0
I think it's important to acknowledge that with the upcoming, newest generation of 'console' hardware, we'll finally get proper HD - the same HD we were promised with the current generation, of which we only got, what, 480p or 576p blown up to 720p.

The specs we are promised now should be enough to solve the tunnel-vision conundrum. They should end the texture pop-up, the tree pop-up, and the abysmally crap level design that has haunted many a game, crap or swell. We will have enough RAM to do things properly, not just pretend to while doing some coding trickery and essentially spinning in a circle on the spot.

Since both contenders have dropped the PowerPC architecture as a 'DRM'/deterrent, I would expect development to get a boost, if it hasn't already.

The 256/256 or 512 MB RAM bottleneck was a nasty one. I'm glad it will all be over soon. Don't get me wrong, there are plenty of games I love on console. But all the more complicated titles (and designers, dev teams, coders, players, etc.) are suffering because of the anemic resources. If the new, seemingly over-abundant resources (which are just state-of-the-art, really) are handled correctly, we should see no more, or at least absolutely minimal, pop-up of anything, and level design should be much more fun, let alone the resulting ride. People who say a lot more is possible with the 512 MB RAM total either have no clue or are plain delusional.