Ok, so your examples of games proving what consoles can do are:
-Skyrim, the game that didn't exactly work on the PS3, because the PS3 had too little RAM to run it effectively after a certain number of hours of play. The PS3's hardware left many unable to play the game at all, and Bethesda had to do serious optimisation work to try and let them do so. On top of that, Skyrim was rather meh: 1-3 enemies on screen at a time, terrible textures, idiotic enemy AI, low levels of detail in things like clutter and vegetation, loading screens everywhere, and a near-completely static world with no consequences for any of your actions. Yeah... Gee, consoles can do so much. They can't even run a mediocre game.
-GTA V. The game that required a hard drive install and a disc in the tray to run properly. That screws over digital distribution, and anyone who doesn't have the hard drive space to install it. I haven't properly played the game myself as it's not out for PC yet, but I wouldn't be surprised if it were similar to Skyrim with a better-written main storyline and a couple of shader effects on top. Again, this simply showcases the limitations of consoles: a game made exclusively for consoles has to stream from the disc and the hard drive at once just to run.
-Crysis 3. Umm... What? If graphics is your argument then you should look at PC vs console comparisons - hell, Crysis 1 on PC vs Crysis 3 on console. The graphics on current gen consoles are actually quite poor, and a lot of the depth that was in Crysis 1 was sacrificed in 2 and 3 in order to get them running better on consoles... Yeah, no, that's not showing me that consoles are useful. I guess you could argue some of its enemy AI was better than average, though that doesn't really say much, but otherwise... Why is this an example? The Crysis series has gone from the 'But can it run Crysis?' pride of PC gaming to something I often hear criticised for being 'consolified' - simplified and linearised for the sake of consoles. That's hardly supportive of your 'we don't need a new generation' argument.
Simply put, the current generation is out of date. Better hardware will yield better results in a number of ways:
RAM:
Perhaps the most obvious area for improvement. Current consoles have a grand total of 512MB of RAM - on the PS3 that's split into 256MB for graphics and 256MB for the system. This means all of a game's textures have to fit into 256MB of memory. That's maybe 20 individual high-quality textures - and think about what needs one: the sky, each different terrain texture [beach counts as different to pavement], each building's unique texture, each usable door's unique texture, one texture for every unique-looking character, textures for equipment, water, clouds, each different type of rock, each different type of tree. The number of textures a game needs is phenomenal, and greatly surpasses the 20-odd you can fit at high quality in 256MB. And textures aren't the only thing that memory is needed for either: meshes for models, shader information - anything to do with graphics gets put in the GPU's RAM. Ever found it annoying that there are only 5 types of tree in a whole world? 5 types of rock? 5 types of building? That textures pop in after 10 seconds or so, looking like they're from 1995 until they do? Some games work around this with horribly low quality textures, of which they can fit more without stressing the system, but even they are limited. If you ever find it annoying that half the objects in a game seem to be cloned from other objects, this is why.
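To put some rough numbers on that, here's a back-of-envelope sketch. The figures assume uncompressed 2048x2048 RGBA textures with full mipmap chains - an illustrative assumption, not real engine data, since actual games use compressed formats - but the scale of the problem is the point:

```python
# Back-of-envelope texture memory maths. Sizes assume uncompressed RGBA
# textures - an illustrative assumption, not figures from any real engine.

VRAM_MB = 256  # graphics memory available to a PS3 game

def texture_mb(width, height, bytes_per_pixel=4, mipmaps=True):
    """Rough size of one texture in MB."""
    base = width * height * bytes_per_pixel / (1024 * 1024)
    # A full mipmap chain adds roughly a third on top of the base level.
    return base * (4 / 3) if mipmaps else base

high_quality = texture_mb(2048, 2048)  # ~21.3 MB each
low_quality = texture_mb(512, 512)     # ~1.3 MB each

print(f"High-quality textures that fit: {int(VRAM_MB // high_quality)}")  # ~12
print(f"Low-quality textures that fit:  {int(VRAM_MB // low_quality)}")   # ~192
```

Compression buys you a few times more than that, but framebuffers, meshes and shadow maps all come out of the same 256MB too, so the budget shrinks fast either way.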
Beyond the graphics side of things, however, comes gameplay. Gameplay these days: linear corridors. Small levels. Up to 5, maybe 8 enemies on screen at a time. Non-destructible environments. Despawning and respawning items, vehicles and equipment. Whilst some of these are also design decisions, they are all limitations of current console hardware. FPS games these days are linear corridor shoot-fests because 256MB of system memory isn't enough to load anything else. Sandbox worlds are filled with loading screens because, were they not, a console couldn't handle the transition from the overworld to a city or dungeon without crashing. The limited RAM means the system can't keep track of many entities when there are too many on screen. Item positions can't be permanently saved when dropped, as keeping track of that in memory takes it away from other things like characters and the world itself. Environments can't be destroyed, as consoles would struggle to remember which parts had and hadn't been destroyed, and in what way. All the "shooters are all the same" stuff that gets said these days is a symptom of consoles being outdated. This has leaked across to PC games now, where titles that were originally great on the PC - like Battlefield - have been simplified down to an empty, open map with a few hotspots of conflict for the sake of consoles. Proper terrain destruction was initially removed from BF3 because it was highly exploitable in the beta, but once that was fixed they decided not to re-implement it because it was too taxing on consoles. Basically, level design and gameplay mechanics have to take a hit because consoles can't handle large maps with large numbers of enemies.
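To give a feel for why persistence costs memory, here's a toy calculation. The record sizes and counts below are invented purely for illustration, but they show the kind of budget involved:

```python
# Toy persistence budget - record sizes and counts are made up for illustration.

ITEM_RECORD_BYTES = 64        # id, position, rotation, owner, state flags...
DESTRUCTION_CELL_BYTES = 16   # which chunk, what kind of damage, how much

dropped_items = 50_000        # every sword, bucket and cup dropped in a playthrough
destroyed_cells = 2_000_000   # fine-grained destructible terrain across a large map

items_mb = dropped_items * ITEM_RECORD_BYTES / (1024 * 1024)
terrain_mb = destroyed_cells * DESTRUCTION_CELL_BYTES / (1024 * 1024)

print(f"Dropped item tracking:   {items_mb:.1f} MB")    # ~3.1 MB
print(f"Terrain destruction map: {terrain_mb:.1f} MB")  # ~30.5 MB

# Trivial next to the several GB in a gaming PC; a serious chunk of a 256MB
# system pool that also has to hold the AI, physics, scripts and everything else.
```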
More RAM also allows for multitasking, so whilst you're playing Skyrim, your console can be installing the latest GTA V update in the background. With low RAM you can't do this without a large drop in performance in whatever game you're playing. Of course, such drops are ubiquitous in console gaming these days, and a lot of people have gotten used to them. Just because you expect games to lag now doesn't mean they have to.
CPU:
Another area of improvement. A better CPU will allow for larger maps, more enemies on screen, better graphics, realistic physics simulations, destructible environments, better AI, greater persistence, increased performance, better multitasking - quite literally everything can be improved with a CPU upgrade. A CPU upgrade on its own, however, will result in bottlenecks elsewhere; not upgrading the CPU results in a bottleneck at the CPU. The CPU has to process everything that happens in game - on screen or not. The more that happens at one time, the more stressed the CPU is. This is another reason consoles are normally stuck with 4-10 enemies on screen at once: the CPU can't think for any more of them while it's also processing player inputs and turning them into actions, calculating the persistence of items, preparing information to send to the GPU to draw, calculating the physics with which the player should fall after jumping, how that car handles, the damage that gun deals - literally everything going on in game. Why are weapons in console games so often simple hitscan weapons that deal the same damage at any range? Because calculating a projectile's path, and reducing its damage based on even a simple factor like distance, is taxing for console CPUs. Some games do both, some do one or the other, a lot do neither.
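To make the hitscan vs projectile point concrete, here's a simplified sketch. The falloff numbers and physics step are invented for illustration, not taken from any particular game:

```python
def hitscan_damage(base_damage):
    """Hitscan: one ray test the moment you fire, full damage at any range."""
    return base_damage

def falloff_damage(base_damage, distance, falloff_start=20.0, falloff_end=100.0,
                   min_multiplier=0.3):
    """Damage falloff with distance - a small extra calculation per hit."""
    if distance <= falloff_start:
        return base_damage
    if distance >= falloff_end:
        return base_damage * min_multiplier
    t = (distance - falloff_start) / (falloff_end - falloff_start)
    return base_damage * (1 - t * (1 - min_multiplier))

def step_projectile(x, y, vx, vy, gravity=9.8, dt=1 / 60, max_steps=600):
    """The real cost: a simulated bullet must be updated every frame until it lands."""
    for _ in range(max_steps):
        x += vx * dt
        y += vy * dt
        vy -= gravity * dt
        if y <= 0:  # hit the ground (a real game would also test against the world)
            break
    return x, y

print(falloff_damage(100, 60.0))              # 65.0 damage at 60m instead of 100
print(step_projectile(0.0, 1.5, 400.0, 0.0))  # where the round actually lands
```

A hitscan shot is one ray test the instant you fire; a simulated projectile has to be updated every frame, for every bullet in the air, on top of everything else the CPU is doing that frame.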
Basically, if you want ANYTHING to improve in a console game, you need to upgrade the CPU.
GPU:
What it says on the box. Better graphics performance, and less strain on the CPU and RAM, as a stronger GPU can take nearly the entire graphics workload onto itself rather than offloading tasks to the rest of the system because it isn't powerful enough to handle them.
Optimisation:
Presently it costs a lot of money and time to optimise games for consoles. They use strange architectures - especially the PS3 - and have very low-powered hardware, so devs have to spend an inordinate amount of time optimising games for them: making full use of the PS3's Cell CPU by sending each different type of calculation to the appropriate one of its six available SPUs; reducing texture sizes so the consoles aren't strained; making things like guns bigger on screen so the console doesn't have to draw what's behind them, meaning a gun that takes up a third of your screen is actually favourable because it's cheaper to render the scene. And even with all this, it doesn't always work out. Case in point: Skyrim on the PS3.
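As a toy picture of the kind of job-sorting that design forces on developers - this isn't real PS3 code, and the job types and unit assignments are made up for illustration - it's just the general shape of the extra bookkeeping:

```python
# Toy illustration of routing work to specialised units, Cell-style.
# Not real PS3 code; job types and assignments are invented.

from collections import defaultdict

SPU_ASSIGNMENT = {      # which specialised unit handles which kind of work
    "physics": 0,
    "animation": 1,
    "audio": 2,
    "particles": 3,
    "culling": 4,
    "decompression": 5,
}

def dispatch(jobs):
    """Group this frame's jobs by the unit that has to run them."""
    queues = defaultdict(list)
    for job_type, payload in jobs:
        queues[SPU_ASSIGNMENT[job_type]].append(payload)
    return queues

frame_jobs = [("physics", "ragdoll #3"), ("audio", "gunshot"), ("physics", "vehicle #1")]
for unit, queue in sorted(dispatch(frame_jobs).items()):
    print(f"SPU {unit}: {queue}")
```

Getting the right work onto the right unit, in the right-sized chunks, is a large part of where PS3 optimisation time goes.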
All the next gen consoles, however, will use PC-style x86 architecture. This makes porting from PC to console and vice versa a lot easier. It also means devs don't have to waste time programming for several different architectures for a multi-platform release. x86 is also a well-known architecture that has been around for a LONG time - since 1978, if memory serves. Most devs should already have a good idea of how it works, seeing as a lot of programming courses likely focused on it and the PC.
Additionally, the new consoles have more power than devs know what to do with. This means they don't have to find ways to cut the game's resource usage by exorbitant amounts, and can spend the time they'd normally spend on that finding and fixing bugs and glitches, or expanding the game.
The new architecture, whilst meaningless for us, will make devs' lives a lot easier, likely reduce development time and budget requirements, and make multi-platform releases and ports easier.