viranimus said:
Joccaren said:
Nice failed attempt at a subversive insult. It might have worked better had the logic behind it not been so flawed or perhaps if the video had been embedded, but I went ahead and embedded it for you.
Ahh, so that's how you embed vids on the Escapist. I'm used to other sites where I have to put in frame width or some other crap I have no clue what it's asking for.
Anyway, it's more of a 'this is why you're wrong' comment than anything; it's just a good excuse to use Star Wars and videos.
LOL, no it's not. (Just using Intel as an example.) In 2007, $150 for a processor would likely have netted you a 2.5GHz dual core. Five years later, in 2012 (not even accounting for inflation, though $20 is not going to net you any increase in speed or cores), that same $150 MIGHT land you a 3.5GHz dual core. Using Moore's law is flawed because it is the application of theoretical principles in an environment of practical application, and practical application always mutilates theory. A 1GHz gain in the average available CPU over the course of a 5 year span is NOT proving Moore's law. Five years equates to 3.3 doublings, so that would mean we should see today 10.0GHz processors, or 16 cores, or some combination in between like 5.0GHz processors in octocore configurations. Moore's law is nothing more than an oversimplification to make cumbersome metrics more digestible. Really it should be called Moore's Suggestion, because at least that would be more accurate.
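For anyone who wants to check that arithmetic, here's a rough back-of-the-envelope sketch. The 18-month doubling period and the 'clock speed times cores' proxy for transistor count are my assumptions for the example, not anything Moore actually promised:

```python
# Back-of-the-envelope Moore's law check. Assumes an 18-month
# doubling period and treats (clock speed x cores) as a crude
# stand-in for transistor count -- both are simplifications.
years = 5
doublings = years * 12 / 18        # ~3.33 doublings in 5 years
factor = 2 ** doublings            # ~10x growth expected

base_ghz, base_cores = 2.5, 2      # the 2007-era $150 dual core
projected = base_ghz * base_cores * factor

print(f"{doublings:.2f} doublings -> {factor:.1f}x")
print(f"projected: ~{projected:.0f} GHz-cores, vs the ~7 GHz-cores "
      f"of a 2012 3.5GHz dual core")
```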
1. $150 got me a 3.5GHz quad core, 8 thread CPU that is meant for overclocking, and I have overclocked it to 4.8GHz. That is a commercial mid-range $150 CPU, not what the research and dev teams are coming up with in the technological centres of the world. Also note that the change in power with multi-core processors is different to a single-thread processor: double the transistors on a single-thread processor and you get about a 40% increase in power; on a multi-core one, about a 20% increase.
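That ~40% figure, incidentally, matches Pollack's Rule - the old rule of thumb that single-thread performance scales roughly with the square root of transistor count. Framing it that way is my addition, but the sum is easy to check:

```python
import math

# Pollack's Rule (an empirical rule of thumb, my framing here, not
# something from the post): single-thread perf ~ sqrt(transistors).
transistor_factor = 2.0
gain = math.sqrt(transistor_factor) - 1
print(f"doubling transistors -> ~{gain:.0%} single-thread gain")  # ~41%
```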
2. You are comparing comparably priced hardware to technology as a whole. When you say technology is not increasing exponentially, you are wrong. The best systems we have still grow at an exponential rate in power every 18 months, but those things aren't your average everyday computer. Add to that the fact that technology as a whole is not merely CPUs, but also GPUs, hard disk size and speed, RAM size and speed, motherboard tech - tech on the whole.
I chose Oblivion to Skyrim because it represents a game series with one entry that came out at the beginning of the 360's life and one that came out last year, which CAN represent the level of NEED for system resources. Comparing the "essentially" Windows exclusive Battlefield 2 vs the console optimized Battlefield 3 to Oblivion vs Skyrim is literally comparing apples and oranges.
It compares how well Devs can adapt to limited hardware, not how technology is advancing.
On a PC, Devs are given far higher limits, and can get more done. See BF2 to BF3.
Also, define 'need'. Back in the PS1 era we didn't 'need' any better systems; they were powerful enough as was. Tell me a PS1 game that compares to a PS3 game in overall quality. Fewer bugs? Yeah, that comes with updates being handed out over PSN and such. Other than that, the games are all-round lesser than PS3 games, dependent on the gameplay styles you like and whether you preferred the old styles or the new ones or don't care.
Whilst you may think ATM that these are the best graphics you could ever want, any PC player will tell you they look horrid. Low texture resolution, few or no shader effects, short view distances, small FoVs, bad framerates - the list goes on.
You're correct, system specs do not tell you what the hardware can do. They tell you what the developers were able to push out on that equivalent level of hardware. And it does not matter if you have 16 cores at 10.0GHz each: if you're running a PC exclusive release that recommends a 3.0GHz quad core, you're still only going to utilize it at that 3.0GHz quad core level. Again this is a case of hypothetical theory vs practical application, and in every single case practical application always wins. There is no imaginary pissing contest to win here.
But it does not tell us that technology doesn't advance, only that devs don't utilise it currently. Many titles have this problem because of consoles - having to optimise them to run on consoles, and putting time and money into that which could be spent making the game better if it were on good hardware. PC exclusives often don't utilise a lot of it either, as they can't leave out the lower end PC gamers whose PCs are 8 years old. I guarantee you though, if consoles start offering performance equal to those PCs, devs will lift their standards.
As for the rest, I do appreciate you illustrating my point perfectly. I mean, I am still stunned that anyone could suggest with a straight face that online competitive multiplayer is in a brilliantly fulfilling state. (Honestly, that made my day, 'cause it's always good to start out a day with a hysterically boisterous laugh.)
Online competitive multiplayer is enjoyed by more people than any other form of entertainment in the world. Argue against the facts if you will. You could go with your personal bias and claim that it's bad, or you could go with the evidence and acknowledge that the devs are apparently doing something right. I also have yet to see a viable 'solution' for the genre. What would you suggest they do? I guarantee it is a bad idea on the whole, as people will leave the game due to not liking it any more. When you have a formula people love, you don't change it. Hell, sometimes if people don't love it you don't change it - look at what happened to Coke when they changed their formula. When you have something like what is currently in competitive online multiplayer, you don't change it. Look at DA:O to DA2. Imagine that on a larger scale, a billion-people scale. Not what you want. What you do is make parts of it better. If that happens to be the graphics, so be it. Battlefield 3 looks stunning, and you do notice it.
But you are still illustrating how your opinion is skewed to favor the logic behind the FPS justifications, by using FPS logic that boils down to "Yes, we need a console generation because I am not satisfied with the equivalent of 2xAA on the bush I died next to, which I will look at for all of 10 seconds, when that bush should be at least in 32xAA." I didn't say FPS should look bad. I said if you're concerned about how "pretty" your field of death is (beyond a certain point), then you have serious prioritization issues, and there's no amount of bleeding edge hardware or system spec sheets that will ever satisfy that issue.
And this is why I hate people like you. You will over-exaggerate anything. 32xAA is not needed. What is needed is better shader effects, higher resolution textures, more detailed models - that sort of stuff.
To follow your trend of exaggeration, I assume we should go back to the 8-bit era in your opinion? It was enough to communicate everything you needed to know in a game. Sure it looked like crap, but graphics are the devil, and we wouldn't need even a PS1 to play those sorts of games, so much cheaper for everyone.
BTW, I have 16xAA on the bush I died next to. There ain't a lot of difference between AA on a low setting and a high one. It's just fun to turn it up when I can.
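The diminishing returns there show up on paper too. Here's a rough sketch of what an MSAA colour buffer alone costs at 1080p - the 4 bytes per sample figure and the decision to ignore depth buffers and compression are my simplifications:

```python
# Rough MSAA colour-buffer cost at 1920x1080, assuming 4 bytes per
# sample (RGBA8) and ignoring depth buffers, compression, etc.
# Each AA step doubles the memory (and bandwidth) bill, while the
# visible improvement shrinks each time.
width, height, bytes_per_sample = 1920, 1080, 4
for samples in (1, 2, 4, 8, 16):
    mb = width * height * samples * bytes_per_sample / 2**20
    print(f"{samples:>2}x MSAA: ~{mb:5.1f} MB")
```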
What we need new graphics with the new console generation for isn't that bush, it's all games. Games like Skyrim, where there was so much potential, but the game honestly looked pretty shit by default. That could have been fixed with better graphics: more detailed textures, something like the FXAA shader injector, more detailed models, longer and more detailed view distances - the works. Skyrim tweaked looks 10x better than Skyrim default, and it's not all that much hard work. A lot of the time you just need a reasonable GPU and a decent amount of RAM to run the 32 texture packs you install.
We need it for wider FoVs and longer view distances - things that affect gameplay too.
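The FoV point is just geometry, for what it's worth. A quick sketch, assuming the common Hor+ convention (vertical FoV fixed, horizontal FoV widening with the aspect ratio) - that convention is my assumption for the example, and games differ:

```python
import math

# Hor+ scaling: hold vertical FoV fixed and derive the horizontal
# FoV from the aspect ratio. Convention assumed for the example.
def horizontal_fov(vertical_fov_deg, aspect):
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * aspect))

for name, aspect in (("4:3", 4 / 3), ("16:9", 16 / 9), ("21:9", 21 / 9)):
    print(f"{name:>5}: ~{horizontal_fov(60, aspect):.0f} deg horizontal")
```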
In all honesty, if you want to stay behind and not advance games with better hardware, stay on the Xbox 360 and PS3. Don't buy their successors. There are people that want things to move forward though, so let us have that, and you can deal with the lower quality games you get.
TL;DR
Some people will never be satisfied, no matter how reasonable or outlandish their desires may be.
And there is something wrong with wanting more? I'm sure you'd like more pay, or a better car, or a better house. There will be something you want to be better - I guarantee it. This is no different.
It is normal and good to want better. To want worse is stupid. To always want better is to always look to the future and push things forward. To want worse is to look back with nostalgia and try to throw things back.
As I say to everyone - those people who wanted BF3 to be BF2 reskinned, the people who don't want a new console generation, the people who think old games are better than new ones - go play BF2/on your last gen console/your old games, and ignore the new developments. If, conversely, you want to play these new games and systems, don't complain that things were better earlier, as apparently they weren't. No-one is forcing you to buy the next console generation. There is no gun to your head. If you don't want it, don't buy it. Will you get left behind? Yes. But IMO that's better than dragging people back because you don't want to go forward.