The_Kodu said:
Except they need to get that money back somehow. At present they're selling hardware at a fair loss per unit, so they have high R&D costs and then don't make money off the hardware either.
I know it's a cliché statement, but "they're a business, they have to make money somehow".
Of course they do. But that cost was incurred before the console launched (before it was even mass manufactured, actually) and thus would not show up on this year's financial accounts, meaning the 400 million loss does not include research and development. They can't earn those costs back if they can't cover their operating costs to begin with.
Yes, businesses need to make money somehow, but some businesses fail to make money, and if they fail for long enough they go bankrupt. In fact, over 90% of new businesses go bankrupt within a year. This is normal in a capitalist society. It's just that we've gotten so used to the "too big to fail" idea that we think something like Microsoft is infallible.
That's kind of the thing: the custom APU has to be developed and the arrangement of the parts has to be designed.
Yes, the APU is custom, and we know MS paid 1 billion for its design (supposedly splitting the cost with AMD, though rumor has it MS paid the whole price because AMD would not take an unprofitable deal; the margins are so low as it is that Nvidia flat out refused to work for so little).
The manufacturing and assembly afterwards isn't that hard, though. An APU isn't something unique (like the PS3's Cell processor) and has in fact been used in mobile devices for a while. The rest is standard x86 architecture, so there's nothing unique or costly about that.
And Windows 8 easily eats 2 GB of RAM by itself, and most likely, unlike on PCs, part of the reserved OS HDD space is essentially virtual RAM.
I can't speak for Windows 8 so much, but Windows 7 uses less than 1 GB of RAM and will in fact lower its consumption if you run short. You CAN play modern games on a 1 GB system with Windows 7, and while it isn't ideal, they are playable. Real-life tests actually show that beyond 4 GB there is no noticeable improvement even in RAM-demanding games like Skyrim or Crysis.
Page-file virtual RAM exists, but it has pretty much fallen into disuse as the amount of RAM in PCs became enough to hold everything outright. The Xbox hard drive reservation is not a page file, though; it's for the OS and OS "updates", essentially setting space aside before it's needed.
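(If anyone wants to see the split on their own machine, here's a minimal sketch, assuming a Windows PC and a C++ compiler. It just calls the standard Win32 GlobalMemoryStatusEx and prints physical RAM versus the page-file-backed commit limit; nothing Xbox-specific about it.)

```cpp
#include <windows.h>
#include <cstdio>

int main() {
    MEMORYSTATUSEX status{};
    status.dwLength = sizeof(status);
    if (!GlobalMemoryStatusEx(&status)) {
        std::fprintf(stderr, "GlobalMemoryStatusEx failed\n");
        return 1;
    }
    const double gb = 1024.0 * 1024.0 * 1024.0;
    // Physical RAM installed and how much of it is free right now.
    std::printf("Physical RAM : %.2f GB total, %.2f GB available\n",
                status.ullTotalPhys / gb, status.ullAvailPhys / gb);
    // The "page file" figures cover RAM plus the page file on disk,
    // i.e. the commit limit Windows can back with virtual memory.
    std::printf("Commit limit : %.2f GB total, %.2f GB available\n",
                status.ullTotalPageFile / gb, status.ullAvailPageFile / gb);
    return 0;
}
```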
So in this case consoles are not better but in fact worse at giving all their resources to the game.
Consoles run games and are designed that way. Because of the more fixed architecture, developers can optimize their games far better for the systems.
The x86 architecture that the new consoles use has been in PCs for DECADES. And while alternative architectures exist, they're really only used in servers or by the build-it-yourself enthusiast crowd, and most people have never even seen one.
It has nothing to do with optimization, other than that any architecture-based optimization done for a console now applies automatically to PC as well. What you are more likely referring to is fixed power, which means you can develop knowing exactly how much processing power the machine will have, etc. That's useful when you need to set limits for the game, but it's unnecessary now that coding is done against high-level APIs (think DirectX, OpenGL), which don't change depending on power, and simple graphics settings solve the scaling problem.
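To be concrete about what "graphics settings solve the scaling problem" means: the game talks to one API and just exposes knobs, so the same code runs on weak and strong hardware. Here's a rough sketch of the idea; the preset names and numbers are made up for illustration, not taken from any real engine:

```cpp
#include <cstdio>
#include <string>

// Hypothetical quality presets a PC game might expose in its options menu.
struct GraphicsSettings {
    int renderWidth;
    int renderHeight;
    int shadowResolution;
    float drawDistance;
};

GraphicsSettings presetFor(const std::string& name) {
    // Same renderer, same API calls; only the workload changes per preset.
    if (name == "low")    return {1280, 720,  512,  500.0f};
    if (name == "medium") return {1600, 900,  1024, 1000.0f};
    return                       {1920, 1080, 2048, 2000.0f}; // "high"
}

int main() {
    // A console ships with one fixed preset; a PC lets the user pick one
    // that matches whatever hardware they happen to have.
    GraphicsSettings s = presetFor("medium");
    std::printf("Rendering at %dx%d, shadows %d, draw distance %.0f\n",
                s.renderWidth, s.renderHeight, s.shadowResolution, s.drawDistance);
    return 0;
}
```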
To use the Xbox One as an example, it has a 4-core 1.75 GHz CPU, yet it's running games better than gaming computers with those kinds of specs, because it can dedicate its power to gaming and is designed for gaming.
The Xbox One's CPU is an underpowered mobile-class part: eight AMD Jaguar cores at 1.75 GHz (two quad-core modules), not four, with two of them reserved for the OS, so games get six. And unlike a hyper-threaded desktop chip, those are all physical cores.
A gaming PC with the same processor would run games at the same level or better. But it's largely beside the point, since CPUs are only secondary to GPUs for performance; you really don't need a powerful CPU to run games.
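For what it's worth, a PC exposes the same kind of information to any program; here's a tiny sketch in standard C++ (no vendor-specific calls) that prints how many logical cores the OS makes available. On a console, a fixed number of those are simply walled off for the OS instead of being shared dynamically.

```cpp
#include <cstdio>
#include <thread>

int main() {
    // Logical processors the OS exposes to this program. Without SMT/
    // hyper-threading this equals the physical core count; with SMT it is
    // typically double. May report 0 if the value cannot be determined.
    unsigned cores = std::thread::hardware_concurrency();
    std::printf("Logical cores visible to this process: %u\n", cores);
    // On a PC, the OS and background tasks share these cores dynamically;
    // a console instead reserves a fixed few for its OS up front.
    return 0;
}
```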
As far as GPUs go, the Xbox One seems to be underperforming. On paper it has specs similar to a high-end GPU, but in practice we see it performing worse than even cheaper models of the same generation. There is speculation that this handicap comes from the OS and/or Kinect eating into performance.
So no, consoles in fact perform worse than same-spec PCs.
Can a Windows 95 computer play games better than an Xbox One?
Then I think the reason is pretty clear: games consoles are designed to play games.
It's only the more recent high-end tablets that have actually been able to play games close to maybe early last gen, and they're what, $1,000 now.
If hardware manufacturers made drivers that worked on Windows 95, we could test that, but since pretty much nobody uses Windows 95 anymore there's no point in making drivers, so we can't. We can test it on modern Windows, though, and it does play games better.
Game consoles should be designed to play games. As it is, modern ones are not.
I'm talking about GPU power, not the game selection on tablets. The Tegra K1 and its successors outperform those previous-gen consoles graphically. The reason you will hardly see better graphics, though, is resolution: these devices run at 1080p almost exclusively, something current consoles fail at.
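To put rough numbers on why resolution is where the GPU budget goes, here's the pixels-per-frame arithmetic (plain C++, nothing console-specific):

```cpp
#include <cstdio>

int main() {
    // Pixels the GPU has to shade every frame at common render resolutions.
    const long p720  = 1280L * 720;   //   921,600
    const long p900  = 1600L * 900;   // 1,440,000
    const long p1080 = 1920L * 1080;  // 2,073,600
    std::printf("720p : %ld pixels per frame\n", p720);
    std::printf("900p : %ld pixels per frame\n", p900);
    std::printf("1080p: %ld pixels per frame (%.2fx the work of 720p)\n",
                p1080, (double)p1080 / p720);
    return 0;
}
```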
Shadow-Phoenix said:
I mean, it's like those who think Linux is the future and should be the only OS. Not everyone wants that, and "sod off" isn't an excuse; the market exists because people want something different.
That's ironic, considering Linux is actually hundreds of OSes (let's ignore the obscure forks, because honestly, who uses them) that all manage to work with the same files and programs. There are Linuxes that look identical to Windows, and there are ones you won't even recognize. It can never be the "only" OS, because Linux isn't one OS to begin with.