I'd love for you to show me where Microsoft proclaimed it as a "mystical cure-all". Microsoft has said all along that it would be useful for latency-insensitive computation only.vallorn said:Welcome back UnnDunn. Now onto the real reply, Sony are trying to make it work too. The only difference is that Microsoft proclaim it as a mystical cure-all while Sony are giving us honest facts about what it can and cannot do.
I just think of this every time I read a Microsoft article about their Xbox One.Frostbite3789 said:Anytime I hear the phrase ~*the cloud*~ during a press conference or anything, my brain immediately replaces it with "GODDAMN WIZARDS" because it's essentially the same.
The Dwarf Fortress game is a bad example. Let's say you attack a giant boar (I haven't played much Dwarf Fortress). The boar would either react slowly (while the game waits on the server to process the AI) or react like any other video game enemy and then act very intelligently until the server-processed data was no longer relevant. The speed of an internal interconnect like PCIe 2.0 is 8 GIGABYTES/second. The speed between RAM and CPU (on 1600MHz DDR3) is about 16 GB/s. The data moved around inside the CPU is even faster. There is simply too much data to send to a server (it would be at the very least several megabytes) to be practical in anything other than the slowest grand strategy game (which no one would pay for a server for anyway).Scars Unseen said:I would say that depends on the type of game. FPS? Not a good idea. Turn based strategy? Much more plausible, though one wonders why you would need to. I think my Oblivion example is a good happy middle for this sort of thing. Don't use the cloud to handle the stuff happening on screen, but rather to calculate AI and simulation stuff that you can't see, yet can see the effects of.shameduser said:The latency from console to server is way too long to be of any use to the local hardware. In the time it takes the console to send the data it needs processed to the server, have the server process it and send it back, the console could have done it much, much faster and way more reliably. Also you would need enormous speed and bandwidth. The internal connections between the CPU, GPU and RAM all measure in gigabytes per second whereas internet speed is measured in megabits per second. The two units are several orders of magnitude apart. Cloud computing to offload stuff like graphics, AI or physics is so impractical it makes no sense to even consider it.
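The gap between internal buses and a home internet line can be sketched with rough arithmetic, using the approximate figures from the posts above (exact numbers vary by hardware; these are illustrative assumptions, not measurements):

```python
# Rough orders-of-magnitude comparison using the thread's approximate figures.
# All values converted to gigabits per second for an apples-to-apples view.
pcie2_x16_gbit = 8 * 8      # ~8 GB/s internal interconnect -> 64 Gbit/s
ram_bus_gbit = 16 * 8       # ~16 GB/s CPU<->RAM bandwidth  -> 128 Gbit/s
internet_gbit = 20 / 1000   # a good 20 Mbit/s home line    -> 0.02 Gbit/s

# The internal bus is thousands of times faster than the internet link.
ratio = pcie2_x16_gbit / internet_gbit
print(f"internal bus is ~{ratio:,.0f}x faster than the internet link")
# -> internal bus is ~3,200x faster than the internet link
```

Even against an unusually fast home connection, the gap is three to four orders of magnitude, which is the core of the argument against shipping raw simulation data to a server.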
Imagine a Dwarf Fortress game that had modern graphics (blasphemy, yes I know) without having to sacrifice its exhaustive detail in simulation. Some very few might have systems beefy enough to handle that, but most PCs (and all consoles, if you were to somehow port the game) would need some of the work handled off-client. Again, that wouldn't have to mean the cloud; a networked Raspberry Pi would likely be enough to augment the processing load. But off-client processing could have a place in the gaming world.
Just not the place Microsoft wants you to think it does.
No... You'd need roughly the same amount of data that a multiplayer client needs to be able to render a player's point of view for most AI - (in some cases, AI and human player code is interchangeable). That's a few hundred kilobits a second at most if the data stream is well designed.nathan-dts said:Too slow. You have AI partially thrown into the cloud then you're going to be uploading all of the variables and downloading all of the process information. Wouldn't work cohesively; internet is too slow.Abomination said:My best examples of seeing how it would work is when I play on a 32 person server then load up my own 31 bot server of some game.
The 32 person server ran perfectly when I played on it but the 31 bot server running on my own machine ran at about 60% FPS. Clearly AI requires processing power and if that can be off-loaded to another location it can cause the immediate machine to run faster.
I, however, do not like the idea of my computer relying on another computer for a single player game. That's all Cloud computing is, making your single player game a multiplayer game without multiple players. We had a really shitty version of that attempted recently, it was called Sim City.
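The "few hundred kilobits a second at most" figure mentioned a couple of posts up can be sanity-checked with a back-of-envelope estimate; the entity count, per-entity payload, and snapshot rate below are assumed, multiplayer-typical values, not measurements:

```python
# Hypothetical state-stream estimate for off-client AI (assumed values).
entities_visible = 32    # entities the remote AI must "see"
bytes_per_entity = 24    # position, velocity, state flags (assumed)
snapshots_per_sec = 20   # common multiplayer snapshot rate

kbit_per_sec = entities_visible * bytes_per_entity * snapshots_per_sec * 8 / 1000
print(f"~{kbit_per_sec:.0f} kbit/s")  # -> ~123 kbit/s
```

Under those assumptions the stream fits comfortably inside a broadband upload, which is why latency, not bandwidth, is the stronger objection to off-client AI.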
I wonder how much a company will be able to lie and say "Cloud Powered" when really next to nothing is cloud powered and the game is just using another form of invasive DRM on the XBone, with your always-on Kinect.
Tinfoil hat, sure, but I've learned you give these corporations an inch, they'll piss all over it then try and feed it to you.
If you're offloading anything that the player can see to 'the cloud', then those things will always be several tenths of a second behind what the player is actually doing. I can see no way that could be annoying at all. Even on my 20 Mbit/s connection an average ping to a server is 30-50ms; double that, add in processing time and you're north of a tenth of a second even on a faster-than-average connection. It's a very poor application for cloud computing.subtlefuge said:I don't see any reason why 2-3 years down the line you wouldn't be able to offload some basic graphical tasks like shadows or reflections to cloud computing.
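The latency arithmetic in that post, spelled out; the server-side processing time is an assumed figure, and the doubling follows the post's own reasoning:

```python
# Round-trip budget for offloading visible work, per the post's numbers.
ping_ms = 40         # midpoint of the quoted 30-50 ms range
processing_ms = 30   # assumed server-side compute time

total_ms = 2 * ping_ms + processing_ms  # the post doubles its ping figure
print(total_ms)  # -> 110, i.e. north of a tenth of a second
```

At 60 frames per second a frame lasts about 16.7 ms, so a 110 ms round trip means any visible cloud-computed result lags the player by six or seven frames.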
I prefer https://github.com/panicsteve/cloud-to-buttFrostbite3789 said:Anytime I hear the phrase ~*the cloud*~ during a press conference or anything, my brain immediately replaces it with "GODDAMN WIZARDS" because it's essentially the same.
And to have an extra layer of DRM.Kross said:Of course you can do many more things when your resources are flexible enough to dedicate to an individual player on demand, but that's really just saving them effort and money. Which is nice when you want to offer more features.
Ah, okay. I just saw his title and thought he was a senior person. It's good to have him so in touch with the real world.nathan-dts said:-snip-
I was talking in hyperbole for entertainment purposes.UnnDunn said:I'd love for you to show me where Microsoft proclaimed it as a "mystical cure-all". Microsoft has said all along that it would be useful for latency-insensitive computation only.vallorn said:Welcome back UnnDunn. Now onto the real reply, Sony are trying to make it work too. The only difference is that Microsoft proclaim it as a mystical cure-all while Sony are giving us honest facts about what it can and cannot do.
Do you have any proof of this? And that's not quite how cloud computing works, UnnDunn. The data to be computed has to be sent through a high-speed internet line or you get weird lag in your singleplayer game. And besides, it won't make that much of a difference if it can only do "latency-insensitive computation".Sony's "attempt" at cloud computation is nowhere near as seamless or comprehensive as Microsoft's. With Microsoft's solution, developers simply do not have to worry about servers or capacity or even cost. Those things are handled by the platform. All developers have to worry about is how best to use this powerful extra CPU with tons of RAM they have access to.
Again: do you have proof that Sony hasn't got a cloud infrastructure for devs, or are you just pulling things out of your arse? Mark Cerny gave us a pretty comprehensive interview where he highlighted the good and bad points of cloud technology, so unlike MS, who just talk about the positives constantly (even when they're untrue), Sony is actually treating us like adults.Sony has nothing like that, so of course Mark Cerny is going to downplay it. That said, developers can come up with their own implementations of the technology for use on PS4, but they have to bear the costs of developing and maintaining it, and they'll have to run to a company like Microsoft to get a decent cloud infrastructure, because Sony certainly doesn't have one.
You're forgetting that the Xbox was the last console released in its generation and was absolutely crushed by its competition.Bottom line: this is like when Microsoft decided to put an Ethernet port, real-time Dolby Digital and a hard drive in every original Xbox: back then, everyone criticized them for making the box too expensive, saying things like "no-one has broadband" and "give us memory cards". But laying that groundwork in the beginning gave Xbox the ability to do things no-one else could match. The same thing is true today with baked-in cloud processing and built-in NUI.
http://www.develop-online.net/news/44318/Microsoft-Cloud-makes-Xbox-One-four-times-more-powerfulUnnDunn said:I'd love for you to show me where Microsoft proclaimed it as a "mystical cure-all". Microsoft has said all along that it would be useful for latency-insensitive computation only.
Do you read more than the headline, ever?Infernal Lawyer said:http://www.develop-online.net/news/44318/Microsoft-Cloud-makes-Xbox-One-four-times-more-powerfulUnnDunn said:I'd love for you to show me where Microsoft proclaimed it as a "mystical cure-all". Microsoft has said all along that it would be useful for latency-insensitive computation only.
Four times as powerful. FOUR FUCKING TIMES MORE POWER WHEN IT'S ONLINE. Microsoft has been claiming that for quite a while now, and last time I checked, saying you multiply your console's power with 'da internets' counts as claiming cloud computing is a 'magic bullet' to gaming.
Read it and comprehend it. Then read it again. And one more time for good measure."We're provisioning for developers for every physical Xbox One we build, we're provisioning the CPU and storage equivalent of three Xbox Ones on the cloud," he said. "We're doing that flat out so that any game developer can assume that there's roughly three times the resources immediately available to their game, so they can build bigger, persistent levels that are more inclusive for players. They can do that out of the gate."
Theoretically, it can be done, but it would require a large amount of bandwidth and throughput, and even if it were done the resulting improvement would be MINUSCULE since most rendering effects provide diminishing returns.Vivi22 said:I assume by some you mean Microsoft because I haven't seen anyone else stupid enough to make that claim and actually expect people to believe it.Cognimancer said:Some have advertised that the Cloud will be able to offload processing power in handling game elements like lighting, physics, and even AI.
Or it could just be a pot shot at Microsoft.PoolCleaningRobot said:Not sure if Mark Cerny is a legitimately cool guy...
... Or he's just been browsing threads to find out what we've been complaining about the most
Regardless, at least he's not telling us bullshit. Streaming and cloud services can give us cool things like cloud saves and streaming games instantly. It'll be useful for things like demoing games, because who has the patience to download a game you're going to test for 15 minutes? Microsoft's calculation magic has already been proven physically impossible because of bandwidth.
"PCs"* are more likely to kill consoles long before Smart TVs will at the rate things are currently going.Sleekit said:it's a moot point tho as tbth i think Smart TVs (and also potentially "game streaming") are probably gonna kill stand alone consoles in a generation or two...