XBone and cloud computing: why is nobody talking about this?


Zeh Don

New member
Jul 27, 2008
486
0
0
Requia said:
The claim that cloud computing won't work for gaming because of latency is, frankly, nonsense. If latency were a barrier to gaming, online multiplayer wouldn't work either...
Multiplayer gaming, such as Counter-Strike or Battlefield, throws around a few hundred kilobytes of compressed data. Even then, you're still getting 30-50ms of latency on a good connection.
Your system translates that data into the visual presentation you see on your end. Textures, animations, physics and lighting are all local - the only things you're pulling from the server are location variables, animation sequence numbers and damage numbers.
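For a sense of scale, here's a toy sketch in Python of that kind of per-entity update; the field layout is something I've invented for illustration, not taken from any real engine:

[code]
import struct

# Hypothetical per-entity snapshot: entity id, x/y/z position,
# an animation sequence number and a damage value.
SNAPSHOT_FORMAT = "<I3fHh"  # little-endian: uint32, 3 floats, uint16, int16

def pack_snapshot(entity_id, x, y, z, anim_seq, damage):
    """Pack one entity update into a fixed-size binary blob."""
    return struct.pack(SNAPSHOT_FORMAT, entity_id, x, y, z, anim_seq, damage)

packet = pack_snapshot(42, 10.5, 0.0, -3.25, 7, 35)
print(len(packet), "bytes per entity")  # 20 bytes
[/code]

Twenty bytes per entity, sixty times a second, for a hundred entities is still only around 120KB/s - which is why online multiplayer works at all.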

Microsoft are talking about increasing the load on the server and spitting out more numbers to your machine. Bandwidth is the limiting factor here; instead of a few hundred kilobytes, we're entering megabyte territory. And so latency will climb past 30-50ms, dramatically and quickly.
Considering most OnLive games are barely playable at 30-50ms - anything twitch-related is simply out of the question - you're talking about introducing 50+ms into a single-player game... to do what?

Better AI? At 50+ms, those calculations are worthless for any real-time game.
Better lighting? A mere four-fold computational increase for hard-baked static lighting is worthless at the levels of fidelity we're talking about; you'd need upwards of a ten-fold increase to see something worth the effort. Real-time lighting is out of the question entirely.

As demonstrated by SimCity, which actually performs no - NO - server-side calculations, there are no benefits of any kind to the end user for distributed processing of this type. Zero.
 

Bad Jim

New member
Nov 1, 2010
1,763
0
0
omegaweopon said:
The Xbone, by default, uses standby mode as its form of shutting down, thus allowing the processor to run its calculations. And really, unless you used a voltmeter on the thing, how would you know?
If I had it in my bedroom and the fans were whirring crazily while I was trying to get to sleep I'd unplug it. And if Microsoft threatened to brick it, I'd let them. Same applies while I'm watching TV.
 

Requia

New member
Apr 4, 2013
703
0
0
Zeh Don said:
Requia said:
The claim that cloud computing won't work for gaming because of latency is, frankly, nonsense. If latency were a barrier to gaming, online multiplayer wouldn't work either...
Multiplayer gaming, such as Counter-Strike or Battlefield, throws around a few hundred kilobytes of compressed data. Even then, you're still getting 30-50ms of latency on a good connection.
Your system translates that data into the visual presentation you see on your end. Textures, animations, physics and lighting are all local - the only things you're pulling from the server are location variables, animation sequence numbers and damage numbers.

Microsoft are talking about increasing the load on the server and spitting out more numbers to your machine. Bandwidth is the limiting factor here; instead of a few hundred kilobytes, we're entering megabyte territory. And so latency will climb past 30-50ms, dramatically and quickly.
Considering most OnLive games are barely playable at 30-50ms - anything twitch-related is simply out of the question - you're talking about introducing 50+ms into a single-player game... to do what?

Better AI? At 50+ms, those calculations are worthless for any real-time game.
Better lighting? A mere four-fold computational increase for hard-baked static lighting is worthless at the levels of fidelity we're talking about; you'd need upwards of a ten-fold increase to see something worth the effort. Real-time lighting is out of the question entirely.

As demonstrated by SimCity, which actually performs no - NO - server-side calculations, there are no benefits of any kind to the end user for distributed processing of this type. Zero.
What the fuck does SimCity have to do with it? There are tens of thousands of games that run calculations server-side, or on the machines of players other than yourself.

No, 50ms lag is *not* too slow, because if it were you couldn't play online multiplayer, which has the lag of *both* players taking effect; average latencies will be about half that with server-side AI.

Latency has nothing to do with total data transferred.

Minecraft, when played online, runs its lighting engine server-side, so yes, you can do lighting server-side (though I'm not sure why you'd want to; Minecraft only does it because the server needs to know the light level for mob behavior).
 

Bad Jim

New member
Nov 1, 2010
1,763
0
0
PoolCleaningRobot said:
Microsoft is completely full of shit. 4 times the processing power? That's ridiculous. They'll never reach that. It doesn't even make sense. Why pay to maintain these servers when they could have made a console with better processing power?
Probably because most consoles aren't used 24/7. They're used maybe a couple of hours a day. Cloud servers will run 24/7. That's a lot more efficient. Also, I think they're planning to phase them in after several years as the Xbone starts showing its age. In 2020, the rendering power of four XBones will be very cheap.

Of course it's still doing something that does not need an expensive box. The Onlive box is much cheaper than the XBone can possibly be, and you can play Onlive games on a crappy laptop, tablet, phone or smart TV that you probably already own. It's just continuing the running gag of letting you watch TV and various other things that you don't need a new console for. Only this time they're hinting that the actual gaming hardware in the XBone will be redundant.
 

PoolCleaningRobot

New member
Mar 18, 2012
1,237
0
0
Bad Jim said:
PoolCleaningRobot said:
Microsoft is completely full of shit. 4 times the processing power? That's ridiculous. They'll never reach that. It doesn't even make sense. Why pay to maintain these servers when they could have made a console with better processing power?
Probably because most consoles aren't used 24/7. They're used maybe a couple of hours a day. Cloud servers will run 24/7. That's a lot more efficient.
Can you elaborate, please? I'm not sure what you're getting at. The electricity to run servers non-stop costs more than running a single console.

Bad Jim said:
Also, I think they're planning to phase them in after several years as the Xbone starts showing its age. In 2020, the rendering power of four XBones will be very cheap.

Even if the processing power of 4 XBones becomes cheap, they still can't physically send you that data. The average internet connection can send about 1 megabyte a second. The fastest one possible sends like 27 megabytes, I believe. The Xbox processor can move around 60 gigs a second. The data you can get over the internet is nothing by comparison.
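To put rough numbers on it (back-of-the-envelope, using the figures I just quoted rather than anything measured):

[code]
# All figures are the rough ones quoted above, not measurements.
internet_avg = 1 * 10**6       # ~1 megabyte/s, a typical connection
internet_fast = 27 * 10**6     # ~27 megabytes/s, about the fastest around
xbox_memory_bw = 60 * 10**9    # ~60 gigabytes/s, the console's memory bandwidth

print(xbox_memory_bw // internet_avg)   # 60000: the average link is 60,000x slower
print(xbox_memory_bw // internet_fast)  # 2222: even the fastest link is ~2,200x slower
[/code]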

But I agree with ya. The Xbone isn't doing anything an Apple or Android device or a PC can't do better at a fraction of the cost and size.
 

UnnDunn

New member
Aug 15, 2006
237
0
0
Zeh Don said:
Requia said:
What the fuck does SimCity have to do with it? There are tens of thousands of games that run calculations server-side, or on the machines of players other than yourself.

No, 50ms lag is *not* too slow, because if it were you couldn't play online multiplayer, which has the lag of *both* players taking effect; average latencies will be about half that with server-side AI.

Latency has nothing to do with total data transferred.

Minecraft, when played online, runs its lighting engine server-side, so yes, you can do lighting server-side (though I'm not sure why you'd want to; Minecraft only does it because the server needs to know the light level for mob behavior).
Is this a serious post? You actually mean this and aren't trolling? I have to be missing an air of sarcasm. I have to be. No one is this misinformed, uneducated or illiterate. And if they were, they wouldn't be trying to correct people who are well-informed, educated and literate about the topics they're discussing.
No, you must be trolling, or sarcastic. Haha. Good one. You got me :)
I'm sorry, but this post only serves to betray your ignorance, not Requia's. All of his comments are accurate.
 

Atmos Duality

New member
Mar 3, 2010
8,473
0
0
UnnDunn said:
I think you are attributing far too much malice to developers/publishers.
You're free to think what you want.
I think it's naive to implicitly trust these companies with this technology, or at least to not realize what this technology is capable of.

Consider it from their perspective: What sounds better to those in charge?

"We can increase the visual fidelity of our games!" (costs money to develop that asset)
"We can secure our games entirely against piracy and used game arbitrage!" (costs close to nothing to migrate code; secures game forever)

(Or maybe they're just bluffing about how scared they are of piracy each year.)

Windows Azure is not free; the more developers use Azure, the more they have to pay. Using Azure for some sort of hokey always-on DRM system will cost them a lot of money (in terms of Azure compute time) for minimal benefit considering the console already handles perpetual license verification.
And yet the cost of such servers didn't deter Blizzard, Ubisoft and EA from trying to push their own Always-Online schemes. They have shown that they are perfectly willing to engage in this; why stop now, especially with M$ footing at least part of the bill?

More to the point, I think you're exaggerating the processing footprint here; it's not like they would have to host the entire game, just a very small but critical portion of it (like how SimCity keeps the save function and file on The Cloud; most of the game is run on the "client" machine).

The publisher's Azure compute budget is much better spent on functionality that will improve the game experience, rather than duplicating functionality that is already provided.
And yet Microsoft has made that exact redundant functionality available to the publishers.
Microsoft has left it in their hands: We will see how they use it.

I do not trust them and it's up to them to give me a reason to trust them again.
 

Zeh Don

New member
Jul 27, 2008
486
0
0
UnnDunn said:
I'm sorry, but this post only serves to betray your ignorance, not Requia's. All of his comments are accurate.
I see. Well, follow along please:

Requia said:
What the fuck does Sim City have to do with it?
I mention SimCity because it's the primary example of a major single-player game apparently using "the cloud". And, of course, it didn't actually use "the cloud" at all. As has already been covered extensively, distributed computation limits the practical applications of "the cloud" in video games to less than what the typical client-server architecture already provides.
Maxis and EA, for example, weren't able to derive any benefit from the system and so implemented none - they simply used it as a marketing buzzword in an attempt to cloak their DRM system. That's because, as I've mentioned, there aren't any real benefits that can be used practically.

Synchronisation issues are a major part of distributed computation - the type of computation that "the cloud" entails - as multiple parts of the program as a whole are computed at different times and slotted together on the end user's machine. Microsoft are attempting to sell this to the mainstream with the promise that it will make your console four times as powerful. This is a lie. There are simply no applications under which this could be true.

Diablo III, the other primary example, also isn't "cloud-based gaming". With Diablo III, your client sends small packets of information to the server, such as movement locations, which in turn merely validates them. A hiccup in the internet connection and the server disallows an update tick, causing the prominent rubber-banding issues. The only things the server truly handles are loot-drop calculations and damage calculations - and that is for the deployment of server-side hot-fixes, which to date have been used simply to curb farming techniques deemed "too efficient". Some 99% of all calculations run on your machine, with mere kilobytes of data coming from the server.
Even with these pitiful requirements, Blizzard's servers imploded upon release under the strain of the sheer volume of simultaneous connections. And now, with less than 10% of the entire player base still playing, constant rubber banding, lag issues and disconnects are still quite common. This is because of the sync issues I've mentioned above - and this example is literally using kilobytes. Imagine the issues when dealing with megabytes.
Increasing the speed of your internet connection increases the uses for this technology; however, Microsoft have stated the required speed for an "optimal" Xbox One experience is a mere 1.5Mbps. That isn't actually fast enough to handle Diablo III, let alone entire frames' worth of drawable data.
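To make the shape of that concrete, here's a toy sketch of the kind of thin validation loop I'm describing - invented names and numbers, certainly not Blizzard's actual code:

[code]
# Hypothetical server-side tick: the server doesn't simulate the world,
# it only sanity-checks small client updates and confirms them.
MAX_SPEED = 7.0  # invented movement cap, in units per tick

def plausible_move(old_pos, new_pos):
    """Accept a client-reported move only if it's physically possible."""
    dx, dy = new_pos[0] - old_pos[0], new_pos[1] - old_pos[1]
    return (dx * dx + dy * dy) ** 0.5 <= MAX_SPEED

def server_tick(player, update):
    if plausible_move(player["pos"], update["pos"]):
        player["pos"] = update["pos"]   # confirm: a few bytes back to the client
    # else: reject, and the client snaps back - that's your rubber banding
    return player["pos"]

player = {"pos": (0.0, 0.0)}
print(server_tick(player, {"pos": (3.0, 4.0)}))    # plausible: accepted
print(server_tick(player, {"pos": (90.0, 90.0)}))  # teleport: rejected
[/code]

If the confirmation for a tick never arrives, the client rolls back to the last confirmed position - hence the rubber band.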

Requia said:
There are tens of thousands of games that run calculations server side, or on the machines of players other than yourself.
You're mistaking the client/server model for a cloud-based model. World of Warcraft is not "cloud-based" gaming. Nor is Planetside 2, though it is certainly closer than any other mainstream MMO.

The difference is in what the "cloud" is doing. With MMOs, like World of Warcraft, the player performs an action, and that action is sent to the server. The server validates it, updates the necessary information like enemy HP, and sends it back to all of the players whose machines request it. The player's computer then calculates animation frames, physics, lighting, particle effects, etc., using a deterministic algorithm, and draws it all out onto the screen, less the time generated by connection latency; then the player performs another action and the cycle repeats.
The connection latency is hidden by building a degree of latency into the game itself. World of Warcraft used this to great effect. Of course, it is impossible in some games - first-person shooters most notably.
The deterministic algorithm ensures that everything is synced up visually. Google "Deterministic Engine" for a better understanding of how this works, as I've only had dealings in passing with these types of calculations. However, I still understand the concept - which is to perform calculations that produce the same result every time. Braid, for example, uses this as a part of its "Rewind" function (most games do).
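A minimal illustration of the concept (nothing to do with Braid's actual code): seed everything, fix the timestep, and replaying the same inputs reproduces the same state exactly:

[code]
import random

def simulate(inputs, seed=1234, steps=100):
    """Fixed-timestep toy simulation: same seed + same inputs -> same result."""
    rng = random.Random(seed)  # every bit of "randomness" flows from one seed
    state = 0
    for step in range(steps):
        state += inputs.get(step, 0) + rng.randint(0, 3)
    return state

inputs = {3: 5, 40: -2, 77: 9}
assert simulate(inputs) == simulate(inputs)  # a replay is bit-identical
[/code]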

With a proper "cloud" game, the cloud isn't just keeping track of your position, adding damage numbers together or trafficking chat messages. It's performing some of the grunt work of the game itself, decreasing the load on the client machine as a result. That's really the key difference.
Things like lighting calculations, animation frames, particle effects, physics; the server is doing the work that the client itself would normally do. As a result, the client machine is freed from performing those calculations and is able to perform additional tasks, granting the program a greater resource pool.
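In sketch form, the difference looks something like this - every name here is invented to show the shape of the idea, not a real API:

[code]
# Entirely hypothetical sketch of offloading grunt work to a server.
from concurrent.futures import ThreadPoolExecutor
import time

def compute_physics(scene):
    time.sleep(0.002)               # stand-in for real client-side work
    return f"physics({scene})"

def compute_lighting(scene):
    time.sleep(0.010)               # the expensive job we'd like to offload
    return f"lighting({scene})"

def frame_cloud_assisted(scene, cloud):
    job = cloud.submit(compute_lighting, scene)  # the "cloud" does the grunt work
    physics = compute_physics(scene)             # client overlaps its own work
    return physics, job.result()                 # .result() pays the round-trip

with ThreadPoolExecutor() as fake_cloud:  # a local thread pool standing in for Azure
    print(frame_cloud_assisted("level_1", fake_cloud))
[/code]

Note that job.result() is where the whole scheme lives or dies: if the round-trip takes longer than a frame, the "help" arrives too late to draw.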

Diablo III and SimCity do not run better on your machine as a result of their "cloud" features. That's, frankly, the biggest sign.

Requia said:
Minecraft, when played online, runs its lighting engine server-side, so yes, you can do lighting server-side (though I'm not sure why you'd want to; Minecraft only does it because the server needs to know the light level for mob behavior).
Not quite. I can understand your confusion though, so I'll explain.
Minecraft's server is merely updating the current lighting condition of a tile when a player interacts with it. It's literally a scale system - 0 to 15 - that is kept for each cube within the currently loaded area of the game world. This scale number is merely increased or decreased depending on how many light sources are passed on to it.
When you dig a block, you send a message to the server to say "give more light to the blocks I just exposed" and it increases their scale number by one. It doesn't perform any calculations itself; it merely updates those numbers and passes them back to each player when their client machine requests them, which the client then stores locally. The amount of information being passed is still quite small as a result, and the server is doing little more than mere integer-based additions.
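Something like this, in spirit - a toy version; the real engine is more involved:

[code]
# Toy version of the bookkeeping described above: light is just an
# integer (0-15 in Minecraft) stored per block and nudged on changes.
light = {}  # (x, y, z) -> light level

def on_block_removed(exposed_neighbours):
    """Server-side update when a player digs: brighten the exposed blocks."""
    for pos in exposed_neighbours:
        light[pos] = min(15, light.get(pos, 0) + 1)  # plain integer addition

def get_light(pos):
    """Clients request these numbers and cache them locally."""
    return light.get(pos, 0)

on_block_removed([(1, 0, 0), (0, 1, 0)])
print(get_light((1, 0, 0)))  # 1
[/code]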

If Minecraft's server were a "cloud-based lighting" system, it would actually be calculating the light gradients for each vertex of each cube itself (well, Minecraft is voxel-based, I believe, but hopefully you get the idea) and passing that information back to your machine, so that all your machine has to do is draw it onto the screen without any further calculations of any kind. Of course, the amount of information being passed around would be massive, as it would be sent to your machine every single frame. You'd need a fast connection for this to occur imperceptibly - a LAN connection would suffice, but the vast majority of internet connections would fail.

Titanfall, Respawn Entertainment's new game, is said to use the "cloud" for A.I. This is a clever misuse of the term, as it's really just a client/server model handling bots - like Unreal Tournament, Counter-Strike or even Quake 3.
Microsoft are attempting to rebrand everything "online" or "server-based" as merely "the cloud" in order to sell their new machine to the uneducated.
For example, you'll notice that in their press release, they state that your games are stored "on the cloud". They're not. They're stored on Microsoft's currently existing server network. If they were stored "on the cloud" you'd be able to stream them to your console free of charge, exactly like OnLive or Gaikai. But you can't. You have to download them, and then you can play them.

I understand it's confusing, and I probably should've been nicer about it, so I apologise for being rude - it really wasn't called for. I just get angry that Microsoft are lying to people, and people are defending those lies.
I hope that clears it up. If not, let me know what points are still confusing and I'll explain in greater detail.
 

Bad Jim

New member
Nov 1, 2010
1,763
0
0
PoolCleaningRobot said:
Bad Jim said:
PoolCleaningRobot said:
Microsoft is completely full of shit. 4 times the processing power? That's ridiculous. They'll never reach that. It doesn't even make sense. Why pay to maintain these servers when they could have made a console with better processing power?
Probably because most consoles aren't used 24/7. They're used maybe a couple of hours a day. Cloud servers will run 24/7. That's a lot more efficient.
Can you elaborate, please? I'm not sure what you're getting at. The electricity to run servers non-stop costs more than running a single console.
It's not energy-efficient, but it's very efficient in terms of hardware purchased versus processing power delivered. To give your console a GTX 690 they would have to charge you the cost of a GTX 690. But if they have a server with a GTX 690, they can serve many users with that one card, since they won't all be logged in at the same time. So they only have to charge you, say, 1/4 the cost of a GTX 690. And if they have custom computers with, say, 64 GTX 690s in them, those computers won't need 512GB of RAM if they're all running the same 50GB game.
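The arithmetic, with obviously made-up prices:

[code]
# Illustrative numbers only: why shared hardware is cheaper per user.
gpu_cost = 1000.0        # say, one GTX 690
users_per_gpu = 4        # they won't all be logged in at the same time

print(gpu_cost)                  # 1000.0 - a GPU in every console
print(gpu_cost / users_per_gpu)  # 250.0  - one shared GPU, four users
[/code]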

PoolCleaningRobot said:
Even if the processing power of 4 XBones becomes cheap, they still can't physically send you that data. The average internet connection can send about 1 megabyte a second. The fastest one possible sends like 27 megabytes, I believe. The Xbox processor can move around 60 gigs a second. The data you can get over the internet is nothing by comparison.
Well, the fact is that OnLive works, and that is basically what Microsoft is planning to do. It's not really important how much data must be shifted around in order to render an image; what matters is whether the image can be delivered to the user. If you have a decent internet connection, it can. Otherwise, in the words of Sony's favourite Microsoft employee, "deal with it".
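To put rough numbers on "the image can be delivered" (illustrative bitrates - OnLive's real encoder settings may differ):

[code]
# Illustrative: a compressed video stream vs. the raw frames behind it.
width, height, fps = 1280, 720, 30
raw_bps = width * height * 24 * fps   # uncompressed 24-bit frames
h264_bps = 5 * 10**6                  # ~5 Mbps, a plausible stream bitrate

print(raw_bps / 10**6)   # ~663 Mbps of pixels generated server-side
print(h264_bps / 10**6)  # 5 Mbps actually crossing the wire
[/code]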
 

UnnDunn

New member
Aug 15, 2006
237
0
0
Zeh Don said:
UnnDunn said:
I'm sorry, but this post only serves to betray your ignorance, not Requia's. All of his comments are accurate.
I see. Well, follow along please:
You're attempting to describe the Azure feature in terms of current and previous network gaming technology, which is an understandable approach to take, but also inaccurate. Azure does not replace multiplayer game servers.
 

Zeh Don

New member
Jul 27, 2008
486
0
0
UnnDunn said:
You're attempting to describe the Azure feature in terms of current and previous network gaming technology, which is an understandable approach to take, but also inaccurate. Azure does not replace multiplayer game servers.
Sorry, but I disagree.

I described current client/server architecture, and explained how "cloud gaming" is different and how it functions in comparison to client/server. In doing so, I feel I also highlighted why Microsoft's claims are false. I'm happy to reiterate if I wasn't terribly clear on any one point.

The Azure cloud service is a collection of server machines used for distributed computation. That's all "the cloud" is. Each server is powerful enough to run dozens upon dozens of Virtual Machines, trafficking the data-in and data-out feeds for the various services that Microsoft and its partners have set up, thanks to the minimal bandwidth each service uses.

Given that distributed computation is all that "the cloud" refers to, I'm going to have to ask you to explain the vast difference between the Azure service and "current and previous network gaming technology" - a difference that would enable Microsoft to remotely supply "four times" the computational power to each Xbox One, with the desired result: a gaming machine four times as powerful as the Xbox One is when not connected to the cloud.

I look forward to your response.
 

Jenvas1306

New member
May 1, 2012
446
0
0
UnnDunn said:
Here's the post on Game Licensing [http://news.xbox.com/2013/06/license] and the one on Online Connectivity [http://news.xbox.com/2013/06/connected].

With Xbox One you can game offline for up to 24 hours on your primary console, or one hour if you are logged on to a separate console accessing your library. Offline gaming is not possible after these prescribed times until you re-establish a connection, but you can still watch live TV and enjoy Blu-ray and DVD movies.
This. This is the reason I feel compelled to laugh in the face of anyone who buys an XBone.
Most people here have flat-rate internet, so having such a connection is not the problem; it's just really stupid.
A console that needs an internet connection? That should be a feature, never a requirement.
 

Alexandre Lemke

New member
Jul 27, 2011
6
0
0
I tried OnLive, but they would not let me use it, because of my country.

If the Xbox One is really going to be dependent on cloud computing, and the PS4 is not, they are going to lose some markets.
 

Requia

New member
Apr 4, 2013
703
0
0
Dexter111 said:
Requia said:
The claim that cloud computing won't work for gaming because of latency is, frankly, nonsense. If latency were a barrier to gaming, online multiplayer wouldn't work either. Though it is a barrier for some, and it'll fuck up single-player gaming for said people.

It's also worth noting that Minecraft servers are sometimes used to offload system-intensive mods, and to run mods you couldn't normally run, so the principle is sound.
You seem to have a basic misunderstanding of how normal multiplayer games work and what in this instance is being referred to as "cloud".
*giant snip*
What you're describing isn't latency. Latency is the time it takes between the server sending a single packet, and the client receiving it (or vice versa, or more realistically in this case, the sum of both directions). What you've got there are bandwidth limitations, which is a separate (though valid) concern.
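The distinction in one line, ignoring queuing and congestion (a deliberate simplification):

[code]
def transfer_time(num_bytes, latency_s, bandwidth_bps):
    """Simplified model: latency is a fixed cost; bandwidth scales with size."""
    return latency_s + (num_bytes * 8) / bandwidth_bps

# A tiny packet is dominated by latency; a big payload by bandwidth.
print(transfer_time(100, 0.05, 10 * 10**6))        # ~0.05s: almost all latency
print(transfer_time(5 * 10**6, 0.05, 10 * 10**6))  # ~4.05s: almost all bandwidth
[/code]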
 

The White Hunter

Basment Abomination
Oct 19, 2011
3,888
0
0
Forlong said:
My PS3 uses cloud computing and asks me every time if I want to use it. That is my choice. Microsoft wants to take all my choices away from me. That is what's wrong with their approach to cloud computing.
You're thinking of the cloud storage that everything offers these days, and that's pretty useful.

What they're claiming is server-side rendering and such, and the reason nobody is talking about it is that it doesn't work very well unless you stream the whole game; as a result it's basically complete and total bullshit. Most places don't have the bandwidth or connection speeds necessary for it to work adequately, and I highly doubt that Microsoft will invest enough in servers - actual physical servers as opposed to virtual ones - to offer any significant bump in performance.

If they have, more fool them; they'd have been better off just selling a more powerful machine.
 

UnnDunn

New member
Aug 15, 2006
237
0
0
Zeh Don said:
Given that distributed computation is all that "the cloud" refers to, I'm going to have to ask you to explain the vast difference between the Azure service and "current and previous network gaming technology" - a difference that would enable Microsoft to remotely supply "four times" the computational power to each Xbox One, with the desired result: a gaming machine four times as powerful as the Xbox One is when not connected to the cloud.

I look forward to your response.
The difference lies in where the core game state and logic are calculated. In current client/server gaming models (e.g. Diablo III, MMOs, multiplayer FPSs and the like), the core game state and logic are kept on the server, with clients connecting to the server in order to participate in the game session.

With the cloud processing that Microsoft is touting, the core game state is kept on the Xbox One, with the console making requests to online services for various things that the game may or may not need to enhance the experience. This is more akin to how SimCity does it, except what Microsoft is proposing is much more involved.
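In sketch form - invented names, just the shape of the model rather than anything Microsoft has published:

[code]
# Hypothetical: the console owns the game state and merely *asks* the
# cloud for optional extras, unlike a client joining a game server.
import queue

replies = queue.Queue()  # cloud responses arrive whenever they arrive

def update_simulation(state):
    state["tick"] += 1                          # authoritative, always local

def request_enhancement(state):
    replies.put(f"ambient_ai@{state['tick']}")  # stand-in for an async call

def apply_cloud_replies(state):
    while not replies.empty():
        state["extras"].append(replies.get())   # enhance the game, never gate it

state = {"tick": 0, "extras": []}
for _ in range(3):                # three frames of the loop
    update_simulation(state)
    request_enhancement(state)    # skipped entirely when offline
    apply_cloud_replies(state)    # the frame never blocks on the network
print(state)
[/code]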