UnnDunn said:
I'm sorry, but this post only serves to betray your ignorance, not Requia's. All of his comments are accurate.
I see. Well, follow along please:
Requia said:
What the fuck does Sim City have to do with it?
I mention SimCity because it's the primary example of a major single-player game apparently using "the cloud". And, of course,
it didn't actually use "the cloud" at all. As has already been covered extensively, the overheads of distributed computation limit the practical applications of "the cloud" in video games to less than what the typical client-server architecture already provides.
Maxis and EA, for example, weren't able to derive any benefit from the system and so implemented none - they simply used it as a marketing buzz word in an attempt to cloak their DRM system. That's because, as I've mentioned, there aren't any real benefits that can be used practically.
Synchronisation issues are a major part of distributed computation - the type of computation that "the cloud" creates - as multiple parts of the program are computed at different times and then slotted together on the end user's machine. Microsoft are attempting to sell this to the mainstream with the magical claim that it will make your console four times as powerful.
This is a lie. There are simply no applications under which this could be true.
Diablo III, the other primary example, also isn't "cloud based gaming". With Diablo III, your client sends small packets of information to the server, such as movement locations, which in turn merely validates them. A hiccup in the internet connection and the server disallows an update tick - causing the prominent rubber-banding issues. The only things the server truly handles are loot drop calculations and damage calculations - and this is for the deployment of server-side hot-fixes, which to date have been used simply to curb farming techniques that were deemed "too efficient". Some 99% of all calculations are running on your machine, with mere kilobytes of data used from the server.
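For a sense of scale, here's a rough guess at what a single movement packet might look like - this layout is my own invention, not Blizzard's actual format:

```python
import struct

# Rough guess at a movement packet (not Blizzard's real format): a player id,
# an x/y position and a timestamp. The server only has to sanity-check values like these.
packet = struct.pack("<Iffq", 1234, 101.5, 87.25, 1_686_000_000)
print(len(packet), "bytes")   # 20 bytes - even hundreds of these per second is only kilobytes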
Even with these pitiful requirements, Blizzard's servers imploded upon release under the strain of the sheer volume of simultaneous connections. And now, with less than 10% of the entire player base still playing, constant rubber-banding, lag issues and disconnects are still quite common. This is because of the sync issues I've mentioned above - and this example is literally only dealing in kilobytes. Imagine the issues when dealing with megabytes.
Increasing the speed of your internet connection increases the uses for this technology; however, Microsoft have stated the required speed for an "optimal" Xbox One experience is a mere 1.5 Mbps. That struggles even with something as light as Diablo III, let alone entire frames' worth of drawable data.
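To put some back-of-the-envelope numbers on that (the frame size and frame rate are my assumptions, not Microsoft's figures):

```python
# Back-of-the-envelope figures - my assumptions, not Microsoft's numbers.
link_mbps = 1.5                              # stated "optimal" Xbox One speed
bytes_per_sec = link_mbps * 1_000_000 / 8    # ~187,500 bytes per second

frame_bytes = 1920 * 1080 * 4                # one uncompressed 1080p frame, 32-bit colour
frames_per_sec = 60

needed = frame_bytes * frames_per_sec        # ~498 MB of raw drawable data per second
print(f"shortfall: {needed / bytes_per_sec:,.0f}x")   # the link is short by a factor of ~2,650
```

Even with aggressive compression, that gap isn't one you close with a 1.5 Mbps connection.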
Requia said:
There are tens of thousands of games that run calculations server side, or on the machines of players other than yourself.
You're mistaking the client/server model for a cloud based model. World of Warcraft is not "cloud based" gaming. Nor is Planetside 2, though it is certainly closer than any other mainstream MMO.
The difference is in what the "cloud" is doing. With MMOs, like World of Warcraft, the player performs an action, and that action is sent to the server. The server validates it, updates the necessary information like enemy HP, and sends it back to all of the players whose machines request it. The player's computer then calculates animation frames, physics, lighting, particle effects, etc., using a deterministic algorithm and draws it out onto the screen, minus the time lost to connection latency, and then the player performs another action and the cycle repeats.
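A toy version of that loop, with names and numbers of my own invention purely for illustration:

```python
# Toy illustration of the MMO cycle: tiny authoritative packets from the server,
# all of the heavy visual work done on the player's own machine.
def server_validate(action, enemy_hp):
    """Server-side: just validate the action and update authoritative state."""
    if action == "attack" and enemy_hp > 0:
        enemy_hp -= 12                         # authoritative damage result
    return enemy_hp                            # a few bytes sent back to interested clients

def client_present(enemy_hp):
    """Client-side: animation, physics, particles and drawing all happen here."""
    frames = [f"swing_{i}" for i in range(8)]  # stand-in for real animation/physics work
    print(f"enemy HP {enemy_hp}, played {len(frames)} locally-computed frames")

enemy_hp = 100
for action in ["attack", "move", "attack"]:
    enemy_hp = server_validate(action, enemy_hp)   # the only network traffic in the loop
    client_present(enemy_hp)                       # everything visual is computed locally
```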
The connection latency is hidden by building a degree of latency into the game itself. World of Warcraft used this to great effect. Of course, it is impossible in some games - first person shooters most notably.
The deterministic algorithm ensures that everything is synced up visually. Google "Deterministic Engine" for a better understanding of how this works, as I've only had dealings in passing with these types of calculations. However, I still understand the concept - which is to perform calculations that produce the same result every time. Braid, for example, uses this as a part of its "rewind" function (most games do).
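A minimal sketch of the idea, assuming a fixed seed and fixed update order - this isn't Braid's actual code, just the concept:

```python
import random

# Minimal sketch of determinism: same inputs + same seed => identical results,
# which is what lets separate machines (or a rewind/replay system) stay in sync.
def simulate(inputs, seed):
    rng = random.Random(seed)           # seeded RNG, never the wall clock
    x, history = 0.0, []
    for move in inputs:                 # fixed-order, fixed-timestep updates
        x += move + rng.uniform(-0.1, 0.1)
        history.append(x)
    return history

moves = [1, 0, -1, 2]
assert simulate(moves, 42) == simulate(moves, 42)   # replays are bit-identical
```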
With a proper "cloud" game, the cloud isn't just keeping track of your position, adding damage numbers together or trafficking chat messages. It's performing some of the grunt work of the game itself, decreasing the load on the client machine as a result. It's really the key difference.
Things like lighting calculations, animation frames, particle effects, physics; the server is doing the work that the client itself would normally do. As a result, the client machine is freed from performing those calculations and is able to perform additional tasks, granting the program a greater resource pool.
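To make that concrete, here's a hypothetical sketch - the 50 ms round trip and the stand-in "lighting" maths are my assumptions - of what handing per-frame work to a remote machine looks like, and where the catch is:

```python
import time

# Hypothetical sketch of offloading per-frame work to a remote machine.
# The 50 ms round trip and the maths are assumptions, purely to show where the time goes.
def compute_lighting_locally(scene):
    return [v * 0.5 for v in scene]              # stand-in for real lighting work

def compute_lighting_in_cloud(scene, round_trip_s=0.050):
    time.sleep(round_trip_s)                     # network round trip before any result arrives
    return [v * 0.5 for v in scene]              # same work, just done elsewhere

scene = list(range(1000))
start = time.time()
compute_lighting_in_cloud(scene)
elapsed_ms = (time.time() - start) * 1000
print(f"cloud-assisted frame: {elapsed_ms:.0f} ms (a 60 fps frame budget is ~16 ms)")
```

The client really is freed up - but every frame now waits on the network, which is the sync problem from earlier all over again.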
Diablo III and SimCity do not run better on your machine as a result of their "cloud" features. That's, frankly, the biggest sign.
UnnDunn said:
Minecraft, when played online, does its lighting engine server side, so yes, you can do lighting server side (though I'm not sure why you'd want to, Minecraft only does because the server needs to know the light level for mob behavior).
Not quite. I can understand your confusion though, so I'll explain.
Minecraft's server is merely updating the current lighting condition of a tile when a player interacts with it. It's literally a scale system - an integer from 0 to 15 - that is kept for each cube within the currently loaded area of the game world. This scale number is merely increased or decreased depending on how many light sources reach it.
When you dig a block, you send a message to the server to say "give more light to the blocks I just exposed" and it increases their scale number by one. It doesn't perform any rendering calculations itself; it merely updates those numbers and passes them back to each player when their client machine requests them, which it then stores locally. The amount of information being passed is still quite small as a result, and the server is doing little more than integer-based additions.
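A toy version of what I mean - my own simplification, not Mojang's code:

```python
# Toy version of the idea: the server just keeps an integer light level per block
# and nudges it when a client reports a change. No lighting maths is done here.
MAX_LIGHT = 15
world_light = {}                                  # (x, y, z) -> 0..15

def on_block_exposed(pos):
    """Client reports an exposed block; the server bumps the stored value."""
    world_light[pos] = min(MAX_LIGHT, world_light.get(pos, 0) + 1)
    return world_light[pos]                       # a handful of bytes back to each client

print(on_block_exposed((10, 64, -3)))             # 1
print(on_block_exposed((10, 64, -3)))             # 2
```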
If Minecraft's server were a "cloud-based lighting" system, it would actually be calculating the light gradients for each vertex of each cube itself (well, Minecraft is voxel-based I believe, but hopefully you get the idea), and passing that information back to your machine, so all it has to do is draw it out onto the screen without any further calculations of any kind. Of course, the amount of information being passed around would be massive, as it would be sending that data to your machine every single frame. You'd need a fast connection for this to occur imperceptibly - a LAN connection would suffice, but the vast majority of internet connections would fail.
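Some rough numbers on what "per-vertex lighting, streamed every frame" would cost - every figure below is an assumption of mine, just to show the order of magnitude:

```python
# Very rough, assumption-heavy estimate of streaming per-vertex lighting every frame.
visible_blocks = 50_000          # assumed number of blocks in view
verts_per_block = 8
bytes_per_vertex = 4             # assumed: one packed light/colour value per vertex
fps = 60

bytes_per_sec = visible_blocks * verts_per_block * bytes_per_vertex * fps
print(bytes_per_sec / 1_000_000, "MB/s")   # 96 MB/s, roughly 768 Mbps - LAN territory
```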
Titanfall, Respawn Entertainment's new game, is said to use the "cloud" for A.I. This is a clever misuse of the term, as it's really just a client/server model handling bots - like Unreal Tournament, Counter-Strike or even Quake 3.
Microsoft are attempting to rebrand everything "online" or "server-based" as merely "the cloud" in order to sell their new machine to the uneducated.
For example, you'll notice that in their press release, they state that your games are stored "on the cloud". They're not. They're stored on Microsoft's existing server network. If they were stored "on the cloud" you'd be able to stream them to your console free of charge, exactly like OnLive or Gaikai. But you can't. You have to download them, and then you can play them.
I understand it's confusing, and I probably should've been nicer about it, so I apologise for being rude - it really wasn't called for. I just get angry that Microsoft are lying to people, and people are defending those lies.
I hope that clears it up. If not, let me know what points are still confusing and I'll explain in greater detail.