XBone and cloud computing: why is nobody talking about this?


Atmos Duality

New member
Mar 3, 2010
8,473
0
0
Wha? I've gone on the rag many times about how "Cloud Computing" is basically "Server-centric" and "Always-Online DRM".

It's nothing new; it's just a fancy term for "Distributed Processing", and we've been doing that for decades.

Lately, marketing and PR are trying to use "Cloud" anywhere in place of a "Persistent Connection" and are trying to upsell it as a positive feature, betting on consumer ignorance to carry them past the stigma rightly associated with Always-Online.
 

UnnDunn

New member
Aug 15, 2006
237
0
0
OK guys, seriously, calm down. Everyone seems to be ragging on this by saying "the latency is too high..." It's not. Latency is only a factor for something that depends on direct player interaction, and even then, if the interaction is simple enough, the latency still won't matter much. Anything that happens off-screen or that doesn't depend on the player can be handled in "the cloud" and, with a little sleight-of-hand, you will be none the wiser.

For example, crowd animation and AI: send a little crowd status info to Azure, and it simulates complex crowd dynamics along with physics-based, highly individualized animations for every actor in the crowd. Result: the spectators in your basketball game look less like robots and more like an actual human crowd. You can't do that locally because it would be too resource-intensive. If your internet connection falters, the game drops back to canned animation routines and your crowd goes back to looking like robots. It isn't latency-sensitive because the player doesn't interact with the crowd (and even if they did, the game can simulate it locally for the 5 or 6 crowd members you're interacting with, while the other 10000+ crowd members are calculated using Azure.)

Another example: bullets shooting through glass. Send the trajectory and speed of the bullet up to Azure, have it calculate the physics of the glass shattering. The game knows in advance that the bullet is going to hit the glass, so in the 500ms it takes for the bullet to reach the glass, it can send the data, get the result and render the effect. No internet? Then you get the standard, canned "glass breaking" effect.

Combine enough of those kinds of things together, and you wind up with a fairly significant improvement in the quality of game experiences.
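
Just to make the fallback idea concrete, here's a rough Python sketch of the pattern I'm describing. The function names and timings are made up for illustration; this isn't the actual Xbox or Azure API, just the "ask the cloud, but keep a canned result ready" shape:

```python
import concurrent.futures
import random
import time

def remote_shatter_simulation(trajectory, speed):
    """Stand-in for the cloud call: detailed glass-shatter physics."""
    time.sleep(random.uniform(0.05, 0.8))  # variable network + compute time
    return f"detailed shards for a {speed} m/s bullet"

def canned_shatter_effect():
    """Local fallback: the standard pre-baked effect."""
    return "generic glass-break animation"

def shatter_effect(trajectory, speed, deadline_s=0.5):
    """Use the cloud result only if it arrives before the bullet hits."""
    pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)
    future = pool.submit(remote_shatter_simulation, trajectory, speed)
    try:
        result = future.result(timeout=deadline_s)   # arrived in time
    except concurrent.futures.TimeoutError:
        result = canned_shatter_effect()             # too slow: play the canned effect
    pool.shutdown(wait=False)  # don't stall the frame waiting on the straggler
    return result

print(shatter_effect(trajectory=(1.0, 0.0, 0.0), speed=900))
```

Same idea for the crowd: ask for the fancy simulation, and if the answer doesn't come back in time, nobody notices because the canned animations are already playing.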
 

Something Amyss

Aswyng and Amyss
Dec 3, 2008
24,759
0
0
Dexter111 said:
In some of their PR material you could literally replace the word "cloud" with "magic" and it would lead to the same basic effect and make about as much sense
I'd actually prefer they USE magic, so at least we wouldn't have to put up with them pretending to be using some new shiny technology.
 

Something Amyss

Aswyng and Amyss
Dec 3, 2008
24,759
0
0
Atmos Duality said:
Wha? I've gone on the rag many times about how "Cloud Computing" is basically "Server-centric" and "Always-Online DRM".
And you're not alone.

I think "Why is nobody" is about on part with "Am I the only one?"
 

PoolCleaningRobot

New member
Mar 18, 2012
1,237
0
0
UnnDunn said:
OK guys, seriously, calm down. Everyone seems to be ragging on this by saying "the latency is too high..." It's not. Latency is only a factor for something that depends on direct player interaction, and even then, if the interaction is simple enough, the latency still won't matter much. Anything that happens off-screen or that doesn't depend on the player can be handled in "the cloud" and, with a little sleight-of-hand, you will be none the wiser.

For example, crowd animation and AI: send a little crowd status info to Azure, and it simulates complex crowd dynamics along with physics-based, highly individualized animations for every actor in the crowd. Result: the spectators in your basketball game look less like robots and more like an actual human crowd. You can't do that locally because it would be too resource-intensive. If your internet connection falters, the game drops back to canned animation routines and your crowd goes back to looking like robots. It isn't latency-sensitive because the player doesn't interact with the crowd (and even if they did, the game can simulate it locally for the 5 or 6 crowd members you're interacting with, while the other 10000+ crowd members are calculated using Azure.)

Another example: bullets shooting through glass. Send the trajectory and speed of the bullet up to Azure, have it calculate the physics of the glass shattering. The game knows in advance that the bullet is going to hit the glass, so in the 500ms it takes for the bullet to reach the glass, it can send the data, get the result and render the effect. No internet? Then you get the standard, canned "glass breaking" effect.

Combine enough of those kinds of things together, and you wind up with a fairly significant improvement in the quality of game experiences.
You imply 1) Microsoft will give users the choice to use the cloud as opposed to using it for always-on DRM and 2) Microsoft is actually capable of innovating anything as opposed to making shitty versions of things we already have

Even if some of these things work out, is it really worth it? You should read that Eurogamer article the first poster posted. Your glass-breaking example is thrown completely out the window because it relies on your XB1 sending data to the servers first and then bringing the result back. They specifically mention that things like collisions won't work (it doesn't matter if the game "knows ahead of time", the window is still too short, and the delay could be 100,000 ms). Calculations and stable worlds might work, but they already excluded things like lighting because it's easier to do on the GPU. This also means the developer has to wiggle the cloud stuff into their games. If they already developed for the PC and PS4, why would they waste time and money putting cloud features (like more animations for a basketball game) into the XB1 version AND pay to maintain servers, when they can just scale down the graphics and make it look shittier?

The bottom line is, Microsoft is completely full of shit. Four times the processing power? That's ridiculous. They'll never reach that. It doesn't even make sense. Why pay to maintain these servers when they could have made a console with better processing power?
 

Atmos Duality

New member
Mar 3, 2010
8,473
0
0
UnnDunn said:
OK guys, seriously, calm down. Everyone seems to be ragging on this by saying "the latency is too high..." It's not. Latency is only a factor for something that depends on direct player interaction, and even then, if the interaction is simple enough, the latency still won't matter much. Anything that happens off-screen or that doesn't depend on the player can be handled in "the cloud" and, with a little sleight-of-hand, you will be none the wiser.

....

Combine enough of those kinds of things together, and you wind up with a fairly significant improvement in the quality of game experiences.
Cloud Processing cannot magic away the biggest problems of latency in gaming. More importantly, the value (or benefit to gamers) depends entirely on what is being processed.

Best case scenario: The situation you described where the Cloud is just processing non-essential fluff.

Worst case scenario: Cloud Processing will be used to process essential portions of the game program remotely, making the game, and thus the player, reliant on their service.

Cloud Processing has potential to enhance the game experience, yes, but even greater potential to destroy that same experience by being a built-in liability. And no amount of graphical bling is worth that.

Zachary Amaranth said:
And you're not alone.

I think "Why is nobody" is about on part with "Am I the only one?"
You're right, it's probably just another one of those nonsensical thread hooks.
 

Zeh Don

New member
Jul 27, 2008
486
0
0
...Linus Blomberg (Avalanche CTO): The cloud functionality is pushed as a marketing tool to compensate for the less favorable hardware specs. I understand why they feel they need to do this, as the specs on paper aren't necessarily representative of the actual performance. But the way it's presented I feel is misleading at best. It's just common sense that sending data over an internet connection isn't even remotely comparable to sending data over a high-speed internal memory bus...
http://gamingbolt.com/interview-with-avalance-studios-cto-xbox-one-cloud-functionality-pushed-as-a-marketing-tool

It's already starting to be dismissed by Microsoft's own partners.
 

thesilentman

What this
Jun 14, 2012
4,513
0
0
Um, it is a big deal. What's already been talked about is MS's assumption that the cloud is going to make games run better.

"What the fuck" was my only response. I'll give my last post on this and then discuss your two points, as I'm not in the mood to re-explain my line of thinking. It's wall-of-text-ish, so keep that in mind.

thesilentman said:
The "Cloud" can be used like this, but only for real data crunching. If it's for on site programs, forget it. I'd find the chat that I had with DoPo over this in the Linux group, but I can't find it for some reason. I will update if I do.

Main point? MS is a bunch of twats for thinking this. True data crunching is really the only practical reason, and I don't think MS wants games to stream at 2 MB/s from Xbox to Xbox. -.-

DoPo said:
Welp, sharing a random thought I had - it's about a possible future for Linux gaming. How Linux can suddenly be Up There with other platforms, on equal footing when it comes to playing stuff. And I don't know why I didn't think of it before, but - streaming games. Yep, it's that easy - stream the games and it doesn't matter which platform you play them on.

And Linux already has that covered and has had it for...decades. OK, technically UNIX, but whatever, Linux can do it too - this entire idea of streaming stuff is how stuff already operates. You don't have to be on your own machine to do stuff; you could just as well have a keyboard and a screen and get your session over SSH from another server. Or have an X session running on another machine but shown on your screen. Or numerous remote desktop variations. It's a complete non-issue to get a game displayed to us and send back control inputs; the only thing is network speed and a good enough game server to give us the performance. Network is being sorted out for us (technology marching on), so we only have to worry about what runs the games.

I've seen talks about streaming games (and there is OnLive, still, I think, but dunno how good it is) but never actually connected to Linux, or mentioning that we already have the facilities to do it and just lack some tech to support it properly. And it does seem like a viable direction to expand in - never heard much talk about that either. Get a beefy machine to run the games for you and you won't need specific consoles/PCs/whatever in the house, since it can direct the output anywhere you want it to, on multiple screens if you wish, and the controller just needs to be "whatever fits the game". So it can cut out ports in one fell swoop - there needs to be just one version of the game.

I understand that there would also be difficulties but...it's still a too-little-discussed thing, I believe. There is only OnLive, and it's just...there - rarely if ever do I hear about it. When some 4G technologies were being discussed there was a brief mention of "Oh, that means you'll be able to stream games on mobile phones" and...that's about it, really. Now with the PS4 we have a brief mention of streaming games but no real big hubbub about the possibilities. Just bizarre.
Lucem712 said:
@DoPo: I thought it'd be a pretty viable thing, or at least it will be in the near future, with a device sort of like the Ubuntu phone's docking. You load up your OS from the device onto a screen, keyboard and mouse combo via a dock, and from there you're able to play any sort of media, high-end games as well.

Though, last I heard the major hurdle was internet connections? I attempted to try OnLive and Gaia but my Flash package wasn't up to date -_-
DoPo said:
@Lucem712: Internet connections are on their way to becoming a non-issue...well, OK, not everywhere at all times, but there have been places with stable 10Mb/s or so download speeds for a decade. Furthermore, you won't care much about the internet connection if you deploy it over LAN at home, and my impression is that everybody and their dog has easy access to home LAN solutions. And for, like, at least 5 years home wireless has been popular. It's a bit limited, yeah, but it's there, it can be used.
Me said:
@DoPo: Streaming and web-based games are how I see Linux growing as a viable alternative to Windows or da Mac. Sooner or later, it won't matter what most of us need to do, as it will all be in the high and mighty cloud. I already see Google Docs up there, and a nice collaborative word editor called Etherpad doing the same. The future is heading more towards streaming and the almighty "cloud", something that pisses me off personally, but which I respect in a sense.
DoPo said:
@thesilentman: Well...the cloud is something else I've been musing about. It's not a bad idea, quite good in fact, and it sort of feels like a golden hammer...which isn't as good. However, it does have great potential, and for anybody working in technology, I'd suggest going and starting to get familiar with it. Myself included. For good or bad, it should become more and more relevant, as you observed.

But at any rate, I had an...interesting idea the other day - P2P cloud computing. Which has been around for a while in a way (well, all those "loan us CPU to compute a cure for cancer/Einstein's theorems/etc." projects), but you could grow it out into actual P2P computing. So like torrents, but instead of just using the network's hard drives, you can use CPU as well. So more or less, you can loan CPU power to others ("seeding") and claim some back later ("leeching"). It's of limited usefulness but it can work in some situations. There are ways to do it, there are obstructions too, but it is possible. But then I realised it'd just be a giant botnet operable by more or less anyone. Ugh, not a pleasant thought.
Me said:
@DoPo: Loaning CPU power? Ah wait a minute, let me visualize that.

....

Dang it. I can't. How would this work in practice?
DoPo said:
@thesilentman: Don't you know those projects about solving theorems or cancer research and stuff? How they work is you install their software and then specify how much CPU you want to give them - you could go for 10% or 80%, if you wish. Nowadays you can happily give them a core or two to use, I suppose; the first time I encountered them was around when WoW started out, so 2005-2006, when multicore CPUs were almost unheard of for normal users. But anyway, once you specify how much processing they can use, they'll just feed your computer small chunks of data to be processed, and when finished they just get the result back. Wash, rinse, repeat. It's distributed computing, really similar to the cloud but...not the cloud. Here is one I remember from back then. You can extend the idea to a P2P network, too - feed others the processing you can't do, they'll do it for you, then spit back the results and voila. In the downtime you get fed processing requests.

It's not useful for all things - say, if you're playing a game you really need that data NOW and the added latency isn't really acceptable - however, if you have to crunch through large volumes of data it can really speed things along, assuming, of course, you can split the data off into discrete chunks. Presumably each of these would need some time to be processed, so each could be a complex mathematical formula, for example, or maybe you have a batch of images which need some automated manipulation that just takes time. Of course, you could also just distribute the bruteforcing of a password, or even a DDoS.
Me said:
@DoPo: Oh, I've heard of something like that. But I've always thought they'd work for more calculation based data than anything else. :-/
DoPo said:
@thesilentman: Yeah, not every type of computation would lend itself well to parallelisation but if it does, then it'd just be useful to have it.

I remember one guy giving a talk on the CUDA technology, which allows you to run your software on nVidia cards. Now, even if you have, like, an 8GHz CPU, some software just doesn't need raw power, it needs more parallelism, and the video card has that in spades - at the time (three years ago) you could easily pick up a card with 128 cores which would run any CPU into the dust with sheer parallelisation power. He had rewritten some software that simulated...earthquakes or volcano eruptions - one of the two, but some science lab somewhere was studying them and providing warnings for a nearby area (I believe it was somewhere in Africa). So basically, they were in a high-risk region, and when the sensors picked up something that could possibly indicate the disaster, they'd run the software to see where and how it'd most likely hit. Problem is that on a normal CPU (well, probably not off-the-shelf but a bit more beefy) that took 5-8 hours or so. The rewritten application, utilising some normal nVidia card, was still not realtime (he showed a demo - it advanced at about a frame or two a second), but it would at least finish in just a few minutes as opposed to the hours the original needed.

So yeah. I'm pretty passive here and decided to listen more than suggest any ideas, but this can probably shed some light on things, as it talks about the method MS is trying to put into practice. I didn't see it working after DoPo explained it, and after some further thought, I had to agree with him.
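
For anyone who wants the chunk-it-up idea DoPo describes in something runnable, here's a toy Python sketch - local processes standing in for the volunteer machines, not any real BOINC-style framework:

```python
from multiprocessing import Pool

def crunch(chunk):
    """Pretend this is the expensive bit each volunteer machine runs."""
    return sum(x * x for x in chunk)

def split_into_chunks(data, n_chunks):
    """Carve the big job into independent pieces (the part that has to be possible)."""
    size = max(1, len(data) // n_chunks)
    return [data[i:i + size] for i in range(0, len(data), size)]

if __name__ == "__main__":
    data = list(range(1_000_000))
    chunks = split_into_chunks(data, n_chunks=8)

    # Stand-in for "send each chunk to a different machine and wait for answers".
    # Here it's just local processes; a real distributed system adds networking,
    # retries and result verification on top of the same split/process/merge idea.
    with Pool() as pool:
        partial_results = pool.map(crunch, chunks)

    print(sum(partial_results))
```

The networking, retries and verification are the hard part in practice; the split-process-merge shape is the easy bit, and it only helps when the work actually splits into independent chunks.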

Anything you guys want to discuss on this from the technological side? Just join us in the Escapist Linux group here. We're pretty lenient and all, but we can talk for hours on tech. And we have, too. :-D
Jandau said:
First, people are making a big fuss over the "check in once every 24 hours" thing. Other people are saying it's no big deal. But the check-in is irrelevant - if major titles use cloud computing you'll HAVE to stay online to actually PLAY anything. You'll need a constant, stable, high speed connection or your game won't work. If you're bothered by the daily check-in you should be furious about this.
Pretty self-explanatory. Now, if they would simply make the check once per a certain number of days, then we could all live with it. But MS is too fucking vague. They're not really telling us anything other than "online authentication, derp derp!"

I hate that, and all of the rumors surrounding the Xbox One (fuck all of those nicknames, this is important) don't have any substantial proof behind them. Nothing DIRECT from MS.

Jandau said:
Secondly, how well is this going to work? Well, there are examples. Take OnLive, for instance. Or take SimCity. In general, cloud computing hasn't really got a very good track record. Even in the MMO genre it is endured because it's a necessary evil, but the lag and connection issues are still there in most modern MMOs. And now those same issues are about to be applied to single-player games.
Not at all. The closest analogy is Steam, but even the almighty Big Brother Valve can't consistently pull it off without issues. Just search it up and you can see.

In the end, whoever promises cloud technology like this for games is either severely bluffing or is telling us something we don't know. The end result? Stay vigilant, and quit using their products once the BS is too much.
 

UnnDunn

New member
Aug 15, 2006
237
0
0
Atmos Duality said:
Cloud Processing cannot magic away the biggest problems of latency in gaming. More importantly, the value (or benefit to gamers) depends entirely on what is being processed.

Best case scenario: The situation you described where the Cloud is just processing non-essential fluff.

Worst case scenario: Cloud Processing will be used to process essential portions of the game program remotely, making the game, and thus the player, reliant on their service.

Cloud Processing has potential to enhance the game experience, yes, but even greater potential to destroy that same experience by being a built-in liability. And no amount of graphical bling is worth that.
I think you are attributing far too much malice to developers/publishers.

Windows Azure is not free; the more developers use Azure, the more they have to pay. Using Azure for some sort of hokey always-on DRM system will cost them a lot of money (in terms of Azure compute time) for minimal benefit considering the console already handles perpetual license verification. The publisher's Azure compute budget is much better spent on functionality that will improve the game experience, rather than duplicating functionality that is already provided.
 

PoolCleaningRobot

New member
Mar 18, 2012
1,237
0
0
thesilentman said:
Pretty self-explanatory. Now, if they would simply make the check once per a certain number of days, then we could all live with it. But MS is too fucking vague. They're not really telling us anything other than "online authentication, derp derp!"

I hate that, and all of the rumors surrounding the Xbox One (fuck all of those nicknames, this is important) don't have any substantial proof behind them. Nothing DIRECT from MS.
Where were you yesterday when the shit hit the fan? Microsoft confirmed a lot of this stuff on their blog. You have to check in once every 24 hours. I assume that means that a timer starts the second you disconnect.
 

thesilentman

What this
Jun 14, 2012
4,513
0
0
PoolCleaningRobot said:
thesilentman said:
Pretty self-explanatory. Now, if they would simply make the check once per a certain number of days, then we could all live with it. But MS is too fucking vague. They're not really telling us anything other than "online authentication, derp derp!"

I hate that, and all of the rumors surrounding the Xbox One (fuck all of those nicknames, this is important) don't have any substantial proof behind them. Nothing DIRECT from MS.
Where were you yesterday when the shit hit the fan? Microsoft confirmed a lot of this stuff on their blog. You have to check in once every 24 hours. I assume that means that a timer starts the second you disconnect.
... Bwuh?

I've been staying away from any Xbox One-related news till E3. Can I get a link to this story? Maybe I'll have a different view once I read about all of it.
 

UnnDunn

New member
Aug 15, 2006
237
0
0
Here
thesilentman said:
PoolCleaningRobot said:
thesilentman said:
Pretty self-explanatory. Now, if they would simply make the check once per a certain number of days, then we could all live with it. But MS is too fucking vague. They're not really telling us anything other than "online authentication, derp derp!"

I hate that, and all of the rumors surrounding the Xbox One (fuck all of those nicknames, this is important) don't have any substantial proof behind them. Nothing DIRECT from MS.
Where were you yesterday when the shit hit the fan? Microsoft confirmed a lot of this stuff on their blog. You have to check in once every 24 hours. I assume that means that a timer starts the second you disconnect.
... Bwuh?

I've been staying away from any Xbox One-related news till E3. Can I get a link to this story? Maybe I'll have a different view once I read about all of it.
Here's the post on Game Licensing [http://news.xbox.com/2013/06/license] and the one on Online Connectivity [http://news.xbox.com/2013/06/connected].

With Xbox One you can game offline for up to 24 hours on your primary console, or one hour if you are logged on to a separate console accessing your library. Offline gaming is not possible after these prescribed times until you re-establish a connection, but you can still watch live TV and enjoy Blu-ray and DVD movies.
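
In other words, the stated policy boils down to something like this. To be clear, this is my guess at the logic purely for illustration, not anything from Microsoft's actual code:

```python
# Rough sketch of the stated policy: 24 hours offline on your primary console,
# 1 hour on someone else's console that's accessing your library.
# The constants come from the blog post; the logic itself is guesswork.

PRIMARY_GRACE_HOURS = 24
SHARED_GRACE_HOURS = 1

def offline_play_allowed(hours_since_last_checkin, on_primary_console):
    """True while you're still inside the offline grace period."""
    grace = PRIMARY_GRACE_HOURS if on_primary_console else SHARED_GRACE_HOURS
    return hours_since_last_checkin <= grace

print(offline_play_allowed(23, on_primary_console=True))   # True: still inside 24h
print(offline_play_allowed(2, on_primary_console=False))   # False: past the 1h limit
```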
 

PoolCleaningRobot

New member
Mar 18, 2012
1,237
0
0
thesilentman said:
PoolCleaningRobot said:
thesilentman said:
Pretty self-explanatory. Now, if they would simply make the check once per a certain number of days, then we could all live with it. But MS is too fucking vague. They're not really telling us anything other than "online authentication, derp derp!"

I hate that, and all of the rumors surrounding the Xbox One (fuck all of those nicknames, this is important) don't have any substantial proof behind them. Nothing DIRECT from MS.
Where were you yesterday when the shit hit the fan? Microsoft confirmed a lot of this stuff on their blog. You have to check in once every 24 hours. I assume that means that a timer starts the second you disconnect.
... Bwuh?

I've been staying away from any Xbox One-related news till E3. Can I get a link to this story? Maybe I'll have a different view once I read about all of it.
Lol. It's on the front page, bro.

http://www.escapistmagazine.com/forums/read/7.409900-Microsoft-Addresses-Xbox-One-Concerns

Here's a link to Major ****** Nelson's blog

http://majornelson.com/2013/06/06/details-on-xbox-one-connectivity-licensing-and-privacy-features/
 

klaynexas3

My shoes hurt
Dec 30, 2009
1,525
0
0
I just thought of this, and I know this isn't totally on topic, but it's still a valid thing to bring up since people actually seem to think specs matter. If the "cloud" even works half as well as Microsoft claims it will (but it won't even achieve that much, face it), that means all the gaming is done in the cloud, not on the Xbone itself. If that's true, why do the specs on the Xbone even matter, considering it won't even need half the hardware it's boasting, since all the actual computation is done elsewhere? Maybe I don't understand this "cloud" well enough to know, and someone can probably correct me, but it just seems like another thing that renders the whole console into bullshit, because all the gaming it's capable of doing should also be possible on systems that would be cheaper. You would just need something that can stream well, and near everything these days can stream, and boom, you're gaming Xbone-style, with probably little to no difference.
 

UnnDunn

New member
Aug 15, 2006
237
0
0
klaynexas3 said:
I just thought of this, and I know this isn't totally on topic, but it's still a valid thing to bring up since people actually seem to think specs matter. If the "cloud" even works half as well as Microsoft claims it will (but it won't even achieve that much, face it), that means all the gaming is done in the cloud, not on the Xbone itself. If that's true, why do the specs on the Xbone even matter, considering it won't even need half the hardware it's boasting, since all the actual computation is done elsewhere? Maybe I don't understand this "cloud" well enough to know, and someone can probably correct me, but it just seems like another thing that renders the whole console into bullshit, because all the gaming it's capable of doing should also be possible on systems that would be cheaper. You would just need something that can stream well, and near everything these days can stream, and boom, you're gaming Xbone-style, with probably little to no difference.
What you're proposing is OnLive [http://games.onlive.com/]. You should try it out; it actually plays pretty well. The problem is that it doesn't scale very well to do all of the computation "in the cloud".
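
Quick back-of-the-envelope on why full OnLive-style streaming is a different beast from offloading a bit of simulation. Every number here is an illustrative assumption, not anything OnLive or Microsoft have published:

```python
# Very rough comparison: streaming the whole rendered game (OnLive-style) vs.
# shipping a bit of simulation state back and forth (the crowd/physics examples).
# All figures below are made-up ballpark assumptions for illustration only.

video_bitrate_mbps = 5.0        # assumed bitrate for a 720p game-video stream
crowd_update_bytes = 4_000      # say a few KB of crowd state per update
updates_per_second = 10         # cloud sim refreshed ten times a second

crowd_mbps = crowd_update_bytes * 8 * updates_per_second / 1_000_000

print(f"full video stream  : ~{video_bitrate_mbps:.1f} Mb/s, continuously, latency-critical")
print(f"offloaded crowd sim: ~{crowd_mbps:.2f} Mb/s, and it can drop out gracefully")
```

The video stream also has to arrive on time for every single frame, which the offloaded stuff doesn't.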
 

klaynexas3

My shoes hurt
Dec 30, 2009
1,525
0
0
UnnDunn said:
klaynexas3 said:
I just thought of this, and I know this isn't totally on topic, but it's still a valid thing to bring up since people actually seem to think specs matter. If the "cloud" even works half as well as Microsoft claims it will (but it won't even achieve that much, face it), that means all the gaming is done in the cloud, not on the Xbone itself. If that's true, why do the specs on the Xbone even matter, considering it won't even need half the hardware it's boasting, since all the actual computation is done elsewhere? Maybe I don't understand this "cloud" well enough to know, and someone can probably correct me, but it just seems like another thing that renders the whole console into bullshit, because all the gaming it's capable of doing should also be possible on systems that would be cheaper. You would just need something that can stream well, and near everything these days can stream, and boom, you're gaming Xbone-style, with probably little to no difference.
What you're proposing is OnLive [http://games.onlive.com/]. You should try it out; it actually plays pretty well. The problem is that it doesn't scale very well to do all of the computation "in the cloud".
I knew that was a thing, and maybe it's just me, but it seems almost like Microsoft is trying to do that, but with expensive hardware behind it so they have a reason to have a bigger price tag.
 

UnnDunn

New member
Aug 15, 2006
237
0
0
klaynexas3 said:
I knew that was a thing, and maybe it's just me, but it seems almost like Microsoft is trying to do that, but with expensive hardware behind it so they have a reason to have a bigger price tag.
Nah, Microsoft is still trying first and foremost to be a console gaming platform vendor. But they are trying to use their Windows Azure infrastructure as a differentiator.
 

Requia

New member
Apr 4, 2013
703
0
0
The claim that cloud computing won't work for gaming because of latency is frankly nonsense. If latency were a barrier to gaming, online multiplayer wouldn't work either. Though it is a barrier for some people, and it'll fuck up single-player gaming for them.

It's also worth noting that Minecraft servers are sometimes used to offload system-intensive mods, and to run mods you couldn't normally run, so the principle is sound.