
How is "pluginless/HTML5 gaming" different from "Cloud gaming"?

Started by November 01, 2012 10:46 AM
8 comments, last by DavidGArce1337 12 years ago
I was thinking last night about how HTML5/WebGL/browser games running from a server are any different from a game running on the whole "cloud" computing thing, like Gaikai and other companies offering such a service.

Aren't the "Bad" and "Cons" the same for both of them? Which basicly made me think, why even bother with "HTML5 Technologies" when I can just use the "Cloud", make my app in "whatever that I know already" and have more advantages than the "alternative" with the exact same penalties.

Am I wrong here? If the "future" is server-based apps and games that run everywhere, why even bother with a "specific" tech like HTML5? Might as well go the "cloud" route, no?


What am I getting at? This -> If you are going to make a server-dependent app or game (think multiuser/multiplayer), why not just push the whole thing from the server? Why limit yourself?

Maybe I am missing something...
HTML5 games run on the client, not on the server, so latency and bandwidth are not an issue (unless you make it a multiplayer game of course, but even then you'd use far less bandwidth than you would with a game running entirely on the server).
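To put rough numbers on that bandwidth gap, here is a back-of-the-envelope sketch in Python; the packet size, tick rate, and video bitrate are illustrative assumptions, not figures from any particular game or service.

# Rough bandwidth comparison: client-side multiplayer vs. fully server-rendered streaming.
# All numbers are illustrative assumptions, not measurements.

STATE_UPDATE_BYTES = 200        # assumed size of one state-update packet sent to a client
UPDATES_PER_SECOND = 20         # assumed server tick rate per client
VIDEO_BITRATE_BPS = 5_000_000   # assumed bitrate of a 720p/30 FPS game video stream

multiplayer_bps = STATE_UPDATE_BYTES * 8 * UPDATES_PER_SECOND
print(f"Client-side multiplayer: ~{multiplayer_bps / 1000:.0f} kbit/s per player")
print(f"Server-rendered stream:  ~{VIDEO_BITRATE_BPS / 1_000_000:.1f} Mbit/s per player")
print(f"Streaming uses roughly {VIDEO_BITRATE_BPS / multiplayer_bps:.0f}x more bandwidth per player")

Even with generous packet sizes, sending state updates to a client that renders locally is orders of magnitude cheaper per player than streaming rendered frames.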
[size="1"]I don't suffer from insanity, I'm enjoying every minute of it.
The voices in my head may not be real, but they have some good ideas!
One possible difference is in the hosting and bandwidth costs -- hosting for HTML5 games can be quite cheap, and provided the assets are kept to a smaller size (as is quite common) the bandwidth usage is fairly minimal and loading times are pretty good -- while cloud hosting can potentially be quite expensive and, depending on the configuration, may use more bandwidth.


Although it's useful for them, HTML5 also isn't just for games.

- Jason Astle-Adams

I'm getting sick and tired of the term "cloud" ... can't people just say what it is - "remote hosting" ?!

I cannot remember the books I've read any more than the meals I have eaten; even so, they have made me.

~ Ralph Waldo Emerson


I'm getting sick and tired of the term "cloud" ... can't people just say what it is - "remote hosting" ?!


It's not quite the same thing, but I agree, cloud is very overused.
It's not just remote hosting. Of course the cloud is network-based like previous technologies, but the biggest thing is that it's scalable. A well-written cloud application can scale on demand. So if your so-so web game gets only 10 users a day, you pay for 10 users a day; but if for some crazy reason you're featured by Yahoo on their front page and get 100k users a day, your application can scale to match. Of course, paying for it is up to you. That and virtualization are really the biggest benefits of the cloud.
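As a toy illustration of that pay-for-what-you-use point (the per-instance capacity and hourly price below are invented for the example, not taken from any real provider):

# Toy model of on-demand scaling: you only pay for the capacity the day's load requires.
# Per-instance capacity and hourly price are invented for illustration.

USERS_PER_INSTANCE = 5_000       # assumed daily users one server instance can handle
PRICE_PER_INSTANCE_HOUR = 0.10   # assumed hourly price in dollars

def daily_cost(daily_users: int) -> float:
    """Provision enough instances for the day's users and bill them for 24 hours."""
    instances = max(1, -(-daily_users // USERS_PER_INSTANCE))  # ceiling division
    return instances * PRICE_PER_INSTANCE_HOUR * 24

for users in (10, 1_000, 100_000):
    print(f"{users:>7,} users/day -> ${daily_cost(users):.2f}/day")

The quiet days cost almost nothing, and the Yahoo-front-page day costs more only because it actually used more capacity.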

-ddn
That's called scalable billing - and it's still on a remote host.

I cannot remember the books I've read any more than the meals I have eaten; even so, they have made me.

~ Ralph Waldo Emerson


It's not just remote hosting. Of course the cloud is network-based like previous technologies, but the biggest thing is that it's scalable. A well-written cloud application can scale on demand. So if your so-so web game gets only 10 users a day, you pay for 10 users a day; but if for some crazy reason you're featured by Yahoo on their front page and get 100k users a day, your application can scale to match. Of course, paying for it is up to you. That and virtualization are really the biggest benefits of the cloud.

-ddn


Tbh, we had scalable services before the whole cloud craze; it's just a buzzword, and I agree with the people who find it annoying. (It's like Web 2.0 all over again, only worse this time.)
[size="1"]I don't suffer from insanity, I'm enjoying every minute of it.
The voices in my head may not be real, but they have some good ideas!
There are really a couple of issues here:

First, multiplayer games are almost universally real-time, asynchronous, client-server architectures (very few are p2p or synchronous), even if the server process runs on one of the machines that someone else is also playing on.

Second, there are client-server games which might be hosted in the "cloud" with client components running on the local machine (which might be HTML5, Flash, a plugin, or a stand-alone game application), and then there are services like OnLive or Gaikai, which run the entire game on their own infrastructure, compress the video stream, and send it down the inter-tubes. These are very different approaches.

The first is not much different from the now-current model of companies hosting official game servers, or third parties renting servers to individuals -- it's just that you're leasing your infrastructure from someone else. Other cloud services operate in a different way that's mostly designed to run RESTful APIs and such, which is naturally suitable for some types of games (e.g. synchronous games with a (possibly web-based) client), or could be made to work with other types if they're engineered around that processing model.
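For a concrete picture of that request/response style, here is a minimal sketch of a move-submission endpoint using only Python's standard library; the endpoint, payload shape, and in-memory "game state" are all invented for illustration and are not any particular service's API.

# Minimal request/response ("RESTful"-style) game endpoint sketch.
# The URL, payload shape, and storage are invented for illustration only.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

GAMES = {}  # in-memory store: game_id -> list of moves (a real service would use a database)

class MoveHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Expects a JSON body like: {"game_id": "abc", "player": "p1", "move": "e2e4"}
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")

        moves = GAMES.setdefault(payload.get("game_id", "default"), [])
        moves.append({"player": payload.get("player"), "move": payload.get("move")})

        body = json.dumps({"accepted": True, "move_count": len(moves)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Each request is handled independently, which is what lets this style scale out.
    HTTPServer(("127.0.0.1", 8000), MoveHandler).serve_forever()

Because each request is self-contained, more copies of this process can be added behind a load balancer as traffic grows (with the state moved to shared storage), which is the scaling model those cloud services are built around.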

The problem with you doing what Gaikai or OnLive does is that it takes rather a lot of infrastructure to pull off, and economically speaking, you probably need a large stable of releases to make effective use of your hardware investment (if you buy a ton of infrastructure to handle the load at launch, 80% of it will be idle in a month unless there's another new game to put on it). There's also a problem with these services in that each video stream is different for each player, so it can't be cached anywhere on the network to reduce bandwidth -- this costs money, and at large enough scale could actually overload the current internet.

Companies like Netflix or YouTube that stream the same video to millions of users first distribute that movie across their data centers, so that the feed has to traverse less distance to a user. This means less latency and less cost than a single datacenter, and is something Gaikai can and does do. But for content that's the same for everyone (movies, music, web pages), network operators at the backbone have big servers caching content, so that if you and your neighbor are watching the same movie (say, because it's a new release), the data comes from the cache rather than from Netflix's own servers, if it's been cached. Beneath the backbone providers are regional providers who do the same, and local, retail service providers who also do the same. This means that the new release you're watching is more than likely coming from your local ISP's data hub rather than from Netflix's own servers. This saves costs for everyone, and makes optimal use of the whole internet.

With a service like Gaikai, where each video stream is unique, it has to travel the entire distance from Gaikai's own datacenters to you, every time, every frame. If Gaikai had as many active users as Netflix *right now* it would probably exceed what the current internet is capable of delivering by more than 10 times.
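To put very rough numbers on why unique streams are so much more expensive than cacheable content (the per-stream bitrate and player counts below are assumptions for illustration, not Gaikai's or Netflix's actual figures):

# Aggregate bandwidth for per-player game streams grows linearly with concurrent players,
# and none of it can be served from caches closer to the user.
# Bitrate and player counts are assumptions for illustration only.

STREAM_BITRATE_BPS = 5_000_000  # assumed per-player 720p/30 FPS game stream

for concurrent_players in (10_000, 1_000_000, 30_000_000):
    total_bps = concurrent_players * STREAM_BITRATE_BPS
    print(f"{concurrent_players:>10,} players -> {total_bps / 1e12:6.2f} Tbit/s "
          "leaving the provider's datacenters, every second they play")

A cached movie, by contrast, is fetched from the origin roughly once per cache rather than once per viewer, so the origin's outbound traffic barely grows as the audience does.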

This is why Gaikai needs lots of regional datacenters to be viable, and has even been talking to ISPs about deploying mini-Gaikais into the datacenters of local ISPs.

Lots of people worry about streaming gaming and latency, but actually there's already as much or more latency inside your television set than there is in streaming a video from a regional data center. Something like Gaikai will be successful, and only be successful, if they're able to push down into ever-more-local strata of the internet in a way that is economically viable, but it's going to cost an awful lot.

I think it's very doable actually, but to make it cost-effective they're going to have to build fixed hardware platforms that start looking an awful lot like a server rack full of gaming-console blade servers. The first step will be something like Calxeda's servers, with each node equipped with an AMD-style Fusion/APU and 4GB of RAM. If you apply the blade concept, then it becomes easy enough to maintain in the field by relatively untrained techs at local ISPs who can just swap in a fresh blade and mail back defective ones.

throw table_exception("(╯°□°)╯︵ ┻━┻");

So much info, but I understand it better now. (Excuse my lateness!)

One thing that this brought to mind: with "cloud" gaming, the data sent to the users depends purely on the resolution, no? (If not, do correct me! Haha)

Meaning, say, if I can't stream a 240p video in real time, I can't play a 240p game in real time? And Flash video runs at 24 FPS? Or used to...?
My math sucks and I know there is some computation to this... I guess my question(s) would be:

1- How does one convert resolutions (240p, 360p, 480p, 720p, 1080p, etc.) to data(?) to figure out how much "download speed" a user "needs" to run the "game" at 24 FPS, 30 FPS, and 60 FPS? (See the rough sketch after these questions.)

And off the "cloud" topic,

2- How much data is usually sent to users in MMOs and FPS games? Since I can play MMOs and such with my slow connection. Heck, I could play some MMOs on 32K dial-up...

3- Adding to (2), HTML5 is client-based, correct? As in, the calculation would be the same, given the same complexity, whether the "game" ran as an HTML5 (browser) client or as a C++, C#, Java, etc. client? Because they are all client-based systems, right?
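On question 1, here is a rough back-of-the-envelope sketch; the bits-per-pixel compression factor is an assumption (real codecs vary a lot with content and encoder settings), so treat the output as ballpark figures only.

# Rough estimate of the download speed needed to stream video of a game at a given
# resolution and framerate. BITS_PER_PIXEL is a loose assumption for an H.264-like
# codec, so the results are ballpark only.

RESOLUTIONS = {
    "240p":  (426, 240),
    "360p":  (640, 360),
    "480p":  (854, 480),
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
}
BITS_PER_PIXEL = 0.1  # assumed average bits per pixel per frame after compression

def required_mbps(width: int, height: int, fps: int) -> float:
    return width * height * fps * BITS_PER_PIXEL / 1_000_000

for name, (w, h) in RESOLUTIONS.items():
    rates = ", ".join(f"{fps} FPS ~ {required_mbps(w, h, fps):.1f} Mbit/s"
                      for fps in (24, 30, 60))
    print(f"{name}: {rates}")

So as a ballpark, a 720p stream at 30 FPS needs a sustained download speed on the order of a few Mbit/s, and the requirement scales roughly linearly with both pixel count and framerate.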


(I will reply sooner this time! Thank you all!)

This topic is closed to new replies.
