On Live

Started by March 25, 2009 05:36 AM
60 comments, last by ddn3 15 years, 7 months ago
Quote: Original post by Sirisian
Also something else I wanted to point out. I have zero interest in owning games. I play them once, maybe for a few hours, and then get bored and stop. Using my university's connection I could probably run this OnLive stuff easily. I'd love to be able to essentially rent games and have fun for a little while. I know this differs from other gamers who believe they have to own the game or it isn't really theirs. I actually thought this way for a bit, then I bought L4D on Steam and haven't regretted it.


I can very well see the appeal for a casual gamer, as well as the access to a larger game pool, but I have zero interest in not owning games. Early on, the service would have to cost quite a bit, at which point I would rather have been both entertained and have something tangible to show for it.

I'm at the other end of the spectrum in that it often takes me a long time after getting a game to get into it. It took almost two years of owning Oblivion before I really got into it. If I go broke and can no longer afford such services tomorrow, I still have a large catalog of games I've picked up along the way that I can turn to. While they could burn or be stolen, they can't be turned off. No one can decide to flip a switch and cut me off for whatever reason suits their fancy.
Quote: Original post by ddn3
Yeah but there are a few things you can do to reduce that latency, you can run your local update rate at say 1000 ups


The problem isn't so much the update rate as the rendering speed. Online games can push their update rates higher because their servers are headless and aren't constrained by the time it takes to render the scene. For this system you have your update time PLUS the time required to render, encode, and pipe the frame down the line.

As I mentioned, my rig is hardly troubled by Crysis CPU-wise, yet graphically it is murdered by it; THAT is where the problems are going to appear. CPU processing and bandwidth really aren't a problem (capped services notwithstanding); latency is, be it network latency or just the time it takes to render.
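To see why rendering time matters, here's a rough per-frame latency budget for a streamed game at 60 fps. All stage costs below are illustrative assumptions, not OnLive's actual figures:

```python
# Rough per-frame latency budget for streamed gaming at 60 fps.
# All numbers are illustrative assumptions, not measured values.

FRAME_BUDGET_MS = 1000 / 60  # ~16.7 ms between frames at 60 fps

# Assumed stage costs (milliseconds) for one frame:
stages = {
    "game update": 4.0,          # simulation tick
    "render": 10.0,              # GPU time for a demanding scene
    "encode": 5.0,               # video compression
    "network (one way)": 20.0,   # server -> client transit
    "decode + display": 8.0,
}

total_ms = sum(stages.values())
print(f"Frame budget at 60 fps: {FRAME_BUDGET_MS:.1f} ms")
print(f"Total pipeline latency: {total_ms:.1f} ms")
# The stages can be pipelined, so throughput can still hit 60 fps,
# but the latency from input to photons is the full sum -- and that
# is before adding the client -> server input leg.
```

Even with these generous assumptions, the input-to-display latency is several frames long, which is the "hitching" the hands-on reports below describe.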
In this video, they explain that they can do 1280x720 at 60 fps over a 5 Mbit/s connection.

The more I play with the numbers, the more frightening those numbers are :)

Uncompressed, that requires a bandwidth of 1280 x 720 x 3 bytes x 60 fps ≈ 166 MB/s ≈ 1327 Mbit/s.

So they must achieve a compression ratio of about 1:265 (1327/5) to hit those numbers.

All of that keeping an excellent image quality of course... Yeah, right..
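A quick sanity check of the arithmetic (the exact values round slightly differently than the figures quoted above):

```python
# Sanity check of the uncompressed-bandwidth claim for 720p60 video.
width, height = 1280, 720
bytes_per_pixel = 3   # 24-bit RGB, no alpha
fps = 60
link_mbit = 5         # claimed connection speed, Mbit/s

bytes_per_sec = width * height * bytes_per_pixel * fps
mbit_per_sec = bytes_per_sec * 8 / 1_000_000

ratio = mbit_per_sec / link_mbit
print(f"Uncompressed: {bytes_per_sec / 1_000_000:.0f} MB/s = {mbit_per_sec:.0f} Mbit/s")
print(f"Required compression ratio: about 1:{ratio:.0f}")
```

For comparison, that ratio is in the same ballpark as what modern video codecs achieve on 720p60 streams, but doing it at interactive latencies (no look-ahead buffering) is the hard part.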

Y.
Quotes from various 'hands-on' articles:

---------------------------------------

'Spokespeople told me that it's a one-to-one GPU ratio. If I want to play Mirror's Edge, I'm using up that GPU at the data center.'
http://www.pcworld.com/article/161981-6/handson_with_onlive_streaming_gaming_service_at_gdc_09.html

'Once I was in the game itself, I immediately noticed the unwelcome signs of blocky compression. It wasn't so compressed that it was entirely distracting from the gameplay, but it was also worse than I expected. The visual quality was high, but the experience was marred by the considerable amount of splotchy pixels.
Playing around in Rapture, I found that response-time lag was mostly unnoticeable--mostly. When turning quickly, there were disappointing moments of hitching here and there. It was an impressive technical accomplishment, but at the same time unquestionably inferior to playing from a disc. '
http://www.shacknews.com/featuredarticle.x?id=1090

'We've tested a couple of games like Crysis, and Lego Batman. With Crysis, latency was high enough to annoy any FPS player. I've been told that the Crysis server was having issues, but still, that's what I saw today. Crysis' frame rate wasn't sky high. I'm eyeballing it towards 24fps. However, Lego Batman which is a much simpler game ran at more than 30fps (possibly 40-45) and that's good. A simple test that we did was to move the mouse cursor: even something that simple is fast but a bit jittery. You would not be able to draw something for example.
Image quality was OK, but you can notice immediately that there's compression involved. To be fair, that would not prevent me from using the service, but it is definitely not as sharp as a direct connection.'
http://www.ubergizmo.com/15/archives/2009/03/onlive_handson_at_gdc_09.html

--------------------------

Presumably this is all in fairly optimal conditions, although on the flipside it's also immature tech. So if it balances out and this is the standard experience... is it good enough?

For the community atmosphere, the convenience of renting games, and being able to play games my PC can't handle, I would go for this as an addition to, rather than replacement for, local hardware... if it was relatively inexpensive.

If the tech matures well and the internet's infrastructure becomes better able to support this kind of service, it could become my main gaming platform... I think this will happen. It will probably take a few years, though.
Whether it's good enough will really depend on their pricing. If it's no worse than watching an HD YouTube video, with all the artifacting and tearing, then it's probably acceptable to the masses. If they offer tiered subscriptions and pay-as-you-go, so people can cap their expenses to their usage, they will probably succeed.

Personally I wouldn't mind using this service to try out new games that I might then buy locally. It would be more convenient than downloading a demo, as long as it's reasonably priced. They are heralding a new way of gaming and opening up the market to an enormous number of non-gamers, like the Wii did. I suspect most MMOs might go this route too, or a hybrid of it, providing not just access but streaming.

-ddn
Quote: Original post by ddn3
I suspect most MMOs might go this route too, or a hybrid of it, providing not just access but streaming.

Yeah, that's been discussed online before. One of the interesting things about this approach is that bandwidth is no longer limited to X entity updates per second; it becomes a matter of just updating the screen, so a true MMO experience could potentially be built where everyone is on the same server. Not to mention resources could be shared, so that you're basically just rendering different cameras into the same game world. Though I haven't done the calculations to see which approach is better or which scales better.
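The "no longer limited to X entity updates/sec" point can be sketched with a back-of-envelope comparison. The entity-update sizes and rates below are illustrative assumptions; only the 5 Mbit/s video figure comes from the thread:

```python
# Back-of-envelope: per-client bandwidth for a traditional MMO
# (entity state updates) vs. a streamed-video client.
# Entity sizes/rates are illustrative assumptions.

bytes_per_entity_update = 32   # assumed: position, orientation, state flags
updates_per_sec = 20           # assumed network tick rate

def mmo_kbps(visible_entities):
    """Traditional MMO: bandwidth scales with visible entities."""
    return visible_entities * bytes_per_entity_update * updates_per_sec * 8 / 1000

# Streamed video: flat cost regardless of how many entities are on screen.
video_kbps = 5000  # 5 Mbit/s, per the OnLive claim

for n in (50, 1000, 10000):
    print(f"{n:>6} entities: MMO {mmo_kbps(n):>8.0f} kbit/s vs video {video_kbps} kbit/s")
```

Under these assumptions the video stream costs more at small entity counts but stays flat, while entity-update traffic grows linearly, which is why streaming looks attractive for "everyone on one server" scenarios.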

[Edited by - Sirisian on March 27, 2009 3:04:08 PM]
Looks like OnLive is trying to address the criticism, in case anyone is still interested. Link
OK, so the compression question was answered, but it still doesn't address the massive amount of computing required to run Crysis for every user they stream to, or the latency. It addressed only one of the many issues that were brought up.
Yeah, but those aren't unsolvable problems either. They can set up a network of computers in a cloud configuration and run multiple virtual OSes on them to increase efficiency. Crysis isn't really CPU-bound (except when lots of physics are occurring); the bottleneck is the GPU, and I believe they are working directly with Nvidia to handle that. If I had to guess, they have a custom GPU solution tailored specifically for their purposes, i.e. using custom device drivers they can emulate multiple GPUs on a single high-powered GPU solution (maybe custom hardware or a bank of the latest video cards). This allows a single GPU to service multiple applications, reducing the need for a 1:1 hardware configuration per user.

-ddn
Quote: Original post by ddn3
If I had to guess, they have a custom GPU solution tailored specifically for their purposes, i.e. using custom device drivers they can emulate multiple GPUs on a single high-powered GPU solution (maybe custom hardware or a bank of the latest video cards). This allows a single GPU to service multiple applications, reducing the need for a 1:1 hardware configuration per user.


A regular GPU can already do that. You've never started more than one GL or D3D app at the same time? That doesn't mean a GPU rendering two instances of Crysis won't be dividing its resources (at best) in half to handle each instance. I don't see how they can run demanding games without a 1:1 hardware-to-client ratio.
