OnLive
The biggest problem I see with this is the thin client. Any input you give has to travel to the server, be processed, and then be sent back. You'll never escape the latency between pressing a button on the controller and seeing the result on the screen; you'll always be playing a few dozen milliseconds behind your own inputs.
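As a rough illustration of that round trip, here's a minimal sketch in Python; every per-step figure below is a made-up placeholder for the sake of the example, not a measurement of OnLive:

```python
# Hypothetical breakdown of one input round trip on a streamed-game service.
# All numbers are illustrative placeholders, not measurements.

round_trip_steps_ms = {
    "controller input reaches client":  5,
    "client to server (network)":      20,
    "server simulates and renders":    17,  # roughly one frame of game time
    "server encodes the video frame":  10,
    "server to client (network)":      20,
    "client decodes and displays":     10,
}

total_ms = sum(round_trip_steps_ms.values())
print(f"Input-to-photon latency: {total_ms} ms")  # ~82 ms with these placeholders
```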
I think it's a joke. I can't believe anybody out there(*) is taking them seriously.
(*) Well, except for all those game journalists.
Quote: Original post by jpetrie
I think it's a joke. I can't believe anybody out there is taking them seriously.
You know mittens is.
Wow, it's a complete lie.
Based on my experiences with guitar amp programs and ASIO, I'd say 5ms latency is about the maximum tolerable for "real time" playing. The ping to my first router hop is about 30ms.
Quote: Original post by drakostar
Wow, it's a complete lie.
Based on my experiences with guitar amp programs and ASIO, I'd say 5ms latency is about the maximum tolerable for "real time" playing. The ping to my first router hop is about 30ms.
Given that the input lag gamers are used to is at least one frame, if not two, and that a frame in a 30fps game (which is not the rate OnLive is using, but has been used by gamers for years) is 33ms, a tolerable lag could be up to 60ms.
IMHO your comparison to guitar amp programs is flawed because the human ear is much more sensitive to lag within [most] music due to the hard enforcement of a tempo and time signature.
The servers could be hooked up as an ISP-based service, so it would only be a few short hops into your ISP to get to an OnLive server, rather than travelling over the actual Internet, making 60ms attainable.
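For what it's worth, a quick sketch of that frame-budget arithmetic (the 30fps frame time and the one-to-two frames of existing input lag are the assumptions from this post, not measurements):

```python
# Tolerable lag expressed in frames, using the figures assumed above.

frame_time_ms = 1000.0 / 30   # ~33 ms per frame at the old 30 fps console rate
existing_lag_frames = 2       # gamers already tolerate one or two frames of input lag

tolerable_lag_ms = existing_lag_frames * frame_time_ms
print(f"Tolerable lag: ~{tolerable_lag_ms:.0f} ms")  # ~67 ms, i.e. roughly the 60 ms quoted above
```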
It'll definitely be interesting to see how well it works. They seem pretty confident about it, so I'm sure they've tested it properly, given that it's not too far until release.
Quote: Original post by drakostar
Wow, it's a complete lie.
Based on my experiences with guitar amp programs and ASIO, I'd say 5ms latency is about the maximum tolerable for "real time" playing. The ping to my first router hop is about 30ms.
I completely agree with you in principle that internet latency will make all but the most mind-numbing RPGs unplayable with the inherent input lag on this system, but I thought I should point out that actual ASIO I/O latency varies from card to card and is more than the internal ASIO latency you can change in the sound card controls... 5ms of ASIO latency is the internal latency, and it's still stacked on top of roughly another 30+ ms that's built into the hardware.
Having said that, let's say you have exactly 30ms of hardware latency for argument's sake, and add another 5ms for the internal buffer. I'd imagine not a whole lot of people will have less than an 18ms ping to the servers, and there will be yet another layer of hardware latency on top of all of their claims. All in all, I'm betting this will be an epic failure.
cheers :)
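Stacking the latency sources that post lists gives something like the sketch below; every figure is an assumption taken from the post, not a benchmark:

```python
# Latency stack from the post above: internal buffer + hardware + network.
# These numbers are the post's assumptions, not measurements.

asio_internal_ms = 5     # internal ASIO-style buffer latency
hardware_ms = 30         # latency built into the hardware elements
best_case_ping_ms = 18   # optimistic round trip to an OnLive server

total_ms = asio_internal_ms + hardware_ms + best_case_ping_ms
print(f"Stacked latency before OnLive's own encode/decode: {total_ms} ms")  # 53 ms
```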
The claim is 60fps, so lag-free gaming would require sending input and getting the result back within a single 17ms frame. Not gonna happen. If they can install servers right at your ISP, they're looking at a reaction time of about 30-35ms, or two frames. Probably acceptable. A more realistic ping of about 50ms (e.g., my ping to Google) is three frames, and probably noticeably sluggish. So your input runs at 20fps while the game is 60fps.
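A small sketch of that frames-of-lag arithmetic (the 60fps claim and both ping figures are taken from this post; they're examples, not measurements of OnLive's network):

```python
# How many 60 fps frames a given round trip costs, using the pings cited above.

frame_time_ms = 1000.0 / 60   # ~16.7 ms per frame at the claimed 60 fps

for label, rtt_ms in [("servers installed at the ISP", 35),
                      ("more realistic ping (e.g. to Google)", 50)]:
    frames_of_lag = rtt_ms / frame_time_ms
    effective_input_rate = 1000.0 / rtt_ms
    print(f"{label}: {rtt_ms} ms is about {frames_of_lag:.1f} frames of lag, "
          f"input effectively ~{effective_input_rate:.0f} fps")
```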
OK, maybe they're not just selling magic pixie dust, but realistic network infrastructure precludes this from working very well.
Forget latency and bandwidth, what kind of ridiculous server cluster would you need to run 10,000 unique instances of a game like Crysis, including rendering and compressing the display?
Did I miss something here or is this a crazy idea? Unless we're talking casual, resource-friendly games...
Quote: Original post by outRider
Forget latency and bandwidth, what kind of ridiculous server cluster would you need to run 10,000 unique instances of a game like Crysis, including rendering and compressing the display?
I also don't see how this can run on "low-end" hardware without a good graphics card. Presumably, they're going to be using MPEG-4 to compress the video (MPEG-2 would be "possible", but it would have either huge bandwidth requirements or horrible compression artifacts), and a "normal" PC cannot decompress MPEG-4 in real time without the assistance of a graphics card. Anybody who's tried to play a Blu-ray disc on Linux (which only very recently got support for hardware-accelerated MPEG-4) will be able to tell you that you need a beefy CPU to decode HD MPEG-4 in real time.
And while we're on that subject, decoding MPEG-4 is computationally intensive, but compressing it is even more so. I don't see how you could do it on a server for 1,000 people simultaneously in real time.
Anyway, if they've managed to do it, then good on 'em. I'll believe it when I see it, however.
Quote: Original post by outRider
Unless we're talking casual, resource-friendly games...
In which case, what would be the point?
To be fair, I think they are claiming a proprietary compression scheme.