
don't get too comfortable


This. So much this. And considering how many developers are willing to drop to 30 for a "more cinematic experience", claiming "the human eye can't tell the difference" rather than sacrifice the SSAO that makes those static screenshots pop... it's going to be a while.


Yeah, 30fps is actually tolerable for some games on a big TV, but VR really needs the framerate.
I'm really hoping VR gets popular. If it does, it will generate more demand for high-framerate GPUs, and subsequently we might see the entire market shift towards higher-end GPUs.

The current console generation isn't going to come close to it, though.


The current console generation is more than capable of doing it - as was the previous generation. Framerate has little to do with the hardware's capabilities at this point.

It all comes down to whether the devs are willing to make the graphical and technical sacrifices to reach higher framerates, and as the last 10 years have shown us, no, they aren't.

Just for the heck of it, I tried one of those cardboard ones...

I can't see the keyboard to play!!!

I cannot remember the books I've read any more than the meals I have eaten; even so, they have made me.

~ Ralph Waldo Emerson



It all comes down to whether the devs are willing to make the graphical and technical sacrifices to reach higher framerates, and as the last 10 years have shown us, no, they aren't.

I don't see that as a problem. Right now you can sell a 30fps AAA game, because people will still buy it.

With VR, 30fps just isn't an option, because your players will literally be sick. 60fps probably isn't an option either. And so, in order to actually sell games, any AAA titles built for VR will hit the 75-90 fps required by the headsets.

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

It all comes down to whether the devs are willing to make the graphical and technical sacrifices to reach higher framerates, and as the last 10 years have shown us, no, they aren't.

I don't see that as a problem. Right now you can sell a 30fps AAA game, because people will still buy it.

With VR, 30fps just isn't an option, because your players will literally be sick. 60fps probably isn't an option either. And so, in order to actually sell games, any AAA titles built for VR will hit the 75-90 fps required by the headsets.


Except that kind of IS the problem.

Right now, a AAA dev is not going to make a VR game. They just aren't. There isn't the user base. The best you're going to get is a AAA dev modifying one of their existing games for VR. Which means you've got a game designed to run at 30 which now has to run at 75, while rendering every frame twice (once per eye).

So you're either going to crank down all the graphical details so your non-VR version looks the same as the VR version - but inferior to the next CoD. Or you're going to have two graphics modes, and people are going to wonder why the graphics are downgraded in VR only. (Remember - users don't understand why an open-world game can't look as good as a corridor game.)

Personally, I think the second scenario is more likely - especially if the dev is making a PC port, so they may have the graphics scaling already built in for that platform. But neither scenario is likely until we get a killer app for VR that pushes adoption.

The current console generation is more than capable of doing it - as was the previous generation. Framerate has little to do with the hardware's capabilities at this point.

It all comes down to whether the devs are willing to make the graphical and technical sacrifices to reach higher framerates, and as the last 10 years have shown us, no, they aren't.


That was my point. Saying "framerate has little to do with the hardware's capabilities" is patently false. Of course you can drop graphical fidelity, but that doesn't really mean anything. I'm pretty sure my TV has enough computing power to run Quake 1 at 60fps. But by current graphical standards, the current console generation is not up to the task of VR. Hell, most console titles aren't even 1080p.

I would be interested to see exactly what the difference would be if you were to scale the graphics options back until the PS4/Xbone can hit 60fps @ 1080p. Would it be a huge difference or only noticeable to graphics geeks?

But even then, your target isn't 60fps @ 1080p, it's 75fps @ 2160x1200, which is roughly 1.5 times the pixels per second. I rebuilt my PC last year to a pretty reasonable spec (i7, R9 290), and that just about managed a passable framerate in Elite on the DK2. I just don't see the consoles managing that.
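Quick back-of-the-envelope in Python, using only the resolutions and refresh rates quoted above (nothing measured):

```python
# Pixels per second: console 1080p60 vs. the VR target quoted above
# (2160x1200 @ 75 Hz). Figures come from the post, not from benchmarks.
console = 1920 * 1080 * 60   # ~124.4 million pixels/second
vr      = 2160 * 1200 * 75   # ~194.4 million pixels/second

print(f"console 1080p60:  {console / 1e6:.1f} Mpix/s")
print(f"VR 2160x1200@75:  {vr / 1e6:.1f} Mpix/s")
print(f"ratio: {vr / console:.2f}x")   # ~1.56x, i.e. "roughly 1.5 times"
```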

if you think programming is like sex, you probably haven't done much of either. - capn_midnight

The current console generation is more than capable of doing it - as was the previous generation. Framerate has little to do with the hardware's capabilities at this point.

It all comes down to whether the devs are willing to make the graphical and technical sacrifices to reach higher framerates, and as the last 10 years have shown us, no, they aren't.


That was my point. Saying "framerate has little to do with the hardware's capabilities" is patently false. Of course you can drop graphical fidelity, but that doesn't really mean anything. I'm pretty sure my TV has enough computing power to run Quake 1 at 60fps. But by current graphical standards, the current console generation is not up to the task of VR. Hell, most console titles aren't even 1080p.


But that's exactly my point. Nothing in the hardware prevents it from displaying an image 75 times a second. The only limitation is how quickly you can draw your image in the first place, which is mostly under the control of the developer. Sure, better hardware lets you draw more complex scenes faster, but you have control over how complex the scene is.
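Put concretely, hitting a framerate just means fitting all your draw work inside a fixed time budget. A throwaway sketch of those budgets (the per-eye caveat assumes stereo rendering roughly doubles scene cost, which varies by engine):

```python
# Frame-time budgets: to hold a given fps, all CPU+GPU work for a frame
# must fit inside 1000/fps milliseconds - every frame, no spikes.
for fps in (30, 60, 75, 90):
    print(f"{fps:>2} fps -> {1000 / fps:5.1f} ms per frame")

# 30 fps -> 33.3 ms  (typical console target)
# 75 fps -> 13.3 ms  (DK2) - and a VR title draws the scene once per eye,
# so the practical scene budget is tighter still.
```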

I would be interested to see exactly what the difference would be if you were to scale the graphics options back until the PS4/Xbone can hit 60fps @ 1080p. Would it be a huge difference or only noticeable to graphics geeks?

But even then, your target isn't 60fps @ 1080p, it's 75fps @ 2160x1200, which is roughly 1.5 times the pixels per second. I rebuilt my PC last year to a pretty reasonable spec (i7, R9 290), and that just about managed a passable framerate in Elite on the DK2. I just don't see the consoles managing that.


My guess would be that you could get 360/PS3-level graphics displaying at 75fps at 4k resolution on current hardware. Which is why they don't do it - because people didn't spend $400 on a new console to see the same "graphics" as last generation, even if it is at a higher framerate and resolution.

Consoles, again, can get away with it because you're sitting back from a TV, so downgraded resolution isn't as noticeable. Also, framerate lag is less noticeable as games are intentionally tuned slower to accommodate imprecise controller inputs and to allow for more natural animations. (Compare how slow CoD, Gears, and Halo are to older PC-only shooters like Quake and Unreal)

Those excuses/tricks won't work in VR though, as we've been pointing out :)


Those excuses/tricks won't work in VR though, as we've been pointing out :)

The framerate tricks won't, but I'm not sure I see why the resolution tricks won't?

We need a 4k display on the Rift to get rid of the screen-door effect, but there's nothing to say we have to render at 4k resolution. My guess would be that you could get away with rendering at 720p and upscaling for everything except HUD elements...
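The raw pixel math behind that suggestion, taking "4k" to mean 3840x2160 UHD (my assumption - nobody's quoted an exact panel spec):

```python
# Pixel counts for a 720p render shown on a 4k UHD panel.
render  = 1280 * 720     #   921,600 pixels actually shaded
display = 3840 * 2160    # 8,294,400 pixels on the panel

print(f"rendered:  {render:,}")
print(f"displayed: {display:,}")
print(f"upscale:   {display / render:.0f}x")   # 9x: each shaded pixel covers
                                               # roughly a 3x3 block on screen
```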

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

Some art styles also require fewer polygons, lower-res textures, and less complicated shader effects. We might see more Nintendo-like art styles for VR.

I always expected that the headset itself would include some kind of hardware upscaler, so that the game could just render at a lower resolution and fire and forget. Guess I was wrong.
Yes, higher resolution reduces the screen-door effect, but if you're upscaling 720p to 4k you're just going to have a blurry mess. It will look as bad as the current DK2 (which is 1080p IIRC), but instead of seeing black lines it'll look like you smeared it with Vaseline.

If you do need to upscale, I think it's more likely that the game will be rendered at a lower resolution, or using a progressive-scan solution, and the UI will be rendered separately at max resolution. We can usually tolerate blurriness in the artwork better than in text and HUD elements (depending on the art style of the game).

Edit: Derp - I just now saw your recommendation not to upscale the HUD. I still think 720p to 4k is too large of a jump to be viable.
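For illustration, here's what that split could look like as a minimal offline sketch, using Pillow as a stand-in for a real compositor. The filenames and resolutions are made up, and a real engine would do this on the GPU:

```python
from PIL import Image  # pip install pillow

# Sketch of "low-res scene, native-res HUD": upscale the scene render,
# then composite a full-resolution HUD layer (mostly transparent) on top.
NATIVE = (3840, 2160)  # assumed 4k UHD panel

scene = Image.open("scene_720p.png").convert("RGBA")  # 1280x720 scene render
hud   = Image.open("hud_4k.png").convert("RGBA")      # 3840x2160 HUD layer

scene_up = scene.resize(NATIVE, Image.BILINEAR)  # cheap upscale; art goes soft
frame = Image.alpha_composite(scene_up, hud)     # text and HUD stay crisp
frame.save("composited_frame.png")
```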

I agree with Servant - I don't think you can sacrifice frames or resolution, so you're going to see more "artsy" styles that are cheaper to render than going for photo-realistic rendering.

