
Square Enix new engine hype

Started October 14, 2011 02:30 PM
27 comments, last by Geri 13 years ago

There aren't that many unbiased renderers in existence, and to my knowledge no game engine uses one. o.O


I was under the impression that path tracing was a common method of pre-rendering static lighting for game environments. Even if it's not common, it's certainly not "impossible", since there are a fair number of unbiased renderers out there and it's by no means impossible to use precomputed data from such a renderer in a game engine.
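
For what it's worth, here's a rough sketch of the kind of bake being described: estimating the diffuse lighting stored in one lightmap texel with Monte Carlo sampling. This is purely illustrative and not anyone's actual pipeline; traceRadiance() is a dummy stand-in for a real path tracer's recursive estimate.

[code]
// Minimal sketch of baking diffuse lighting for one lightmap texel with
// Monte Carlo sampling. Illustration only; traceRadiance() is a fake
// stand-in for an actual unbiased renderer's trace.
#include <cmath>
#include <cstdio>
#include <random>

struct Vec3 { float x, y, z; };

static Vec3 add(Vec3 a, Vec3 b)  { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 scale(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

// Placeholder for the renderer's recursive trace: here it just returns a
// fake sky gradient so the sketch compiles on its own.
static Vec3 traceRadiance(Vec3 /*point*/, Vec3 dir)
{
    float sky = 0.5f + 0.5f * dir.z;   // brighter toward "up"
    return {0.6f * sky, 0.7f * sky, 1.0f * sky};
}

// Cosine-weighted direction on the hemisphere about +Z (a real baker would
// rotate this into the texel's tangent frame).
static Vec3 cosineSample(std::mt19937 &rng)
{
    std::uniform_real_distribution<float> u01(0.0f, 1.0f);
    float u = u01(rng), v = u01(rng);
    float r = std::sqrt(u), phi = 6.2831853f * v;
    return {r * std::cos(phi), r * std::sin(phi), std::sqrt(1.0f - u)};
}

// Estimate (1/pi) * integral of L * cos(theta) over the hemisphere. With
// cosine-weighted samples (pdf = cos/pi) that is just the average of the
// sampled radiance; the shader multiplies by albedo at runtime.
static Vec3 bakeTexel(Vec3 position, int sampleCount, std::mt19937 &rng)
{
    Vec3 sum = {0.0f, 0.0f, 0.0f};
    for (int i = 0; i < sampleCount; ++i)
        sum = add(sum, traceRadiance(position, cosineSample(rng)));
    return scale(sum, 1.0f / static_cast<float>(sampleCount));
}

int main()
{
    std::mt19937 rng(1234);
    Vec3 texel = bakeTexel({0.0f, 0.0f, 0.0f}, 1024, rng);
    std::printf("baked diffuse: %.3f %.3f %.3f\n", texel.x, texel.y, texel.z);
    return 0;
}
[/code]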
-~-The Cow of Darkness-~-
The new Battlefield looks *good*, but it hardly looks 'realistic' -- take a still from Battlefield and a real scene with similar content and see how easy it is to tell the difference. Take a scene from the Square engine (preferably one without that aliasing artifact) and put it next to a photo of the reference material. Now what?

Photo-realistic lighting has been around for a while, but not in real-time, and much less in game engines. Perhaps in that way the technique isn't 'revolutionary', but putting whatever techniques together in the way they have certainly has some fine-looking results. They haven't shared much about it, but real-time, dynamic GI -- or something approximating it with such good results -- would be a rather big achievement, especially if it's fast enough to handle large, highly detailed scenes.

Also, Square isn't really in the business of graphics technology -- they're not id or Epic, who have to keep showing new things off because it's their business -- and that fact alone, more than anything else, makes me think they're doing something novel with this; otherwise I can't think of a reason they'd bother showing it off.

throw table_exception("(╯°□°)╯︵ ┻━┻");

Their GI looks great, and I don't see why anyone would downplay that. Getting a realistic GI bake and coupling it with balanced materials is a lot harder than it looks.
I just got the uncanny valley feeling from that video. They must be doing something really good. :mellow:

Their GI looks great, and I don't see why anyone would downplay that. Getting a realistic GI bake and coupling it with balanced materials is a lot harder than it looks.


Yes, absolutely, but like I said earlier, that doesn't represent any new technology, just good artistic decisions.
-~-The Cow of Darkness-~-
Their GI looks great, and I don't see why anyone would downplay that. Getting a realistic GI bake and coupling it with balanced materials is a lot harder than it looks.

Yes, absolutely, but like I said earlier, that doesn't represent any new technology, just good artistic decisions.


Simply having good artistic direction isn't enough to get realistically balanced materials -- it requires R&D on the tech side too.

For example, if you're using Lambertian diffuse in your BRDF, your angular falloff is going to be completely wrong (something art cannot fix) due to, e.g., not taking microfacet inter-reflections into account, among other things. If you don't even know what BRDF you're using, then no matter how good your art is, you'll never be able to perfectly match photo reference for all viewing/lighting situations. Or if you're using Blinn-Phong for specular (or worse, plain Phong), then your highlights are probably just guesswork, which will never look quite right. Does your specular function take a microfacet normal distribution into account? Is your art based on real-world refractive-index/reflectance measurements? Are dielectrics and conductors shaded in the same way? Is your BRDF normalized and energy-conserving?

If these kinds of questions are unanswered in your renderer, then even with the best artists, you won't be able to reproduce the real-world physical interactions of light and material.
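
To make that concrete, here's a minimal sketch of the sort of thing the "normalized / energy-conserving" questions are getting at: a Blinn-Phong specular term with the approximate (n+8)/(8*pi) normalization and a Schlick Fresnel factor. It's only an illustration of the idea -- nobody outside Square knows what BRDF their engine actually uses.

[code]
// Rough sketch of a normalized Blinn-Phong specular with Schlick Fresnel.
// Illustration of the general idea only, not any particular engine's BRDF.
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  normalize(Vec3 v)
{
    float len = std::sqrt(dot(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

// Schlick's Fresnel approximation. f0 comes from measured reflectance:
// roughly 0.02-0.05 for dielectrics (e.g. ((1.5-1)/(1.5+1))^2 ~ 0.04 for
// IOR 1.5), much higher and coloured for conductors.
static float fresnelSchlick(float f0, float cosTheta)
{
    return f0 + (1.0f - f0) * std::pow(1.0f - cosTheta, 5.0f);
}

// Normalized Blinn-Phong specular BRDF term; the caller would still
// multiply by N.L and the light's intensity. The (n+8)/(8*pi) factor keeps
// the lobe's total energy roughly constant as the exponent changes.
static float specularBlinnPhong(Vec3 N, Vec3 V, Vec3 L, float exponent, float f0)
{
    Vec3 H = normalize({V.x + L.x, V.y + L.y, V.z + L.z});
    float NdotH = std::max(dot(N, H), 0.0f);
    float VdotH = std::max(dot(V, H), 0.0f);
    float norm  = (exponent + 8.0f) / (8.0f * 3.14159265f);
    return norm * std::pow(NdotH, exponent) * fresnelSchlick(f0, VdotH);
}

int main()
{
    Vec3 N = {0.0f, 0.0f, 1.0f};
    Vec3 L = normalize({0.3f, 0.0f, 1.0f});

    // Same surface point and light, seen from two camera angles: the
    // specular response changes with view direction.
    const float viewXs[] = {0.0f, 0.8f};
    for (float vx : viewXs) {
        Vec3 V = normalize({vx, 0.0f, 1.0f});
        std::printf("view x=%.1f  spec=%.4f\n", vx,
                    specularBlinnPhong(N, V, L, 64.0f, 0.04f));
    }
    return 0;
}
[/code]

The point of the normalization factor is that sharpening the exponent doesn't (approximately) change the total reflected energy, so artists don't have to re-tune highlight brightness every time they change glossiness.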
I'm not downplaying the amount of work that a renderer needs to do, and if you don't have a renderer that gives you the flexibility to represent a range of real materials then of course you'll be limited artistically. I am saying that there's no evidence that there's anything new here with respect to baked lighting, nor does there appear to be specular lighting of any sort whatsoever.
-~-The Cow of Darkness-~-
nor does there appear to be specular lighting of any sort whatsoever.
There's balanced specular on everything in their video, and it's what makes it look realistic. Pick any pixel on the wall in the video and track its motion - the colour will change.

There's balanced specular on everything in their video, and it's what makes it look realistic. Pick any pixel on the wall in the video and track its motion - the colour will change.


I'm not at all convinced that the video shows this; I tried your suggestion (admittedly fairly briefly) on the regions I could find with the sharpest angle changes, and I didn't find anything that I could definitively say wasn't caused by the video compression. Additionally, I would expect there to be some kind of simulated exposure adjustment, and you'd have to separate that from actual specular lighting as well.
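
To illustrate why exposure is a confound: a simple eye-adaptation style exposure (this is a generic Reinhard-key sketch, not anything we actually know about their demo) scales every pixel by the frame's average luminance, so the same wall point changes brightness as the camera moves even with zero specular.

[code]
// Tiny illustration of why auto-exposure can mimic "changing colours":
// a global exposure scale driven by average scene luminance brightens or
// darkens every pixel as the camera moves, with no specular involved.
// Purely hypothetical numbers.
#include <algorithm>
#include <cstdio>

// Map the frame's average luminance to a middle-grey key, then scale the
// pixel's luminance by the resulting exposure.
static float exposedLuminance(float pixelLum, float avgSceneLum)
{
    const float key = 0.18f;                          // target middle grey
    float exposure  = key / std::max(avgSceneLum, 1e-4f);
    return pixelLum * exposure;
}

int main()
{
    // The same wall pixel (luminance 0.5) in two frames, as the camera pans
    // from a dim corridor to a bright window-lit view.
    std::printf("dim frame:    %.3f\n", exposedLuminance(0.5f, 0.2f));
    std::printf("bright frame: %.3f\n", exposedLuminance(0.5f, 1.5f));
    return 0;
}
[/code]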

Maybe you could point out more specifically what you're seeing in the video (or reading about the video?) that demonstrates this?
-~-The Cow of Darkness-~-

I'm not at all convinced that the video shows this; I tried your suggestion (admittedly fairly briefly) on the regions I could find with the sharpest angle changes, and I didn't find anything that I could definitively say wasn't caused by the video compression. Additionally, I would expect there to be some kind of simulated exposure adjustment, and you'd have to separate that from actual specular lighting as well.


How did you not see it? If you watch just the right wall when there are angle changes you can see it pretty clearly.

This topic is closed to new replies.
