
Modular 3D Engine Idea Feasibility Question

Started by January 20, 2018 07:21 AM
5 comments, last by LorenzoGatti 6 years, 10 months ago

I'm wondering if I have this understood correctly:

A basic 3D game engine ONLY renders the models themselves, nothing else.

Models are fairly crude-looking by themselves.

Curved edges and other rounded appearances are accomplished through post-processing effects, such as anti-aliasing.

New model and image enhancing effects are being developed all the time, thus requiring new engines to be built from the ground up in order to support them.

Now comes my question.

What if the only thing the engine itself did was render those base models and geometry, and each of the post-processing effects came as a pluggable, separate "module"? Each time a new effect was invented, a new module could be developed to accommodate it, avoiding the need to code a completely new engine.

Is this idea feasible?

Please let me know what would or wouldn't work about this idea, as well as anything I misunderstood or left out.

An engine is MUCH more than just showing a 3D model. I like to think of engines as having a whole loop: input, update, and render. It's then up to you to put in the data. Model loading and displaying is one of the most basic things an engine can do. Many languages offer .obj loading with a single line of code. For example, JavaScript can do it using p5.js with a single loadModel() function. If you want to display it, it's just 14 lines of code.
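That input/update/render loop can be sketched in a few lines of Python. This is a toy stand-in, not a real engine: a real one polls OS events, steps physics and animation, and issues GPU draw calls, but the shape of the loop is the same.

```python
# Minimal sketch of the input -> update -> render loop an engine runs.
# "Rendering" here just records the state, to keep the example runnable.

def run_engine(frames, inputs):
    """Run a fixed number of frames; return what was 'rendered' each frame.

    inputs maps a frame number to a horizontal move for that frame.
    """
    position = 0
    rendered = []
    for frame in range(frames):
        # Input: read whatever the player did this frame.
        move = inputs.get(frame, 0)
        # Update: advance the game state.
        position += move
        # Render: draw the current state (here, just record it).
        rendered.append(f"frame {frame}: x={position}")
    return rendered

print(run_engine(3, {0: 1, 2: -1}))
```

The point is that rendering is only one third of the loop; the engine's job is keeping all three phases fed with data every frame.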

Often rendering something requires a sequence of events, and more than just "plugging in" something. However, both Unity and Blender use node editors for "plug in" effects. In Blender's case each node is written in Python, and new ones are developed all the time. (I don't use Unity, but I think those nodes are written in C#.)
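The node-based "plug in" idea boils down to composing a chain of effect functions. Here is a minimal Python sketch of that pattern; the effect names and the `EffectChain` class are made up for illustration, and the "image" is just a flat list of grayscale values.

```python
# Toy post-processing chain: each "node" is a function that takes an
# image and returns a new one. New effects plug in without touching
# the code that produced the image in the first place.

def invert(image):
    """Flip every grayscale value (0-255)."""
    return [255 - p for p in image]

def brighten(image, amount=10):
    """Raise every value, clamped to 255."""
    return [min(255, p + amount) for p in image]

class EffectChain:
    def __init__(self):
        self.effects = []

    def add(self, effect):
        # "Plugging in" a new module is just appending a function.
        self.effects.append(effect)
        return self

    def apply(self, image):
        for effect in self.effects:
            image = effect(image)
        return image

chain = EffectChain().add(invert).add(brighten)
print(chain.apply([0, 100, 250]))
```

Node editors like Blender's are essentially a graphical front end over this idea, with a graph instead of a simple list.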

I think what you are proposing is a thing engines already do.

 

 


Threads about engines are off-topic in Game Design. Moving.

-- Tom Sloper -- sloperama.com

You need to take a look at basic computer graphics on modern hardware. Then you will see that you always need to specify not only the 3D model but also a program running on your graphics chip that tells the GPU how those collections of vertices, texture coordinates, and normals are to be processed to produce the completed model.

Those programs are the shaders, which are compiled and translated into something the hardware can understand and work with. The language standards are GLSL in OpenGL, SPIR-V in Vulkan, and HLSL in DirectX. Those are then passed to a compile function that is part of the vendor-specific hardware driver.

What most engines do is come with a preset of those shaders for common purposes like diffuse rendering, per-pixel lighting, or alpha transparency. Anyone who is experienced (or, in the node-based editing case mentioned above, anyone at all) can then modify the render pipeline to achieve whatever effect he or she wants. Want a game in a sepia look? Just adjust the color channels. Want pixel graphics? Just use a rasterization shader.
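To make the sepia example concrete, here is the per-pixel math written out in Python. The coefficients below are the commonly used sepia weights; in an actual engine this would be a few lines in a fragment shader, not CPU code.

```python
def sepia(pixel):
    """Sepia tone: each output channel is a fixed weighted sum of RGB.

    The 0.393/0.349/0.272 weight matrix is the commonly used sepia
    transform; values are clamped to the 0-255 byte range.
    """
    r, g, b = pixel
    return (
        min(255, int(0.393 * r + 0.769 * g + 0.189 * b)),
        min(255, int(0.349 * r + 0.686 * g + 0.168 * b)),
        min(255, int(0.272 * r + 0.534 * g + 0.131 * b)),
    )

# A mid-gray pixel comes out warm and brownish.
print(sepia((100, 100, 100)))
```

Because the whole effect is one small function of a pixel, it slots naturally into a shader-based pipeline without any change to the engine itself, which is exactly the point being made above.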

Some custom game engines (like mine) make it possible to change the complete render pipeline graphically, so all you need is the editor tool; no plugins required.

On 1/19/2018 at 11:21 PM, dekronoth9 said:

 

New model and image enhancing effects are being developed all the time, thus requiring new engines to be built from the ground up in order to support them.

You don't need a brand-new engine just to support new rendering techniques. An engine is a lot more than just rendering 3D meshes, and generally you can add new shaders, effects, or whatever to an existing engine with little trouble.

Professional game engines are typically already written with a certain degree of modularity in mind, but you also have to keep in mind that a game engine has a lot of parts that all need to communicate and work with each other, and it has to perform as efficiently as possible. This means that there's always going to be a lot of coupling between systems.

Two more fundamental problems with "plugins" altering the appearance of what is rendered:

  • Duplication of work between normal and fancy rendering
  • Additional data required by fancy rendering that is simply missing from the basic model data.
    For example, smoothing a model given as triangles requires knowing which mesh edges are smooth and which ones are creased
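The smoothing point deserves a small illustration. Smooth shading averages the normals of the faces that meet at a vertex, but only across edges marked as smooth; a bare triangle list carries no such marks, so a bolt-on "smoothing plugin" cannot tell a rounded surface from a deliberate hard corner. A minimal Python sketch of that dependency (the `smooth_flags` input is exactly the extra data the bullet point says is missing):

```python
# Smooth shading needs per-edge/per-face crease data that a raw
# triangle list does not contain. Here that data is smooth_flags.

def vertex_normal(face_normals, smooth_flags):
    """Average the normals of the faces flagged smooth around one vertex.

    face_normals: list of (x, y, z) unit normals of the adjacent faces.
    smooth_flags: parallel list; False means a crease isolates that face.
    """
    chosen = [n for n, s in zip(face_normals, smooth_flags) if s]
    if not chosen:                       # all creased: keep the first face's normal
        chosen = face_normals[:1]
    sx = sum(n[0] for n in chosen)
    sy = sum(n[1] for n in chosen)
    sz = sum(n[2] for n in chosen)
    length = (sx * sx + sy * sy + sz * sz) ** 0.5
    return (sx / length, sy / length, sz / length)

# Two faces meeting at 90 degrees: smoothing both rounds the corner...
print(vertex_normal([(0, 0, 1), (1, 0, 0)], [True, True]))
# ...while a crease flag keeps the edge hard.
print(vertex_normal([(0, 0, 1), (1, 0, 0)], [True, False]))
```

Without the flags, the two calls above would be indistinguishable, which is why the effect cannot live purely in a post-processing module.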

Omae Wa Mou Shindeiru

This topic is closed to new replies.
