Hello,
As you guessed, a material will not always use a single value for a material component (specular, roughness, etc.), nor will it always use a texture. The diffuse won't always use only one texture either; sometimes you'll want a complex combination of multiple textures with custom shader code. Think of a water material, for example, where several normal maps translate over each other.
In my project I handled it the same way UE4 handles its materials, except that I have no node graph editor; it is entirely code based (just like you did with XML so far). So first of all, I define the parameters the material exposes to the engine, similar to this:
<Material name="BasicPBS" type="0">
    <Parameters>
        <Parameter name="myDiffuseTexture1" type="Texture2D" ... />
        <Parameter name="myDiffuseTexture2" type="Texture2D" ... />
        <Parameter name="someConstantBuffer" type="float" ... />
    </Parameters>
</Material>
The engine then knows what every material expects as input, which makes things much easier (especially in DX12, where you have to create root signatures that match your shader inputs). With this information it is also easy to dynamically generate the proper shader code for the input constant buffers when compiling your shaders.
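To give an idea of what that generation step could look like, here is a minimal C++ sketch. The struct, function, and buffer names are all hypothetical, not the actual engine code: it just turns a parameter list like the XML above into texture declarations and a constant buffer.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical in-memory mirror of one <Parameter> tag.
struct MaterialParameter
{
    std::string name;
    std::string type; // "Texture2D", "float", ...
};

// Generates the HLSL input declarations for a material's parameter list.
// Texture parameters become resource declarations; scalar parameters are
// gathered into a single constant buffer.
std::string GenerateShaderInputs(const std::vector<MaterialParameter>& params)
{
    std::string textures;
    std::string cbufferMembers;
    for (const auto& p : params)
    {
        if (p.type == "Texture2D")
            textures += "Texture2D " + p.name + ";\n";
        else
            cbufferMembers += "    " + p.type + " " + p.name + ";\n";
    }

    std::string result = textures;
    if (!cbufferMembers.empty())
        result += "cbuffer MaterialConstants\n{\n" + cbufferMembers + "};\n";
    return result;
}
```

In a real engine you would of course also emit register bindings, but the idea is the same: the XML is the single source of truth for both the API side and the shader side.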
struct VertexOutput
{
#include <VERTEX_OUTPUT_LAYOUT>
};
In my case it is a simple #include that the engine replaces at compile time, based on the parameter information in the XML file.
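The replacement itself can be as simple as a textual find-and-replace over the shader source before it is handed to the compiler. A minimal sketch (the helper and placeholder names are just illustrative):

```cpp
#include <cassert>
#include <string>

// Replaces every occurrence of a placeholder token in the shader source
// with generated code, before the source is passed to the shader compiler.
std::string ReplacePlaceholder(std::string source,
                               const std::string& placeholder,
                               const std::string& generated)
{
    for (std::size_t pos = source.find(placeholder); pos != std::string::npos;
         pos = source.find(placeholder, pos + generated.size()))
    {
        source.replace(pos, placeholder.size(), generated);
    }
    return source;
}
```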
Obviously all materials have one thing in common: they all need to be aware of the camera projection and a few other things. Those constant buffers are assumed by the engine, so it's not necessary to specify them in the material code. You are free to add as much information about your parameters as you want; this is just a basic example.
Here's the important part: the way I define the material components is a bit different from yours. Instead of only having the option to specify a texture or a single value for the diffuse/normal/specular/roughness, each component is shader code. UE4 does this too, except through its node graph editor; you can do any shader operation within your materials. So my diffuse tag could, for example, look like this:
<diffuse>
    float someVar = whatever;
    ...
    some more hlsl code
    ...
    output.Albedo = myDiffuseTexture1.Sample(someSampler, input.TextureCoordinates).rgb
                  * myDiffuseTexture2.Sample(someSampler, input.TextureCoordinates).rgb;
</diffuse>
You are free to create your own intermediate language if you want this to be compatible with both HLSL and GLSL, but that is a lot of work. Another option is to provide both versions of the code, but that is harder to maintain.
So at this point you would have everything you need to deal with it in your engine. Now when you compile your material into a pixel shader, it's easy, for example, to have a "template" shader into which the engine inserts the shader code contained in your XML at the right place. Here's what said template could look like:
//--------------------------------------------------------------------
// Defines the material structure.
//--------------------------------------------------------------------
struct MaterialData
{
    float3 Albedo;
    float  Specular;
    float3 Normal;
    float  Roughness;
    float3 Emissive;
    float  OpacityMask;
};
//--------------------------------------------------------------------
// Entry point of the pixel shader.
//--------------------------------------------------------------------
PixelOutput PS(VertexOutput input)
{
    PixelOutput PIXEL_OUTPUT;
    MaterialData output;

    // Generated code.
#include <PIXEL_BODY>

    // Check for opacity and discard the pixel if needed.
    if (output.OpacityMask < 1.0f)
    {
        discard;
    }

    // Assign the render targets.
    COLOR_RENDER_TARGET = float4(output.Albedo, 1.0f);
    NORMAL_RENDER_TARGET = output.Normal;

    return PIXEL_OUTPUT;
}
In that template, the #include <PIXEL_BODY> is replaced by the shader code of the material components.
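Stitching it all together is then just a string operation on the template. Here is a hedged C++ sketch of that last step, assuming the component code from tags like <diffuse> has already been read into strings (all names are illustrative, and the template is shortened):

```cpp
#include <cassert>
#include <map>
#include <string>

// HLSL snippets taken from the material's component tags (<diffuse>,
// <normal>, ...), keyed by tag name.
// Concatenates them and splices the result into the pixel shader template
// in place of the PIXEL_BODY placeholder.
std::string BuildPixelShader(
    const std::string& shaderTemplate,
    const std::map<std::string, std::string>& components)
{
    // Concatenate the per-component code blocks into one body,
    // with a comment marking where each tag's code came from.
    std::string body;
    for (const auto& component : components)
        body += "    // <" + component.first + ">\n" + component.second + "\n";

    const std::string marker = "#include <PIXEL_BODY>";
    std::string result = shaderTemplate;
    const std::size_t pos = result.find(marker);
    if (pos != std::string::npos)
        result.replace(pos, marker.size(), body);
    return result;
}
```

The resulting string is what actually gets handed to the shader compiler, so a broken material shows up as an ordinary compile error you can log with the material's name.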
And there we go, our materials can now be anything we want. Obviously we have to respect a convention in our XML file, for example that we write to a variable named "output" which has a member called "Albedo". But this is the basic idea. I think this concept is very widely used and you can extend it as much as you want. For example, I omitted the part where the render targets are entirely customizable as well. My materials are also not only pixel shader based; you are allowed to entirely modify what happens in the vertex shader part too.
If you have more questions, don't hesitate to ask; I omitted a lot of details because it's hard to cover everything without writing an entire book about it.