After several design iterations and some prototypes, this is the approach I am currently taking to export Maya shading networks to OSG.
The shading network will be replicated in the C++ classes described in the following diagram.
... TO-DO...
The basic idea is that each ShadingNode object is able to generate the GLSL code needed for its work, so the implementation of new features or the export of new Maya shading nodes is transparent to the rest of the system.
ShadingNodeFactory is the class that builds the ShadingNode objects from the original Maya shading nodes. New shading nodes are currently registered in the factory by hand. This should be replaced by a scheme where newly implemented nodes register themselves automatically, so there is no need to change the factory code.
The contribution of each light source is computed and all of them are added to the final lighting. This computation depends on the kind of light source: directional, point or spot light.
Also, the use of bump mapping changes the way lighting computations are done, as operations are performed in tangent space (aka texture space) instead of view space.
Moreover, the vectors involved in lighting computations can be computed in the fragment shader (more accurate, but more expensive) or in the vertex shader and then interpolated across fragments (the cheap way). The difference in visual quality has yet to be analysed (the never-ending TO-DO list :).
The vector parameters used in the lighting computations are:
* Normal vector
* Light vector
* Viewer vector
* Half vector (between light and viewer)
These vectors will be in eye space when not computing bump mapping and in tangent (texture) space when computing bump mapping.
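To illustrate the non-bump-mapped case, here is a minimal sketch of how these vectors can be prepared in eye space; all names are placeholders and do not match the identifiers actually generated by Maya2OSG:

```
// Vertex shader (sketch): compute eye-space vectors and pass them as varyings.
varying vec3 ecNormal;   // normal vector (eye space)
varying vec3 ecView;     // viewer vector (eye space)

void main()
{
    vec4 ecPosition = gl_ModelViewMatrix * gl_Vertex;
    ecNormal = gl_NormalMatrix * gl_Normal;
    ecView   = -ecPosition.xyz;       // from the vertex towards the viewer
    gl_Position = ftransform();
}

// Fragment shader (sketch): renormalize the interpolated vectors before use.
// Computing the vectors here instead of in the vertex shader is the more
// accurate (and more expensive) alternative mentioned above.
varying vec3 ecNormal;
varying vec3 ecView;

void main()
{
    vec3 N = normalize(ecNormal);
    vec3 V = normalize(ecView);
    vec3 L = normalize(gl_LightSource[0].position.xyz);   // directional light
    vec3 H = normalize(L + V);                             // half vector
    float diffuse  = max(dot(N, L), 0.0);
    float specular = pow(max(dot(N, H), 0.0), gl_FrontMaterial.shininess);
    gl_FragColor = vec4(vec3(diffuse + specular), 1.0);
}
```

In the bump-mapped case, the light and viewer vectors would additionally be transformed into tangent space in the vertex shader, using the per-vertex tangent, binormal and normal.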
GLSL variable names and types:
... TO-DO
Bump maps in Maya are height maps. To be used in the GLSL shader, each one is converted to a normal map (aka DOT3 normal map), a color texture where the RGB values store the XYZ coordinates of the normal vector for each texel. The normal map texture is named like the original bump map file texture, with the suffix _nmap. There are two ways of converting bump maps to normal maps: using the built-in code or using the NVIDIA Texture Tools (when available). The former generates a .png file and the latter a .dds file.
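Once the normal map exists, using it in the fragment shader is essentially a scale and bias of the sampled color. A minimal sketch (the sampler name is hypothetical, and the code actually generated by Maya2OSG may differ):

```
uniform sampler2D normalMapSampler;   // hypothetical name for the *_nmap texture

void main()
{
    // RGB values in [0,1] encode the XYZ components of the normal in [-1,1]
    vec3 n = normalize(texture2D(normalMapSampler, gl_TexCoord[0].st).rgb * 2.0 - 1.0);
    // n is the tangent-space normal to be used in the lighting equations;
    // here it is just visualized to keep the sketch self-contained.
    gl_FragColor = vec4(n * 0.5 + 0.5, 1.0);
}
```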
The existence of the normal map file is checked when exporting; it is only generated when the file does not exist, or when it exists but its modification date is older than that of the bump map file.
This code is under development and there are known bugs that are being addressed. To put it in other words: use it at your own risk!
WARNING: What happens when the same Maya fileTexture node is connected to both the bump and the color channels? Check it, and create a test model for this case.
TO-DO: The built-in conversion from bump map to normal map has a "hardwired bump scale". We need to determine the right scale to match the Maya bump behavior. This built-in method also has some aliasing problems. The NVIDIA Texture Tools generate DDS files that seem to have the Y-axis inverted (top-left origin). However, these DDS files do not show a much smoother normal map without the aliasing problems mentioned above.
In order to avoid name clashing, I am following these criteria:
* Texture samplers are prefixed with tex_ and followed by the Maya DG node name.
* Variables are prefixed with sn_ and followed by a token identifying the ShadingNode type that creates them, then the name of the node and then the variable name inside the node.
As all the GLSL code will be automatically generated by Maya2OSG, there should be no name clashing problems, but it is critical to keep these criteria in mind when implementing new shading nodes or modifying the implementation of existing ones.
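As a purely illustrative example of these criteria (the node names are made up), a file texture node named myFileTexture and a lambert node named myLambert could end up producing declarations such as:

```
// Texture sampler for the Maya DG node "myFileTexture"
uniform sampler2D tex_myFileTexture;

// Variable created by the lambert ShadingNode "myLambert":
// prefix + node type token + node name + variable name
vec4 sn_lambert_myLambert_outColor;
```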
To add user customizations to the shaders, there are four parameters that take file paths to GLSL code to be appended at the end of the vertex shader declarations (to include uniforms, varyings, functions, etc.), at the end of the vertex main() function (to compute varying output values), and the same two places in the fragment shader.
The custom code is added after the regular shader code, so variables computed by the regular code can be overridden by the custom shader code.
This feature makes it easy to use multiple render targets. In the fragment shader, the regular code computes the gl_FragData[0] output and user custom code can compute the other outputs (gl_FragData[1], ...).
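For instance, a user-supplied snippet appended at the end of the fragment main() function could fill a second render target. A minimal sketch (ecNormal stands for a hypothetical eye-space normal varying, like the one in the lighting sketch above):

```
// Custom code appended after the regular fragment code, which has already
// written gl_FragData[0]; the custom code only fills the extra target.
gl_FragData[1] = vec4(normalize(ecNormal) * 0.5 + 0.5, 1.0);
```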
Maya2OSG can be used in two ways: to export a whole scene to be used "as is" or to export models that will be used as individual pieces that compose the whole scene.
There is one important difference between these two uses: some global uniform variables must be set for the scene. These variables should be included when exporting the whole scene, but not when exporting individual pieces.
The most obvious example is the number of active lights in the scene, which is required by the GLSL shaders but should not be included more than once in the scene; or at least it should be set outside the individual models, depending on the lighting state at each moment.
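As a sketch of this dependency (the uniform name is hypothetical, not necessarily the one used by Maya2OSG), the shaders need a scene-wide light count to drive the accumulation loop, and that value has to be set once, outside the individual models:

```
// Scene-wide uniform: set once at the scene level (e.g. on the root node),
// not repeated inside every exported model.
uniform int numEnabledLights;

vec3 accumulateLights(vec3 N, vec3 ecPosition)
{
    vec3 color = vec3(0.0);
    for (int i = 0; i < numEnabledLights; i++)
    {
        vec3 L;
        if (gl_LightSource[i].position.w == 0.0)
            L = normalize(gl_LightSource[i].position.xyz);              // directional
        else
            L = normalize(gl_LightSource[i].position.xyz - ecPosition); // point / spot
        // (spot lights would additionally be attenuated using spotDirection,
        //  spotCutoff, etc.)
        color += gl_LightSource[i].diffuse.rgb * max(dot(N, L), 0.0);
    }
    return color;
}
```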