Post-processing effects

In this tutorial we will use code from section 1.5 and information from section 1.7, so it is highly recommended to read them before going any further.

I introduced post-processing materials in section 1.7 of this tutorial. However, that description was very brief, whereas post-processing should be treated as an important sub-system of the material system and therefore deserves additional explanation. This section provides it.

Post-processing materials, in contrast to regular materials, do not apply to single objects but to the scene as a whole after it has been rendered. That way all pixels of the rendered scene can be modified. On the current generation of GPUs there is no way to do this other than in image space. That is, the scene has to be rendered (typically to a render-target texture) and only after that can post-processing be applied. In this respect real-time post-processing is similar to post-processing movies or even photos! If you're familiar with applications like Photoshop or GIMP, some of the effects may already be known to you. With post-processing you can perform water-colour filtering, edge detection, image blurring and bloom, among others.

nGENE Tech makes heavy use of post-processing effects. First of all, by default it uses deferred shading, which can itself be thought of as an image-processing technique. Furthermore, effects such as water and clouds are also done completely or partially in a post-process stage!

What you basically have to understand is that post-processing gives you virtually unlimited possibilities, and yet it is often a more efficient way of applying complex effects than doing them on a per-object basis.

A typical post-processing material definition in a material library .xml file looks as follows:

<material name="grading_high_contrast" transparent="false" order="post">
    <technique name="main">
        <pass name="main">
            <texture name="BackBuffer" sampler="0"/>
            <texture name="hi_contrast" file="grading_hi_contrast.png" sampler="1" addressing_mode="ADDRESS_CLAMP"></texture>
            <vertex_shader file="diffuse.vsh">
                <constant name="matWorldViewProj" semantic="WorldViewProj"></constant>
            </vertex_shader>
            <pixel_shader file="colour_grade.psh" profile="ps_2_0">
            </pixel_shader>
        </pass>
    </technique>
</material>

This material enhances the contrast of the scene. There is one main difference from 'regular' materials: note the value of the material's order attribute. It is set to 'post'. Its default value is 'scene', which is used by the majority of nGENE materials. Setting it to 'post' notifies nGENE that it has to treat this material a bit differently: instead of being applied to a single object (as 'scene' implies), it will be applied to the whole scene.

Also worth noting is that we pass the current frame buffer contents to this material so we can perform operations on them. This texture is named "BackBuffer" and you will see it in a number of places in nGENE material libraries.

In this tutorial we will apply an inverse material to our scene. This effect inverts colours (1.0 - value for each of the components), i.e. black becomes white, white becomes black, etc.
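Conceptually, the pixel shader of the inverse material boils down to a single per-channel subtraction. The snippet below is only a plain C++ illustration of that maths, not the actual shader code:

// Same maths the inverse post-process performs for every rendered pixel:
struct Colour { float r, g, b; };

Colour invert(const Colour& c)
{
    return Colour { 1.0f - c.r, 1.0f - c.g, 1.0f - c.b };
}

// For example, pure red (1, 0, 0) becomes cyan (0, 1, 1),
// and mid grey (0.5, 0.5, 0.5) stays mid grey.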

Provided you have a scene set up and running, enhancing it by means of post-processing is a straightforward task which can be described by these two steps:

  1. Create a post-process material and set its properties
  2. Add the effect to the PostProcessStage of the Renderer

First add this line to the App.h file:

SPriorityMaterial postEffect;

I'll cover it in a second.

Also add the lines below to the App.cpp file:

// Fetch the 'inverse' material from the default material library
MaterialLibrary* pMatLib = MaterialManager::getSingleton().getLibrary(L"default");
Material* pMaterial = pMatLib->getMaterial(L"inverse");
 
// Wrap it in the helper SPriorityMaterial object and enable it
postEffect.priority = 0;
postEffect.material = pMaterial;
postEffect.material->setEnabled(true);
 
// Register the effect with the renderer's post-processing stage
PostStage* postStage = ((PostStage*)Renderer::getSingleton().getRenderStage(L"PostProcess"));
postStage->addToRender(&postEffect);

In nGENE all materials have to be assigned to something. As post-processing materials do not apply to single objects, a helper (dummy) SPriorityMaterial object is needed to which the material can be bound (via its material property, which is a pointer to an existing Material object). Besides, by using this object it is possible to control the order in which the effects are applied. In our case it doesn't make any difference because we are using a single post-processing material. However, think about it for a second. Applying a contrast-enhancement filter first and then a bloom filter will often give different results from applying bloom first and then adjusting the contrast of the scene. You can control the order by choosing appropriate priority values, where materials with lower values are applied first. So by setting it to 0 our material will be applied first. An important thing to note is that if several materials share the same priority, the order in which they are applied is undefined.
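To illustrate, here is a minimal sketch of chaining two effects with different priorities. It assumes the SPriorityMaterial objects are declared in App.h (just like postEffect above, so the pointers handed to the renderer remain valid) and that your material library contains a material named "bloom" — that name is only an example, so substitute whatever post-processing materials your library actually provides:

// In App.h (these must outlive the rendering, just like postEffect):
SPriorityMaterial inverseEffect;
SPriorityMaterial bloomEffect;

// In App.cpp:
MaterialLibrary* pMatLib = MaterialManager::getSingleton().getLibrary(L"default");

// Lower priority value - applied first
inverseEffect.priority = 0;
inverseEffect.material = pMatLib->getMaterial(L"inverse");
inverseEffect.material->setEnabled(true);

// Higher priority value - applied second ("bloom" is an example name)
bloomEffect.priority = 1;
bloomEffect.material = pMatLib->getMaterial(L"bloom");
bloomEffect.material->setEnabled(true);

PostStage* postStage = ((PostStage*)Renderer::getSingleton().getRenderStage(L"PostProcess"));
postStage->addToRender(&inverseEffect);
postStage->addToRender(&bloomEffect);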

Adding the effect to the renderer is then simple: just obtain the post-processing stage and call its addToRender() method, passing a pointer to our newly created SPriorityMaterial object.

The results for this tutorial are presented below:

Tutorial_2_3_1.png

It makes a huge difference, doesn't it?
