Ogre 3D: Fixed Function Pipeline and Shaders

by Felix Kerger | November 2010

This article, by Felix Kerger, author of OGRE 3D 1.7 Beginner's Guide, continues the previous article, Materials with Ogre 3D, in which we learned the details of working with materials in Ogre 3D.

In this article we will cover:

  • Render Pipeline
  • Writing a shader
  • Interpolating color values
  • Replacing the quad with a model
  • Making the model pulse on the x-axis


Introduction

The Fixed Function Pipeline is the rendering pipeline on the graphics card that produces those nice shiny pictures we love looking at. As the prefix Fixed suggests, it doesn't give the developer much freedom to manipulate it; we can tweak some parameters using the material files, but nothing fancy. That's where shaders can help fill the gap. Shaders are small programs that can be loaded onto the graphics card and then function as a part of the rendering process. They can be thought of as little programs written in a C-like language with a small, but powerful, set of functions. With shaders, we can almost completely control how our scene is rendered and also add a lot of new effects that weren't possible with the Fixed Function Pipeline alone.

Render Pipeline

To understand shaders, we first need to understand how the rendering process works as a whole. When rendering, each vertex of our model is translated from local space into camera space, and then each triangle gets rasterized. This means that the graphics card calculates how to represent the model in an image; these image parts are called fragments. Each fragment is then processed and manipulated: we could apply a specific part of a texture to a fragment to texture our model, or we could simply assign it a color when rendering a model in only one color. After this processing, the graphics card tests whether the fragment is covered by another fragment that is nearer to the camera, or whether it is the fragment nearest to the camera; only the nearest fragment gets displayed on the screen. On newer hardware, this test can occur before the processing of the fragment, which can save a lot of computation time if most of the fragments won't be seen in the end result. Very much simplified, the pipeline looks like this: vertex transformation, rasterization into fragments, fragment processing, depth test, screen.

With almost every new graphics card generation, new shader types were introduced. It began with vertex and pixel/fragment shaders. The task of the vertex shader is to transform the vertices into camera space and, if needed, modify them in any way, for instance when doing animations completely on the GPU. The pixel/fragment shader gets the rasterized fragments and can apply a texture to them or manipulate them in other ways, for example, to compute lighting with per-pixel accuracy.

Time for action – our first shader application

Let's write our first vertex and fragment shaders:

  1. In our application, we only need to change the used material. Change it to MyMaterial13. Also remove the second quad:

    manual->begin("MyMaterial13", RenderOperation::OT_TRIANGLE_LIST);

  2. Now we need to create this material in our material file. First, we are going to define the fragment shader. Ogre 3D needs five pieces of information about the shader:
    • The name of the shader
    • In which language it is written
    • In which source file it is stored
    • What the main function of this shader is called
    • For which profile we want the shader to be compiled
  3. All this information should be in the material file:

    fragment_program MyFragmentShader1 cg
    {
        source Ogre3DBeginnersGuideShaders.cg
        entry_point MyFragmentShader1
        profiles ps_1_1 arbfp1
    }

  4. The vertex shader needs the same pieces of information, but we also have to define a parameter that is passed from Ogre 3D to our shader. It contains the matrix that we will use for transforming our quad into camera space:

    vertex_program MyVertexShader1 cg
    {
        source Ogre3DBeginnersGuideShaders.cg
        entry_point MyVertexShader1
        profiles vs_1_1 arbvp1

        default_params
        {
            param_named_auto worldViewMatrix worldviewproj_matrix
        }
    }

  5. The material itself just uses the vertex and fragment shader names to reference them:

    material MyMaterial13
    {
        technique
        {
            pass
            {
                vertex_program_ref MyVertexShader1
                {
                }

                fragment_program_ref MyFragmentShader1
                {
                }
            }
        }
    }

  6. Now we need to write the shader itself. Create a file named Ogre3DBeginnersGuideShaders.cg in the media\materials\programs folder of your Ogre 3D SDK.
  7. Each shader looks like a function. One difference is that we can use the out keyword to mark a parameter as an outgoing parameter instead of the default incoming parameter. The out parameters are used by the rendering pipeline for the next rendering step. The out parameters of a vertex shader are processed and then passed into the pixel shader as an in parameter. The out parameter from a pixel shader is used to create the final render result. Remember to use the correct name for the function; otherwise, Ogre 3D won't find it. Let's begin with the fragment shader because it's easier:

    void MyFragmentShader1(out float4 color : COLOR)

  8. The fragment shader will return the color blue for every pixel we render:

    {
        color = float4(0,0,1,0);
    }

  9. That's all for the fragment shader; now we come to the vertex shader. The vertex shader has three parameters—the position of the vertex, the translated position of the vertex as an out variable, and a uniform variable for the matrix we are using for the translation:

    void MyVertexShader1(
        float4 position : POSITION,
        out float4 oPosition : POSITION,
        uniform float4x4 worldViewMatrix)

  10. Inside the shader, we use the matrix and the incoming position to calculate the outgoing position:

    {
        oPosition = mul(worldViewMatrix, position);
    }

  11. Compile and run the application. You should see our quad, this time rendered in blue.

What just happened?

Quite a lot happened here; we will start with step 2, where we defined the fragment shader we are going to use. Ogre 3D needs five pieces of information for a shader. We define a fragment shader with the keyword fragment_program, followed by the name we want the fragment program to have, then a space, and at the end, the language in which the shader will be written. In the early days, shader code had to be written in assembly because there was no other language available; but, as with general-purpose programming, high-level languages soon arrived to ease the pain of writing shader code. At the moment, there are three different languages that shaders can be written in: HLSL, GLSL, and CG. HLSL is used by DirectX and GLSL is the language used by OpenGL, so shaders written in HLSL can only be used with DirectX, and GLSL shaders only with OpenGL. CG was developed by NVidia in cooperation with Microsoft and is the language we are going to use; it is compiled during the startup of our application into the respective assembly code. Because CG can compile to both DirectX and OpenGL shader assembly, it lets us stay truly cross-platform. That's two of the five pieces of information that Ogre 3D needs. The other three are given inside the curly brackets. The syntax is like a property file—first the key and then the value. One key we use is source, followed by the file where the shader is stored. We don't need to give the full path; the filename will do, because Ogre 3D scans our resource directories and only needs the filename to find the file.

Another key we are using is entry_point, followed by the name of the function we are going to use for the shader. In the code file, we created a function called MyFragmentShader1, and we give Ogre 3D this name as the entry point for our fragment shader. This means that each time the fragment shader is needed, this function is called. The function has only one parameter, out float4 color : COLOR. The prefix out signals that this parameter is an out parameter, meaning we will write a value into it, which will be used by the render pipeline later on. The type of this parameter is float4, which is simply an array of four float values. For colors, we can think of it as a tuple (r,g,b,a), where r stands for red, g for green, b for blue, and a for alpha: the typical tuple for describing colors. After the name of the parameter comes : COLOR. In CG, this is called a semantic; it describes what the parameter is used for in the context of the render pipeline. The semantic : COLOR tells the render pipeline that this is a color. In combination with the out keyword and the fact that this is a fragment shader, the render pipeline can deduce that this is the color we want our fragment to have.
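
Since float4 is just four floats with component access, colors can be built and modified freely. Here is a small sketch (not from the book) of a variant fragment shader illustrating this; the .rgb swizzle addresses only the first three components:

    // A sketch, not from the book: a variant fragment shader that
    // builds an opaque red and then darkens only the color part.
    void MyRedFragmentShader(out float4 color : COLOR)
    {
        float4 red = float4(1, 0, 0, 1); // r=1, g=0, b=0, a=1 (opaque)
        red.rgb *= 0.5;                  // swizzle: touch only r, g, and b
        color = red;
    }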

The last piece of information we supply uses the keyword profiles with the values ps_1_1 and arbfp1. To understand this, we need to talk a bit about the history of shaders. With each generation of graphics cards, a new generation of shaders has been introduced. What started as a fairly simple C-like programming language without even IF conditions has become a family of really complex and powerful programming languages. Right now, there are several different shader versions, each with a unique function set, and Ogre 3D needs to know which of these versions we want to use. ps_1_1 means pixel shader version 1.1, and arbfp1 means ARB fragment program version 1. We need both profiles because ps_1_1 is a DirectX-specific function set and arbfp1 is the corresponding function set for OpenGL. Although we say we are cross-platform, sometimes we need to define values for both platforms. All profiles can be found at http://www.ogre3d.org/docs/manual/manual_18.html. That's all that is needed to define the fragment shader in our material file. In step 4, we defined our vertex shader. This definition is very similar to the fragment shader definition code; the main difference is the default_params block. This block defines parameters that are given to the shader during runtime. param_named_auto defines a parameter that is automatically passed to the shader by Ogre 3D. After this key, we need to give the parameter a name, and after that, the keyword for the value we want it to have. We name the parameter worldViewMatrix; any other name would also work. The value we want it to have has the key worldviewproj_matrix, which tells Ogre 3D we want our parameter to have the value of the WorldViewProjection matrix. This matrix is used for transforming vertices from local into camera space. A list of all keyword values can be found at http://www.ogre3d.org/docs/manual/manual_23.html#SEC128. We will see shortly how these values are used.

Step 5 used the work we did before. As always, we defined our material with one technique and one pass; we didn't define a texture unit but used the keyword vertex_program_ref. After this keyword, we need to put the name of a vertex program we defined; in our case, this is MyVertexShader1. If we wanted, we could have put some more parameters into the definition, but we didn't need to, so we just opened and closed the block with curly brackets. The same is true for fragment_program_ref.

Writing a shader

Now that we have defined everything necessary in our material file, let's turn to the shader code itself. Step 7 defines the function head with the parameter we discussed before, so we won't go deeper here. Step 8 defines the function body; for this fragment shader, the body is extremely simple. We created a new float4 tuple (0,0,1,0), which describes the color blue, and assigned this color to our out parameter color. The effect is that everything rendered with this material will be blue. There isn't more to the fragment shader, so let's move on to the vertex shader. Step 9 defines the function header. The vertex shader has three parameters—two are marked as positions using CG semantics, and the third is a 4x4 matrix of float values named worldViewMatrix. Before that parameter's type definition, there is the keyword uniform.

Each time our vertex shader is called, it gets a new vertex as the position parameter input, calculates the position of this new vertex, and saves it in the oPosition parameter. This means the parameter changes with each call. This isn't true for the worldViewMatrix: the keyword uniform denotes parameters that are constant over one draw call. While we render our quad, the worldViewMatrix doesn't change, whereas the rest of the parameters differ for each vertex processed by our vertex shader. Of course, in the next frame, the worldViewMatrix will probably have changed. Step 10 creates the body of the vertex shader. In the body, we multiply the vertex position with the worldViewMatrix to translate the vertex into camera space. This translated vertex is saved in the out parameter to be processed by the rendering pipeline. We will look more closely into the render pipeline after we have experimented with shaders a bit more.
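
To illustrate the difference, here is a minimal sketch (not from the book) with an additional, hypothetical uniform parameter: scale would be constant for a whole draw call (it could be supplied from the material, for example, with param_named), while position changes with every vertex:

    // A sketch, not from the book: 'scale' is a hypothetical uniform.
    // It stays constant for the whole draw call, while 'position'
    // changes with every vertex the shader processes.
    void MyScaledVertexShader(
        float4 position : POSITION,
        out float4 oPosition : POSITION,
        uniform float scale,
        uniform float4x4 worldViewMatrix)
    {
        // Scale the local-space position before transforming it.
        oPosition = mul(worldViewMatrix, float4(position.xyz * scale, 1));
    }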

Texturing with shaders

We have painted our quad blue, but we would like it to show the rock texture we used before.

Time for action – using textures in shaders

  1. Create a new material named MyMaterial14 and two new shaders named MyFragmentShader2 and MyVertexShader2. Remember to copy the fragment and vertex program definitions in the material file and adjust the names. Then add a texture unit with the rock texture to the material:

    texture_unit
    {
        texture terr_rock6.jpg
    }

  2. We need to add two new parameters to our fragment shader. The first is a two-tuple of floats for the texture coordinates; we use the TEXCOORD0 semantic to mark it as the first set of texture coordinates. The other new parameter is of type sampler2D, which is the type for a texture. Because the texture doesn't change on a per-fragment basis, we mark it as uniform. This keyword indicates that the parameter value comes from outside the CG program and is set by the rendering environment, in our case, by Ogre 3D:

    void MyFragmentShader2(float2 uv : TEXCOORD0,
        out float4 color : COLOR,
        uniform sampler2D texture)

  3. In the fragment shader, replace the color assignment with the following line:

    color = tex2D(texture, uv);

  4. The vertex shader also needs some new parameters—one float2 for the incoming texture coordinates and one float2 for the outgoing texture coordinates. Both use the TEXCOORD0 semantic, because one is the incoming and the other the outgoing first set of texture coordinates:

    void MyVertexShader2(
        float4 position : POSITION,
        out float4 oPosition : POSITION,
        float2 uv : TEXCOORD0,
        out float2 oUv : TEXCOORD0,
        uniform float4x4 worldViewMatrix)

  5. In the body, we calculate the outgoing position of the vertex:

    oPosition = mul(worldViewMatrix, position);

  6. For the texture coordinates, we assign the incoming value to the outgoing value:

    oUv = uv;

  7. Remember to change the used material in the application code, and then compile and run it. You should see the quad with the rock texture.


What just happened?

Step 1 just added a texture unit with the rock texture, nothing fancy. Step 2 added a float2 for the texture coordinates; we also used sampler2D for the first time. sampler2D is the parameter type for a two-dimensional texture that can be looked up, and because it doesn't change per fragment and comes from outside the CG program, we declared it uniform. Step 3 used the tex2D function, which takes a sampler2D and a float2 as input parameters and returns a color as a float4. This function uses the float2 as the position at which to retrieve a color from the sampler2D object and returns this color; basically, it's just a lookup in the texture at the given coordinates. Step 4 added two texture coordinates to the vertex shader—one incoming and one outgoing. Steps 5 and 6 computed the outgoing position and assigned the incoming texture coordinates to the outgoing parameter. The magic happens in the render pipeline.
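
Because tex2D simply returns a float4 color, its result can be post-processed like any other value. As a small sketch (not from the book), here is a fragment shader that tints the texture reddish by component-wise multiplication:

    // A sketch, not from the book: halving green and blue gives the
    // sampled texel a reddish tint.
    void MyTintedFragmentShader(float2 uv : TEXCOORD0,
        out float4 color : COLOR,
        uniform sampler2D texture)
    {
        color = tex2D(texture, uv) * float4(1, 0.5, 0.5, 1);
    }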

What happens in the render pipeline?

Our vertex shader gets each vertex and transforms it into camera space. After all vertices have gone through this transformation, the render pipeline determines which vertices form a triangle and then rasterizes it. In this process, the triangles get split into fragments. Each fragment is a candidate for becoming a pixel on the screen; it will become one if it isn't covered by another fragment that is nearer to the camera. During this process, the render pipeline interpolates the vertex data, such as texture coordinates, over each fragment. After this process, each fragment has its own texture coordinates, and we used these to look up the color value from the texture. The following image is an example of a quad represented by four fragments, each with its own texture coordinates. It also shows how we can imagine the texture coordinates relating to the pixels. In the real world, this depends on the render pipeline and can change, but it is a helpful mental model, even if it's not 100 percent accurate.
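
A handy way to make this interpolation visible is to write the interpolated texture coordinates straight into the output color; a small sketch (not from the book):

    // A sketch, not from the book: u becomes red and v becomes green,
    // so the interpolation shows up as a color gradient across the quad.
    void MyUvFragmentShader(float2 uv : TEXCOORD0,
        out float4 color : COLOR)
    {
        color = float4(uv.x, uv.y, 0, 1);
    }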

The same interpolation is used when we assign each vertex a color. Let's investigate this effect a bit more.

Have a go hero – combining color and texture coordinates

Create a new vertex and fragment shader called MyVertexShader3 and MyFragmentShader3, respectively. The fragment shader should render everything in green, and the vertex shader should calculate the position of the vertex in camera space and simply pass the texture coordinates through to the fragment shader. The fragment shader doesn't do anything with them yet, but we will need them later. One possible solution is sketched below.
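
One possible solution, as a sketch (the function names follow the exercise; the definitions in the material file are analogous to the previous ones):

    // One possible solution (a sketch): transform the position, pass
    // the texture coordinates through, and return green everywhere.
    void MyVertexShader3(
        float4 position : POSITION,
        out float4 oPosition : POSITION,
        float2 uv : TEXCOORD0,
        out float2 oUv : TEXCOORD0,
        uniform float4x4 worldViewMatrix)
    {
        oPosition = mul(worldViewMatrix, position);
        oUv = uv; // forwarded, but ignored by the fragment shader for now
    }

    void MyFragmentShader3(float2 uv : TEXCOORD0,
        out float4 color : COLOR)
    {
        color = float4(0, 1, 0, 0); // green, same alpha convention as before
    }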

Interpolating color values

To see the effect of interpolation better, let's replace the texture with colors.

Time for action – using colors to see interpolation

To see how color interpolation works, we need to change our code a bit.

  1. Again, copy the material and make sure to adjust all names.
  2. The only thing we need to change in the material is that we don't need a texture unit. We can just delete it.
  3. In the application code, we need to replace the textureCoord() calls with color() calls:

    manual->position(5.0, 0.0, 0.0);
    manual->color(0,0,1);
    manual->position(-5.0, 10.0, 0.0);
    manual->color(0,1,0);
    manual->position(-5.0, 0.0, 0.0);
    manual->color(0,1,0);
    manual->position(5.0, 10.0, 0.0);
    manual->color(0,0,1);

  4. The vertex shader also needs some adjustments. Replace the two texture coordinate parameters with color parameters and also change the assignment line:

    void MyVertexShader4(
        float4 position : POSITION,
        out float4 oPosition : POSITION,
        float4 color : COLOR,
        out float4 oColor : COLOR,
        uniform float4x4 worldViewMatrix)
    {
        oPosition = mul(worldViewMatrix, position);
        oColor = color;
    }

  5. The fragment shader now has two color parameters—one incoming and one outgoing:

    void MyFragmentShader4(float4 color : COLOR,
        out float4 oColor : COLOR)
    {
        oColor = color;
    }

  6. Compile and run the application. You should see the quad with its right side blue and its left side green, with the colors fading into each other in between.

What just happened?

In step 3, we saw another function of the manual object, namely, adding a color to a vertex using three float values for red, green, and blue. Step 4 replaced the texture coordinates with color parameters—this time we wanted colors, not textures. The same is true for step 5. This example wasn't really difficult or exciting, but it shows how interpolation works and gives us a better understanding of how the vertex and fragment shaders work together. Colors and texture lookups can also be combined, as sketched below.
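
As a sketch of that combination (not from the book; the matching vertex shader would have to pass both TEXCOORD0 and COLOR through, as in the previous examples):

    // A sketch, not from the book: the texture lookup is modulated by
    // the interpolated per-vertex color.
    void MyCombinedFragmentShader(float2 uv : TEXCOORD0,
        float4 vertexColor : COLOR,
        out float4 oColor : COLOR,
        uniform sampler2D texture)
    {
        oColor = tex2D(texture, uv) * vertexColor;
    }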


Replacing the quad with a model

The quad, as an object for experimentation, gets a bit boring, so let's replace it with the Sinbad model.

Time for action – replacing the quad with a model

Using the previous code, we will now render Sinbad instead of the quad.

  1. Delete all the code for the quad; just leave the scene node creation code in place.
  2. Create an instance of Sinbad.mesh, attach it to the scene node, and set the material of the entity to MyMaterial14 using setMaterialName:

    void createScene()
    {
        Ogre::SceneNode* node = mSceneMgr->getRootSceneNode()->createChildSceneNode("Node1");
        Ogre::Entity* ent = mSceneMgr->createEntity("Entity1", "Sinbad.mesh");
        ent->setMaterialName("MyMaterial14");
        node->attachObject(ent);
    }

  3. Compile and run the application; because MyMaterial14 uses the rock texture, Sinbad will be made out of rock.

What just happened?

Everything that has happened here should be familiar to you. We created an instance of a model, attached it to a scene node, and changed the material to MyMaterial14.

Making the model pulse on the x-axis

Up until now, we only worked with the fragment shader. Now it's time for the vertex shader.

Time for action – adding a pulse

Adding a pulse to our model is quite easy and only needs some changes to our code.

  1. This time, we only need a new vertex shader because we are going to use the existing fragment shader. Create a new vertex shader named MyVertexShader5 and use it in the new material MyMaterial17, but use MyFragmentShader2 because this shader only textures our model and nothing more:

    material MyMaterial17
    {
        technique
        {
            pass
            {
                vertex_program_ref MyVertexShader5
                {
                }

                fragment_program_ref MyFragmentShader2
                {
                }

                texture_unit
                {
                    texture terr_rock6.jpg
                }
            }
        }
    }

  2. The new vertex shader is the same as the ones we've seen before; just add a new parameter in the default_params block called pulseTime that gets the value from the time keyword:

    vertex_program MyVertexShader5 cg
    {
        source Ogre3DBeginnersGuideShaders.cg
        entry_point MyVertexShader5
        profiles vs_1_1 arbvp1

        default_params
        {
            param_named_auto worldViewMatrix worldviewproj_matrix
            param_named_auto pulseTime time
        }
    }

  3. We don't need to change anything in the application itself. The only thing left to do is to create the new vertex shader. MyVertexShader5 is based on MyVertexShader3. Just add a new line that multiplies the x value of the oPosition variable by (2+sin(pulseTime)):

    void MyVertexShader5(uniform float pulseTime,
        float4 position : POSITION,
        out float4 oPosition : POSITION,
        float2 uv : TEXCOORD0,
        out float2 oUv : TEXCOORD0,
        uniform float4x4 worldViewMatrix)
    {
        oPosition = mul(worldViewMatrix, position);
        oPosition.x *= (2 + sin(pulseTime));
        oUv = uv;
    }

  4. Compile and run the application. You should see Sinbad pulsing on the x-axis between his normal width and three times his width.

What just happened?

We made the model pulse on the x-axis. We needed a second parameter for the vertex shader, which contains the current time. We used the sine of the time, plus two, to get a value between 1 and 3, by which we multiplied the x component of each translated vertex of the model. In effect, this changes the position of every single vertex a bit each frame, creating the pulsing effect. Using this technique, we can pass practically any data into a shader to modify its behavior; this is the basis for a lot of effects used in games. A tunable variation of the pulse shader is sketched below.
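
As a sketch of such a variation (not from the book), hypothetical amplitude and frequency uniforms—which could be set via param_named in the material—make the pulse tunable; remapping sin() from [-1,1] to [0,1] keeps the model from ever shrinking below its normal width:

    // A sketch, not from the book: 'amplitude' and 'frequency' are
    // hypothetical uniforms that could be set via param_named.
    void MyTunablePulseShader(uniform float pulseTime,
        uniform float amplitude,
        uniform float frequency,
        float4 position : POSITION,
        out float4 oPosition : POSITION,
        float2 uv : TEXCOORD0,
        out float2 oUv : TEXCOORD0,
        uniform float4x4 worldViewMatrix)
    {
        oPosition = mul(worldViewMatrix, position);
        // sin() remapped from [-1,1] to [0,1]: the scale factor then
        // ranges from 1 (normal width) to 1 + amplitude.
        oPosition.x *= 1 + amplitude * (sin(pulseTime * frequency) * 0.5 + 0.5);
        oUv = uv;
    }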

Summary

In this article, we covered:

  • Render Pipeline
  • Writing a shader
  • Interpolating color values
  • Replacing the quad with a model
  • Making the model pulse on the x-axis

About the Author

Felix Kerger

Felix Kerger is a Computer Science Student at the Technical University of Darmstadt and has been developing 3D real-time applications using OGRE 3D for more than 5 years. He has given several talks on software development and 3D real-time applications at different conferences and has been working for three years as an assistant researcher at the Fraunhofer Institute for Computer Graphics Research. He also works as a freelance journalist and reports yearly from the Game Developer Conference Europe.


