Welcome to the first chapter of the book! In the next few pages, we are going to start looking at how to set up a scene in Unreal for visualization purposes—we want to make sure that we nail this first part down before we move any further. Whether you are a beginner or an advanced user, we'll need to take a look at some of the most critical elements that can make or break a scene in Unreal. Things like taking advantage of the right type of lighting, knowing where to look for the most common material parameters, or learning to measure the performance impact of our shaders are vital in any project. With that in mind, we are going to be learning about the following topics:
- Setting up a studio scene
- Working inside the material editor
- Our first physically based material
- Creating some simple glass with the translucent blend mode
- Lighting our scene with image-based lighting
- Checking the cost of our materials
Welcome to this in-depth journey through the material creation process in Unreal Engine 4! I think you are going to have a great time if you are excited about the possibilities that this game engine brings to the table in terms of state-of-the-art rendering techniques. And by state-of-the-art I mean a powerful and robust rendering pipeline, where both photorealistic and stylized game art are possible without changing to a different development suite.
The fact that such a flexible system is in place is courtesy of the continuous advances over the years in the field of real-time rendering. We've journeyed from the 2D era into the 3D era, from sprites and flat images to the rendering of polygons and whole worlds.
Each of these changes happened thanks to a combination of new and more powerful hardware as well as increasingly intelligent rendering pipelines and techniques. One of the latest improvements that we can talk about is what we are going to be covering throughout this book—the PBR workflow.
And what does PBR stand for? That would be Physically Based Rendering—a particular method that takes into account how light behaves when it comes into contact with 3D objects. In order to represent materials placed in a 3D environment, artists need to specify certain properties for each of the materials that they create—such as what the underlying color should be, how much light they reflect, or how defined those reflections are.
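As a rough mental model—this is an illustrative Python sketch with made-up values, not Unreal code—a PBR material boils down to a handful of such properties:

```python
# A sketch of the artist-facing properties a PBR material boils down to.
# The key names mirror Unreal's material input pins; the values are
# purely illustrative, not taken from any real asset.
orange_plastic = {
    "base_color": (1.0, 0.5, 0.0),  # the underlying color
    "metallic": 0.0,                # non-metal
    "roughness": 0.2,               # fairly sharp, defined reflections
    "specular": 0.5,                # default reflectance for non-metals
}
```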
This is a significant change from previous workflows, where light propagation and its simulation weren't taken into account in a realistic way. This meant, for example, that materials couldn't be replicated under different lighting conditions—having, for instance, a night and a day scene using the same assets resulted in them looking substantially different. An artist would therefore need to create different sets of textures or adjust the materials to make them look right for each particular scenario they might be in.
This has changed with the recent introduction of the PBR workflow. Newer game engines, such as Unreal Engine 4, have made this rendering approach their quasi default one—and I say quasi as they also allow for older rendering methods to be thrown into the mix in order to give artists more freedom. Materials are coherent under different lighting settings, and knowing how to create content under this pipeline ensures usability under a lot of different circumstances.
However, PBR is not a universally defined convention as far as its implementation goes. This means that how things work under the hood varies across the different rendering engines. The exact implementation that Epic has chosen for their Unreal Engine platform is different from that of other third-party software creators. Furthermore, PBR workflows in real-time applications are slightly different from those of offline renderers, as efficiency and speed are a must in this industry and things have to be adapted accordingly. What we need to take away from these facts is that a physically based approach to rendering has huge advantages (as well as some limitations) that we as artists need to be aware of if we are to use the engine to its full potential.
We conceived the present book with that goal in mind. We aim to present you with a series of recipes that tackle many different functionalities within Unreal, structured in a way where each unit can be read independently from the rest. In order to do so, we'll be taking a look in the following pages at how to get a hold of the engine and how to set up a basic scene, which we'll use to visualize our projects.
In this first recipe, we are going to create a basic scene that we'll be able to use as our background level throughout this course. This initial step is here just so we can go over the basics of the engine and get familiar with different useful websites from where we can download multiple assets.
Before we actually start creating our basic studio scene, we will need to download Unreal Engine 4. I've started writing this book with version 4.20.3, but don't hesitate to use the latest version at the time of reading.
Here's how you can download it:
- Get the Epic Games Launcher from the engine's website, https://www.unrealengine.com/en-US/blog, and follow the installation procedure indicated there.
- Once installed, download the latest version of the engine. We can do so by navigating to the UNREAL ENGINE section of the launcher, in the tab named Library. In there, we'll be able to see an option (1) that lets us download whichever version of Unreal we want. Once we've downloaded it, launch it (2) so we can get started:
And that's all you need! We now have everything required to get started in Unreal Engine 4. How cool is that? A whole new game engine at our fingertips, completely free, and with a variety of tools within it that would take years to learn and master. It really is a thing of wonder! Next up, we are going to start learning about one of those tools—the materials. And in order to do so, let's start by creating our first project!
- Create a New Project—give it a name and select the folder where you want it to live. Just as a reference, as shown in the following screenshot, I've decided to start off with a blank blueprint-based project, but it doesn't really matter what we decide to initially include. Nothing special so far! You can choose to add the Starter Content if you want, as it comes with several useful resources that we can use later on:
Additionally, you can get more free resources from other places. You can check the Learn tab within the Epic Games Launcher to see what freely available examples you can get hold of, or check the community section to see if there is any new cool content.
Epic has recently collaborated with multiple content creators to make a multitude of different assets available to anyone using Unreal, and you can check them out at the following website: https://www.unrealengine.com/en-US/blog/new-free-content-coming-to-the-unreal-engine-marketplace?utm_source=launcher&utm_medium=chromium&utm_term=forum&utm_content=FreeContent&utm_campaign=communitytab.
- The first thing that we need to do once the editor loads is to go to Save Current As, just to make sure that the changes we are about to implement get saved. Otherwise, we would just be working on the default untitled map, which wouldn't store any of the changes that we are about to make!
- Once that's done, we are now ready to start spicing things up. Erase everything from the world outliner—we are not going to be using any of that for our studio scene. Your scene and the world outliner should look something like this:
- If you haven't done so before, it is now time to include the Starter Content. Don't worry if you skipped it at first! I deliberately said it wasn't mandatory so that we could look at how to include it after starting a new project—just navigate to the content browser and look for the Add New option in the upper left corner. Select the first available option in there, named Add Feature or Content Pack.
- With that included, we can see that the Starter Content includes a blueprint that can be quite useful for setting up the lighting in our scene. You can look for this inside of the Blueprints folder, and it's named BP_LightStudio. Select it and drag it into the scene we have previously created.
The asset called BP_LightStudio is a blueprint that Epic Games has already created for us. It includes several lighting settings that will make our lives easier—instead of having to set up multiple lights and assign them different values, it automates all of that work for us, so we just have to choose how we want our scene to look. This makes putting together a simple studio scene very easy.
Retaining fine-grained control over which lights are placed and how they are configured is, of course, very important, and something that we'll do later in the book—but for now, this blueprint is a very powerful tool that we will use.
- With the BP_LightStudio blueprint placed in our scene, we can start tweaking its default values so we can use it as a lighting studio setup. Select the blueprint from the world outliner and let's tweak several settings.
- The first one we can look at is the HDRi tab inside the details panel for the BP_LightStudio. HDRi is short for High Dynamic Range imaging, a type of texture that stores the lighting information from the place where the photo was taken. Using that data as a light source in 3D scenes is a very powerful technique, which makes our environments look more natural and real:
- However useful HDRi might be, this lighting method is turned off by default, so make sure to tick the Use HDRi checkbox. That will make the texture placed in the HDRi Cubemap slot light the scene. Feel free to use any other one you might have, or download one to use throughout the project!
HDRi images are very useful for 3D artists, even though they can be tricky to create as it is usually a lengthy process. There are many websites from which you can buy them, but I like the following one that gives you free access to some very useful ones: http://www.hdrlabs.com/sibl/archive.html.
We will be using the one called Alexs Apartment, which is quite useful for interior visualization.
- You can now untick the Use Sun Light and the Use Atmosphere options found under the Atmosphere section of the BP_LightStudio blueprint if you use an HDRi image. As we said earlier, this type of picture stores lighting information, which can render the use of other lights optional.
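To get an intuition for what "using a texture as a light" means, here is a minimal Python sketch of the lookup at the heart of image-based lighting: every direction in the scene maps to a pixel of the equirectangular HDRi, and that pixel's value is the incoming light from that direction. The axis convention below is one of several in common use, not necessarily the one Unreal uses internally:

```python
import math

# Map a unit direction vector (y pointing up) to (u, v) coordinates
# inside an equirectangular HDRi texture. The renderer then reads the
# pixel at (u, v) to know how much light arrives from that direction.
def direction_to_equirect_uv(direction):
    x, y, z = direction
    u = 0.5 + math.atan2(x, z) / (2.0 * math.pi)     # wraps around the horizon
    v = math.acos(max(-1.0, min(1.0, y))) / math.pi  # zenith (0) to nadir (1)
    return u, v
```

For example, a direction pointing straight up lands on the top row of the image (v = 0), which is why the sky region of an HDRi dominates the lighting of upward-facing surfaces.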
- Once you've done that, let's create a basic plane on which we can lay out our objects. Dragging a plane into the scene from the Modes panel will do the job:
- Let's assign our newly placed plane an interesting default material so we have something to look at—with the plane selected, scroll down to the Materials section of the details panel and change its default value to M_Wood_Pine. Said material is part of the Starter Content, so make sure you have it installed!
We should now be looking at something like the following:
With that out of the way, we can say that we've finished creating our basic studio scene. This will enable us to use the level for visualization purposes, a bit like having a blank canvas on which to paint. We will use it to place other models and materials as we create them, in order to correctly visualize our assets.
Following the previous set of steps accomplishes at least two different objectives—the first is the creation of our intro scene, and the second is getting familiar with the engine. That second task is something that will continue over time—but getting our hands dirty now will hopefully have accelerated the process.
Something that could speed things up even more is reviewing what we've just done. Not only might we learn faster, but knowing why we do things the way we do them will help us cement the knowledge we acquire—so expect to see a How it works... section after each recipe we tackle! As the first ever example of that section, we'll briefly go over what we have just done in order to understand how things work in Unreal.
The first step we've taken was to actually create the Unreal Engine project on which we'll be working throughout this book. We've then added the assets present in the Starter Content package that Epic Games supplies, as it contains useful 3D models and materials that we can check later on as we work on other recipes.

The most important bit was probably the lighting setup, though, as this will be the basis of some of the next recipes. Having a light source is vital to visualizing the different assets that we create or add to the scene. Lighting is something that we'll explore more in some of the next recipes, but the method we've chosen in this one is a very useful technique that you can apply in your own projects.

We are using an asset that Unreal calls a blueprint, something that allows you to use the engine's visual scripting language to create different functionalities within the game engine without writing C++ code. This is extremely useful, as you can program different behaviors across multiple types of actors to use to your advantage—turning a light on and off, opening a door, creating triggers to fire certain events, and so on. We'll explore blueprints more as we go along, but at the moment we are just using an already available one to specify the lighting effects we want to have in our scene. This is in itself a good example of what a blueprint can do, as it allows us to set up multiple different components without having to specify each one of them individually—such as the HDRi image, the sun position, and others that you can see if you look at the blueprint's details panel.
Let's get started with the material editor! This is the place where the magic will happen and also where we'll spend most of our time during this cookbook. Better get well acquainted with it then! As with everything inside Unreal, you'll be able to see that this space for creating materials is a very flexible one—full of customizable panels, rearrangeable windows, and expandable areas. You can place them however you want!
Because of its modular nature, some of the initial questions we need to tackle are the following ones: how do we start creating materials and where do we look for the most commonly used parameters? Having different panels means having to look for different functionalities in each of them, so we'll need to know how to find our way around the editor. We won't stop there though—the editor is packed with plenty of useful little tools that will make our jobs as material creators that much easier, and knowing where they live is one of the first mandatory steps.
So, without further ado, let's use the project we have already set up in the previous recipe as our starting point and let's start creating our first material!
There's not much we need to do at this point—all thanks to having previously created the basic blank project. That's the reason we created it in the first place, so we can start working on our materials straight away. Having set up the studio scene is all we need at this point.
In spite of this, don't feel obliged to use the level we created in the first recipe. Any other one will do, as long as there are some lights in it that help you visualize your world. That's the advantage of the PBR workflow: whatever we create following its principles will work across different lighting scenarios. Let's jump right in!
It's now time to take a look at how the material editor works, at the same time as we create our first material. This editor includes many different tools and functionalities within it, so there are plenty of things to take a look at!
Remember that you can bring the material editor up by just creating a new material and double-clicking on it.
The first important thing we will be doing is to actually create a material. Of course, this is a very trivial action and there's not much to explain—just right-click anywhere on the content browser and select the Create Basic Asset | Material option. What is important is knowing how to name and organize our contents. Even though keeping the Content Browser organized is not the main goal of this chapter, I didn't want to pass up the opportunity to briefly talk about it.
One good way of keeping things tidy is to organize the folder structure into categories (Environment...) and to name the different assets using Unreal's recommended syntax. You can find more about that on several discussion forums or on Epic Games' wiki:
- Unreal Engine 4 style guide: https://github.com/Allar/ue4-style-guide
- Assets naming convention: https://wiki.unrealengine.com/Assets_Naming_Convention
The second important thing we want to do is make sure that the layout we are looking at is the default one, just so that the images we will be including later on match what you'll be seeing on your monitor. To do that, go to Reset Layout, as shown in the following screenshot:
Remember that resetting the layout to its default state can still leave things looking slightly different between your screen and mine—that's because settings such as the screen resolution or its aspect ratio can hide panels or make them imperceptibly small. Feel free to move things around until you reach a layout that works for you!
Now that we've made sure that we are looking at the same screen, let's turn our attention to the material editor itself and the different parts that constitute it. By default, this is what we should be looking at:
- The Toolbar (1) is a common section that you'll find in many other places within the engine. It lets you save your progress or apply any changes that you've made to your materials, amongst other things.
- The Viewport (2) is where we'll be able to see what our material looks like. You can rotate the view, zoom in or out, and change the lighting setup of that window.
- The Details panel (3) is a very useful one, for here is where we can start to define the properties of the materials that we want to create. Its contents vary depending on what is selected in the main graph editor (the panel numbered 6).
- The Stats and Find Results panels (4) are where you can take a look at how costly your materials are or how many textures they are using.
- The Palette (5) is a library of different nodes and functions that we'll use to modify the materials we create.
- The main graph editor (6) is where the action happens, and where most of the functionality that you want to include in your materials needs to be visually scripted.
Now that we've taken a look at the different parts that make up the material editor in Unreal, we can start creating our first simple material—a plastic. I find plastics to be a very straightforward type—even though we could make them as complicated as we wanted to. So, let's explore how we would go about creating one:
- Take a look at the main graph. By default, every time you create a new material, you should be looking at a central main node. You will see multiple pins, which are the inputs to which we will connect the different nodes we create.
- Right-click on the main graph, preferably to the left of the main material node, and start typing constant. As you start to write, notice how the auto-completion system shows several options: Constant3Vector, and so on. Select Constant3Vector, as shown in the following screenshot:
- Having chosen that option, you will see that a new node has appeared. You can now connect it to the Base Color pin of the material node. If you select the constant node, take a look at the Details panel and you'll see that there are a couple of settings you can tweak. Since we want to move away from the default blackish appearance that the material now has, click on the black rectangle to the right of where it says Constant and use the color wheel to change its current value. I'm going to go with orange:
There's more to the base color property than meets the eye! Apart from the different options that are available to select a color, you might be interested to know that the actual value that gets connected to the material slot matters beyond the color choice. Certain materials have a measured intensity to them, and you can check that out on the following website: https://docs.unrealengine.com/en-us/Engine/Rendering/Materials/PhysicallyBased.
At the moment, we can see that we have managed to modify the color of our material. We can now change how sharp the reflections are, as we want to go for a plastic look. In order to do so, we need to modify the Roughness parameter with another constant. Instead of right-clicking and typing, let's choose it from the Palette menu this time.
- Navigate to the Palette section, and look for the Constant category. We want to select the first option in there, aptly named like the subsection itself. Alternatively, you can type its name in the search box at the top of the panel:
- A new, smaller node should have now appeared. Unlike the previous one, we don't have the option to select a color—we need to type in a value. Let's go with something low, about 0.2, and connect it to the Roughness input pin of the material.
If you look at the preview viewport, you will notice that the appearance of the material has now changed. It looks like the reflections from the environment are much sharper than before. This is happening thanks to the previously created constant, which, using a value closer to 0 (or black), makes the reflections stand out that much more. Whiter values decrease the sharpness of those reflections or, in other words, make the surface appear much rougher.
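If you are curious about the math behind this behavior, the following Python sketch evaluates a GGX-style microfacet distribution—the family of functions Unreal's default shading is based on—to show how a lower roughness concentrates the specular highlight. Treat the exact remapping as an assumption; the engine's internal implementation may differ in details:

```python
import math

# GGX normal-distribution function. The artist-facing (perceptual)
# roughness is squared before use, a common remapping in modern engines.
# Evaluating at n_dot_h = 1.0 probes the very center of the highlight.
def d_ggx(n_dot_h, perceptual_roughness):
    a = perceptual_roughness ** 2
    a2 = a * a
    denom = (n_dot_h * n_dot_h) * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)

sharp = d_ggx(1.0, 0.2)  # our plastic: a tall, narrow highlight
matte = d_ggx(1.0, 0.8)  # a rough surface: a short, wide one
```

The peak for roughness 0.2 is orders of magnitude taller than for 0.8, which is exactly the "sharper reflections" we observe in the preview viewport.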
Having done so, we are now in a position where we can finally apply this material to a model inside of our scene. Let's go back to the main level and look at the Modes panel, particularly the Basic section. Drag and drop a cube into the main level, and assign it the following values inside of the Details panel just so we are looking at the same thing:
Reducing the size of the cube will make it fit better into our scene. Now head over to the Materials section of the Details panel, and click on the drop-down menu. Look for the newly created material and assign it to our cube. Finally, click on the Build icon located on the toolbar, as follows:
And there it is! We now have our material applied to a simple model, being displayed on the scene we had previously created. Even though this has served as a small introduction to a much bigger world, we've now gone over most of the panels and tools that we'll be using in the material editor. See you in the next recipe!
We've used the present recipe to learn about the material editor and we've also created our first material. Knowing what each section does within the editor will help a lot in the immediate future, as what we've just done is but a prelude to our real target—creating a physically based material. Now we are in a much better position to tackle that goal, so let's look at it in the next recipe!
Before moving on, though, let's check the nodes that we have used to create this simple material. From an artist's point of view, the names that the engine has given to something like a color value or a grayscale value can seem a bit confusing. It might be difficult to establish a connection between the name of the Constant3Vector node and our idea of a color. But there is a reason for all of this!
The idea behind that naming convention is that these nodes can be used beyond the color values we have just assigned them. At the end of the day, a simple constant can be used in many different scenarios—such as depicting a grayscale value, acting as a brightness multiplier, or serving as a parameter inside a material function. Don't worry if you haven't seen these other uses yet—we will. The point is that the names these nodes were given tell us there are more uses beyond the ones we've seen.
With that in mind, it might be better to think of the elements we've been using in more mathematical terms. For instance, think of a color as a Red Green Blue (RGB) value, which is what we defined with that previous Constant3Vector node. If you want to use an RGB value alongside an alpha one, why not use the Constant4Vector, which allows for a fourth component? Even though we are at a very early stage, it is always good to familiarize ourselves with the different expressions the engine uses.
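To make that mathematical view concrete, here is a tiny Python sketch, with plain tuples standing in for Unreal's constant nodes and purely illustrative values:

```python
# The same node types reused for different meanings:
gray = 0.2                          # Constant: a scalar (our Roughness value)
orange = (1.0, 0.5, 0.0)            # Constant3Vector: an RGB color
orange_rgba = (1.0, 0.5, 0.0, 1.0)  # Constant4Vector: RGB plus alpha

# The scalar doubling as a brightness multiplier, applied per channel—
# exactly what a Multiply node between the two constants would compute:
dimmed = tuple(channel * gray for channel in orange)
```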
PBR is, at its core, a principle that several graphic engines try to follow. Instead of being a strict set of rules that every rendering program needs to abide by, it is more of an idea—one that dictates that what we see on our screens is the result of a study on how light behaves when it interacts with certain surfaces.
As a direct consequence, the so-called PBR workflow varies from one rendering solution to the next, depending on how the creators of the software have decided to program the system. For our case, what we are going to be looking at is the implementation that Epic Games has chosen for their Unreal Engine 4 real-time renderer.
However, we are going to do so in our already established recipe process, that is, by creating real examples of materials that follow the PBR workflow rather than just talking in a general way. Let's get to it!
We don't need a lot in order to start working on this recipe—just the project we have previously created so we don't have to start from scratch. You can continue using the previous section's materials or create new ones, whatever works best for you! Something that would be helpful to have is the scene from the previous recipe open, for instance—that way we already have a 3D model in it that we can use to show our materials on.
- Right-click anywhere inside of the Content Browser and select the Material option in the Create Basic Asset section. Name it whatever you want—I'll go with M_PBR_Metal for this particular instance. Double-click on the newly created material to open up the material editor.
- With the material editor now open, we can start taking a look at the PBR workflow. The first material we are going to create is a metallic one, a particular type that uses most of the attributes associated with this pipeline. With that said, let's focus our attention on two different places—the Details panel and the main material node:
The settings you see here are the default ones for most materials in Unreal, and they follow the PBR pipeline very closely. The first option, the Material Domain, is currently set to Surface, which tells us that the material we are creating is meant to be used on a 3D model. The Blend Mode, which has a value of Opaque, indicates that it is not a translucent material like glass. Finally, the Shading Model is set to Default Lit, the one used by most materials.
This configuration is the default one for most common materials, and the one that we'll need to use to define materials such as metal, plastic, wood, or concrete, to name a few.
- With that bit of theory out of the way, let's create a Constant3Vector node anywhere in the graph and plug it into the Base Color input pin of our material. We used the Base Color attribute in the previous recipe, and as we saw, this is the input where the overall color of a material should be plugged in.
- The next item we will be creating is a Constant. You can do so by holding the 1 key on your keyboard and clicking anywhere within the material editor graph. Give it a value of 1 and plug it into the Metallic attribute of our material.
The Metallic attribute defines whether we are creating a metal or a non-metal material. We should use a value of 1 to define metallic surfaces and a value of 0 for non-metals—or we can leave this attribute unconnected, which is the same as using a zero. Values between 0 and 1 should only be used in special circumstances, such as when dealing with metals that have been treated—corroded or painted metals and the like.
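For the curious, here is a hedged Python sketch of how a metalness workflow of this kind typically derives the two colors a shader actually needs from the artist-facing inputs. The 0.08 scale on Specular mirrors the commonly documented UE4 behavior (where the default Specular of 0.5 yields a base reflectance of 0.04 for non-metals), but treat the exact numbers as assumptions rather than engine source:

```python
def lerp(a, b, t):
    return a + (b - a) * t

# Metals have no diffuse term; their base color tints the reflection
# instead. Non-metals keep their base color as diffuse and get a small,
# colorless base reflectance (F0) controlled by the Specular input.
def derive_colors(base_color, metallic, specular=0.5):
    diffuse = tuple(c * (1.0 - metallic) for c in base_color)
    f0_dielectric = 0.08 * specular
    spec = tuple(lerp(f0_dielectric, c, metallic) for c in base_color)
    return diffuse, spec

# A pure metal keeps no diffuse color at all:
gold_diffuse, gold_spec = derive_colors((1.0, 0.76, 0.33), metallic=1.0)
```

This is why in-between Metallic values look odd on ordinary materials: they blend two physically distinct behaviors and only make sense for layered cases such as dusty or painted metal.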
- For our next step, let's replicate what we have just done—start by creating another constant and plugging it into the Roughness slot. This time, let's not give it a value of 1, but something like 0.2 instead. The final material graph should look something like this:
The attribute we are controlling through the previous constant defines how rough the surface of a given material should be. Higher values, such as 1, simulate the micro details that make light scatter in all directions—which means we are looking at a matte surface where reflections are not clear. Values closer to zero result in those imperfections being removed, allowing a clear reflection of the incoming light rays and a much clearer reflected image.
Through the previous steps, we have taken a look at some of the most important material attributes used to define a PBR material. We've done so by creating a metal, which can be a good example for some of the previous properties. However, it will be good to create another quick material that is not a metallic one—this is because some of the other properties of the PBR workflow, like the specular material attribute, are meant to be used in such cases.
- Create another material, which we can name M_PBR_Wood, and open the material editor for that asset.
- Let's plug something into the Base Color material attribute—but instead of using a plain value, let's go with an image this time. The Starter Content provides multiple textures that can be used for this very purpose, so let's make use of one of those resources.
Right-click anywhere inside of the main graph for our newly created material and search for TextureSample, as in the next screenshot:
- With that new node on our graph, click on it to access the options in the Details panel. Click again on the drop-down menu found in the Material Expression Texture Base | Texture slot and type wood. Select the T_Wood_Floor_Walnut_D asset and connect the Texture Sample node to the Base Color material attribute, as follows:
If you want to get hold of more textures online, feel free to browse the internet for more of them. A good place where I like to search for these types of resources is www.textures.com, which allows you to download several samples a day once you create a free account.
With that done, it's time to look at another material attribute—the Specular parameter. Unlike roughness, this input controls how much light is being reflected by the material, not how sharp those reflections are. We therefore tend to modify the specular level when we have small-scale occlusion or small shadows happening across a surface, similar to what happens in the texture we chose before.
- The seams in between the wood boards are a good place to use a specular map, as those areas will reflect less light. In Unreal, such places are described with values close to 0 (black). Knowing that, drag a pin from the red channel of the previously created Texture Sample node into the Specular attribute of the main material node.
You might be wondering why we are using the red channel of the wood texture to drive the specular parameter. The simple answer is that, even though we could create a custom black and white image to achieve the same effect, each of the original texture's channels is already a grayscale image containing information close to what we are after. Because the seams hold darker pixels than other areas, the end result is still very similar if we use the red channel of the original texture. You can see our source asset and its red channel side by side in the next image:
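The channel trick can be sketched in a few lines of Python, with made-up pixel values standing in for our wood texture:

```python
# Reusing the red channel of an RGB texture as a grayscale specular mask.
# Pixels are (r, g, b) tuples in the 0..1 range; the values below are
# illustrative, not read from the actual Starter Content asset.
wood_pixels = [
    (0.70, 0.50, 0.30),  # bright board surface
    (0.10, 0.08, 0.05),  # dark seam between boards
]

# Taking just the red channel yields a usable black-and-white mask:
specular_mask = [r for (r, g, b) in wood_pixels]
```

The seam pixel ends up much darker in the mask than the board pixel, so it reflects less light—which is precisely the effect we wanted.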
- Copy the Texture Sample node twice, since we are going to use more textures for the roughness and normal material attribute slots.
- Just as we did previously, select the T_Wood_Floor_Walnut_M and the T_Wood_Floor_Walnut_N assets on each of the new nodes. Connect the first one to the Roughness slot and the second one to the Normal input. Save the material and click on the button that says Apply. Your material node graph should look something like this:
- Navigate back to the main level, and select the floor plane. In the Details panel, scroll down to the Materials section and assign the M_PBR_Wood material we have just created. Take a look at what our scene looks like now:
Nice job, right? The new nodes we've used, both the specular and the normal ones, contribute to the added details we can see in the preceding screenshot. The specular node diminishes the light that is being reflected in the seams between the wood planks, and the normal map modifies the direction in which the light bounces from the surface. The combined effect is that our model, a flat plane, looks as if it has much more geometrical detail than it really has.
Remember how we were talking about each renderer having its own implementation of a PBR workflow? Well, we have just taken a look at how Epic has chosen to set up theirs!
As we have already said, efficiency and speed are at the heart of any real-time application. These are two factors that have heavily influenced the path that the engineers at Epic have chosen when coding their physical approach at rendering. That being the case, the parameters that we have tweaked are the most important ones when it comes to how Unreal deals with the interaction between light and 3D models. The base color gives us the overall appearance of the material, whilst roughness indicates how sharp or blurry the reflections are. Metallic enables us to specify whether an object is made out of metal, and the specular node lets us influence how intense those reflections are. Finally, using normal maps allows for the modification of the direction in which the light gets reflected—a useful technique for adding details without actually using more polygons.
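To make the roles of these parameters concrete, here is a deliberately simplified, toy shading function in Python. It is not Unreal's actual BRDF—the engine uses a far more sophisticated, energy-conserving model—but it illustrates how metallic suppresses the diffuse term and how roughness spreads the specular highlight:

```python
def shade(base_color, metallic, specular, roughness, n_dot_h):
    """Toy single-light shading sketch (NOT Unreal's real shading code).

    base_color: (r, g, b) tuple; metallic, specular, roughness: 0..1;
    n_dot_h: cosine between the surface normal and the half vector.
    """
    # Metals have no diffuse term (in a real PBR model their base color
    # tints the reflection instead -- omitted here for brevity).
    diffuse = tuple(c * (1.0 - metallic) for c in base_color)
    # A Blinn-Phong-style exponent: low roughness -> huge exponent ->
    # a tight, sharp highlight; high roughness -> a broad, soft one.
    exponent = 2.0 / max(roughness * roughness, 1e-4)
    highlight = specular * max(n_dot_h, 0.0) ** exponent
    return tuple(min(d + highlight, 1.0) for d in diffuse)

# Slightly away from the mirror direction (n_dot_h = 0.98), the smooth
# surface's highlight has already faded, while the rough one is still lit:
smooth = shade((0.5, 0.5, 0.5), 0.0, 0.5, 0.1, 0.98)
rough = shade((0.5, 0.5, 0.5), 0.0, 0.5, 0.9, 0.98)
```

Running this, `smooth` comes out darker than `rough` at that off-peak angle: the smooth material concentrates its reflection into a tiny sharp spot, which is exactly the behavior the Roughness slider controls in the material editor.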
The previous parameters are quite common in real-time renderers, but not every program uses the same ones. For instance, offline suites such as V-Ray use different calculations to generate the final output—still physically based in nature, but relying on different techniques. This shows us that, at the end of the day, the PBR workflow that Epic uses is specific to the engine, and we need to be aware of its possibilities and limitations.
Throughout the current recipe, we have managed to take a look at some of the most important nodes that affect how the physically based rendering gets tackled in Unreal Engine 4. Base color, roughness, specularity, ambient occlusion, normal maps, and the metallic attribute all constitute the basics of the PBR workflow.
Having seen all of them, we are now ready to start looking into how to build more complex materials and effects. And even though we still need to understand some of the other areas that affect our pipeline, we can do so with the certainty that the basics are covered.
In the previous section, we had the opportunity to create a basic material that followed the physically based approach that Unreal Engine uses to render elements into our screens. By using nodes and expressions that affected the roughness or the metallic attributes of a material, we saw how we could potentially create endless combinations—going from plastics to concrete, metal, or wood.
Those previous examples can be considered simple ones—for they use the same shading model to calculate how each element needs to be rendered. Most of the materials that we experience in our daily lives fall into that category, and they can be described using the attributes we have previously tweaked. In spite of that, there are always examples that can't be exactly covered with one unique shading model. The way that light behaves when it touches glass, for example, needs to be redefined in those cases. The same applies to other elements, such as human skin or foliage, where light distribution varies from that of a wooden material.
With that in mind, we are going to create several small examples of materials that deviate from the standard shading model—starting with some simple glass. This will work as an introductory level, just so we can create more complex examples at a later stage. Buckle up and let's dive right in!
In order to start this recipe, you are not going to need much. The sample Unreal project we have previously created will serve us fine, but feel free to create a new one if you are starting at this point in the book. It is completely fine to use standard assets, such as the ones included with the engine, but I've also prepared a few that you can download if you want to follow this book closely.
The first example we are going to create is some simple glass. As before, right-click in the appropriate subfolder of your Content Browser and create a new material. Here's how we go about it:
- Give the material a pertinent name, something like M_SampleGlass, as that's what we'll be creating!
- Open up the material editor, and focus on the Details panel. That's the first area we are going to operate on. Make sure you have the main material node selected—if you haven't created anything else, that's the only element that should exist in the main editor graph:
- With the main node selected, you'll be able to see that the second editable attribute under the Material section of the Details panel is the Blend Mode. Let's change that from the default value of Opaque to the more appropriate Translucent one as follows:
- After this change has happened, you'll note that several options have been grayed out inside of the main material node. We'll come back to this shortly.
- Without leaving the Details panel, you can now scroll down to the Translucency section of the main material node. You should be able to find a drop-down menu named Lighting Mode, which we'll need to change from the default value of Volumetric NonDirectional to the one named Surface Translucency Volume, as shown in the following screenshot:
If you hover over each of the options inside of the Lighting Mode drop-down menu, you should be able to see their descriptions. You'll note that some of the options are meant to be used with particles, while others are meant for 3D models. That's the reason why some of the material attributes were previously grayed out—some options don't make sense when the material is going to be applied to a particle, for example, so these are left out.
Why a Constant4Vector and not a Constant3Vector, as we used last time? This new type that we are using includes a fourth parameter, which can be used as an alpha value, something very useful for glass-like materials as you'll see for yourself in a moment.
- Without leaving the Constant4Vector behind, set the alpha value to something like 0.5. Don't go all the way with this parameter! Setting it to either 0 or 1 would make our future material fully transparent or fully opaque, so choose something in between. Plug the value into the Base Color material node as follows:
- Now it's time to plug the alpha value into the Opacity slot of our material. Drag from the pin of the Constant4Vector into an empty space in the main graph and release the left mouse button. A contextual menu should now appear, where you'll want to type ComponentMask—that's the node we need to create now!
- With the component mask selected, let's take a look at the Details panel. In there, you'll be able to select which of the four components of the Constant4Vector node you want to use. In our case, as we'll be driving the opacity through the alpha, just tick the last option.
- Finally, connect the mask to the Opacity pin. Click on the Apply button and save the material. The preview window may take a moment to update itself, but once it does we should be looking at a translucent material like the following:
Now that we have our material correctly set up, let's apply it to the model in our scene. If you've opened the level that I've set up for you, 01_04_TranslucentMaterials_Intro, you'll see that we have an object called SM_Glass. If you are working in your own project, just create a model to which we can apply this newly created material. In any case, the scene should look something like this after you apply the new material:
Simple but effective! In the future, we'll be taking a look at how to properly set up a more complex translucent material, with reflections, refractions, and other interesting effects. But for now, we've taken one of the most important steps in that path—actually starting to walk!
Translucent materials are really tricky to tackle in real-time renderers—and we are starting to see why. One hint that you might have been able to spot is that we aren't using a different shading model to create glass. Instead, we are just using a different blend mode. So what is the difference between these two concepts, and how is driving translucent materials through the latter indicative of their rendering complexity?
First of all, a shading model is a combination of mathematical expressions and logic that determines how models are shaded or painted with light. One such model will describe how light behaves when it comes into contact with a material that uses said shading method. We use as many different models as we need in order to describe the different materials we see on our daily lives—for example, the way light scatters through our skin or the way it does the same on a wooden surface. We need to be able to describe that situation in a way that our computer programs can tackle that problem.
With that in mind, you could think that we should have a different shading model to describe translucent materials. However, things are a bit more complex in real-time renderers as the calculations that we would need to have to realistically simulate that model are too expensive performance-wise. Being always on the lookout for efficiency and speed, the way that Unreal has decided to tackle this issue is by creating a different blend mode. But what is that?
You can think of blend modes as the way the renderer combines the material applied to a model in the foreground with whatever is happening in the background. Up until now, we've seen two different types—opaque and translucent.
The opaque blend mode is the easiest one to comprehend: having an object in front of another will hide the second one. This is what happens with opaque materials in real life—wood, concrete, bricks, and so on. The translucent mode, however, lets the previously hidden object be partially visible, according to the opacity value we feed into the appropriate slot.
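That "show the background through the foreground" operation is classic alpha compositing, which can be sketched in a few lines of Python (illustrative only—this is the textbook "over" formula, not engine code):

```python
def blend_opaque(src, dst):
    """Opaque blend mode: the foreground completely hides the background."""
    return src

def blend_translucent(src, dst, opacity):
    """Translucent blend mode, the classic 'over' operation:
    result = foreground * opacity + background * (1 - opacity)."""
    return tuple(s * opacity + d * (1.0 - opacity) for s, d in zip(src, dst))

glass = (0.2, 0.4, 0.6)  # foreground color of our glass material
wall = (1.0, 0.0, 0.0)   # whatever was already rendered behind it

half = blend_translucent(glass, wall, 0.5)  # background partially visible
solid = blend_translucent(glass, wall, 1.0)  # behaves like opaque
clear = blend_translucent(glass, wall, 0.0)  # fully invisible foreground
```

This also hints at why we avoided setting the alpha to exactly 0 or 1 earlier: at those extremes, the blend collapses into either the background or the foreground alone, and the glass effect disappears.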
This is a neat way of implementing translucency, but there are some caveats that the system introduces that we have to be aware of. One such issue is that this blend mode doesn't support specularity, meaning that seeing reflections across the surface is a tricky effect we will have to overcome later on. But don't worry, we'll definitely get there!
This introductory chapter has so far laid out some of the foundations of the PBR workflow that Unreal introduces. With that pipeline as our main focus, we've already taken a look at several of its key components—namely the different material parameters and shading models.
However, as we've said in the past, PBR takes information from the lights in our scene to display and calculate how everything should look. So far, we've focused on the objects and materials that are being rendered, but that is only part of the equation. One of the other parts is, of course, the light emitters themselves.
Lights are crucial to the PBR workflow. They introduce shadows, reflections, and other subtleties that affect how the final image looks. They work alongside the materials that we've previously applied by giving value to some of the properties we set up. Roughness textures and normal maps work in tandem with the lights and the environment itself. And all of this combined is also an integral part of the pipeline we are looking at in this introductory chapter.
With that as our objective, let's create different types of lights in this recipe and see how they affect some of the materials we have previously created. We'll be taking a look at the all-important High-dynamic-range imaging (HDRi) maps—32-bit textures that store lighting information and can be used to light up a scene. Let's get started!
You can use the scene we created at the beginning of the book, where we set up a studio environment. We took some time in the introduction to set it up just so we could place several objects and visualize them. At that point, we just wanted to create something quick and useful, and one of the things we did was to use one of the resources already available in the Starter Content: the BP_LightStudio blueprint. Through that, we've already had access to HDRi lighting, the topic that we are going to be covering in this recipe.
With that in mind, we are now going to explore how to use this type of lighting to its full potential and create a realistic scene through it.
We will start this recipe by placing a reflective object in our default scene and looking at how certain aspects of the environment can be seen reflected in its surface. Take the following steps:
You can see that I've applied a material to the model, named M_Chrome. This is a copy of the material we created in our third recipe, named M_PBR_Metal, where we've modified the base color and the roughness value to make it look more like chrome. Thanks to its reflective properties, we can see the environment clearly. This is happening thanks to the HDRi image we are using. We are now going to replicate this effect without using the blueprint that was already set up for us; we will instead create our own.
One of the things we want to move away from in the setup we are going to create is having the environment image visible at all times. You might think that the metal ball is reflecting the image you see in the preceding screenshot rather than the actual light—and that would be only natural, as you are seeing that image in the background. This is, however, just a visual cue that the blueprint uses to better visualize where the environment lighting is hitting an object from. Having said that, let's start working with the basic building blocks rather than pre-made tools, to better understand how things work.
- Delete the BP_LightStudio blueprint and the SphereReflectionCapture and click on the Build icon—we should now be looking at a completely dark scene.
- From the Modes panel, navigate to the Place tab and to the Lights section within it. You should be able to find a Skylight, the type of light that we can use to illuminate with HDRi textures. Drag and drop it into the scene as follows:
- With the newly created skylight selected, navigate to the Details panel and look at the options under the Light section. The first option in the drop-down menu says SLS Captured Scene, which uses the already existing scene to create the lighting. We want to change that value to the other available option, SLS Specified Cubemap. Once that's done, select a Cubemap from the next drop-down menu—let's go with the one we've used in the past, HDRI_AlexsApt, as follows:
- After selecting the texture, you will be able to check for yourself that nothing has changed; we are still looking at a black screen. This is because the light that was spawned is of the Static type by default, and skylights of that type need to be built before we can see their effect. Click on the Build icon again and see what happens:
As you can see, going from a static type of light to a dynamic one gets us the reflections back. This is due to the fact that static lights only exist during the light baking process—that is, when we click on the Build button. In order to use HDRis to their full extent, we should be aiming for a dynamic or stationary type of light.
Let's focus once again on the metallic ball under this new dynamic skylight that we now have. There might be a bit of a problem, if we look closely:
You can see that there's a black edge across the surface of the ball, which happens because the skylight, by default, only uses half of a sphere to project the selected texture. This is because objects are usually not lit from underneath, and we might be fine with that sometimes. However, we can solve it by selecting the next option:
You might be inclined to fill your scene with geometry so as to hide the emptiness being reflected in the chrome ball. However, Unreal doesn't render the objects that the camera can't see—so the reflections that should come from objects behind the camera won't show up at all. This is one of the sacrifices that real-time rendering makes in order to be so efficient, so keep that in mind! We can solve it by placing a reflection capture, as we'll see next.
However useful having a full spherical HDRi skylight lighting our scene might be, it can also introduce some undesired effects. For instance, we might want to use the actual geometry of our level, rather than the HDRi, to affect the lower part of the chrome ball. If that's the case, tick the Lower Hemisphere Is Solid setting again and let's try something different.
- Place some planes around the level, in a similar fashion to what I'm doing in the next screenshot. This is just to mimic a scenario where we would have more geometry throughout the level, which could be used for reflections, so we don't see the black band across the reflective ball we saw before. Assign those planes a different material—I'm using M_Basic_Wall from the Starter Content pack:
- In order to fix this, go to the drop-down menu to the right of the Build icon and select the option Build Reflection Captures:
Throughout the current recipe, we've had the opportunity to work with HDRi lighting. The lights that make use of this technique are usually of the Skylight type in Unreal Engine 4, a particular kind that allows for the input of the textures that contain the necessary lighting information.
As we've said before, HDRi images capture the lighting state of a particular scene so that this information can be used in a 3D environment. They do this by sampling the same environment multiple times under different exposure settings. Taking multiple pictures this way allows for their combination at a post-processing stage, where the differences in lighting can be interpolated to better capture how the scene is lit.
What's important to us is that we need to be on the lookout for the right type of textures. HDRi images need to be in a 32-bit format, such as .hdr or .exr, as each pixel stores a far wider range of luminance values than a standard 8-bit image can hold. You might find HDRi images in a non-32-bit format, but these don't contain as much lighting information as the real ones because of the format they use.
Another parameter to take into consideration is the number of f-stops that a given HDRi image is composed of. This number indicates how many pictures were taken under different exposures to compose the HDRi. A value of five means that the HDRi was created out of five interpolated images, and a value of seven means that seven were used instead. More pictures mean a wider range of values and a consequent increase in information. It is a case of the more, the better, as seen in the next screenshot:
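The merging step can be illustrated with a small Python sketch. The pixel values below are hypothetical, and real HDR assembly also weights samples and calibrates the camera's response curve, but the core idea is this simple: dividing each unclipped sample by its exposure recovers the same underlying radiance.

```python
def merge_exposures(brackets):
    """brackets: list of (pixel_value_0_to_1, relative_exposure) pairs.

    Discard clipped samples (near 0 or 1, where the sensor lost
    information) and average the exposure-corrected remainder.
    """
    usable = [value / exposure for value, exposure in brackets
              if 0.01 < value < 0.99]
    return sum(usable) / len(usable)

# A bright window pixel across five bracketed shots: it clips to white
# at the longer exposures but stays measurable at the shorter ones.
window = [(1.0, 1.0), (1.0, 0.5), (0.8, 0.25), (0.4, 0.125), (0.2, 0.0625)]
radiance = merge_exposures(window)
# The recovered radiance is ~3.2 -- far beyond the 0..1 range that a
# standard 8-bit image could ever store, which is exactly what makes
# HDRi maps usable as light sources.
```

Each extra f-stop in the bracket extends the range of radiance values that survive this merge, which is why more exposures produce a richer light source.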
These photographs are a sequence of different images that make up an HDRi. HDRi by HDRi labs.
In this recipe, we've taken a look at several key concepts in the PBR workflow—image-based lighting, reflections, and the different mobility types a light can belong to. These elements, while not a part of the material pipeline themselves, are an essential part of the whole physically based approach at rendering that Unreal has at its core. They work hand-in-hand with the materials we create, expanding their capabilities and complementing the base properties we define them to have. Think about it—there's not much use having a highly reflective material if we don't tell the engine how to treat those reflections. Hope you found this useful!
In this recipe, we are going to be looking at the impact that our materials have on performance. So far, this introductory chapter has gone over the basics of the rendering pipeline—we've seen how to create a physically based material, understood what the different shading models were, and saw how light played a key role in the overall look of the final image. However, we can't yet move on without understanding the impact that our games or applications have on the machines that are running them.
The first thing we need to be aware of is that some materials and effects are more expensive than others in terms of efficiency. Chances are you have already experienced this in the past—think, for example, about frames per second in video games. How many times a second our displays are updated by the hardware that runs them directly influences how the game plays and feels. There are many elements that affect performance, but one determining factor in that equation is how complex our materials are. A different example, if your background is more closely tied to traditional offline renderers such as V-Ray or Corona, is how rendering times vary wildly depending on how complex the materials being rendered are. Using subsurface scattering, complex translucency, or a combination of multiple advanced effects can take render times from minutes to hours.
The point is that we need to be able to control how performant the scene we are creating is. Unreal offers us several tools that allow us to see how expensive certain effects and materials are, and check where we should be optimizing our assets or where certain things aren't working. With that in mind, let's bring all of the assets we have previously created together and use those tools to check them out.
All we need to do before starting this recipe is to load up the map called 01_06_TheCostOfMaterials. As you can see, it's the usual scene we have been working with up until now, except that it now has a couple more models in it. Feel free to bring your own meshes and materials, as we are going to be examining them from a technical point of view. All we care about at this point is having multiple elements to look at, and having materials that use different blend modes is especially useful, as we will be able to see the difference in performance between them.
No matter if you've opened the level provided with this book or one of your own, we are going to be looking at the rendering cost that materials incur when being displayed. To do so, we'll be taking a look at several different indicators that can help us understand our scenes a little bit better. Take the following steps:
- Let's start by taking a look at the following scene:
I've included four different objects with their respective materials applied, which should help us understand the cost to performance that each one of them has.
- Continue by selecting the chrome ball (named SM_ReflectiveSphere) and navigate to the Details panel, specifically to the Materials tab. Double-click on the material that is currently applied to the model to open the material editor.
- With that editor in front of us, let's take a look at the Stats panel. The values we see in there can give us an approximation of how expensive the material is to render. You can see that this M_ReflectiveSphere material has 115 instructions for the base pass shader, 135 if we are using static lighting, and 191 if we use movable lights. The numbers themselves become useful once we compare them to those of other materials.
- Let's go back to the main scene and select the object named SM_Glass. Open up the material it has applied, just like we did for the reflective ball, and look at the stats panel again:
As you can see, the instruction count is much higher than in the last example. This is because the complexity of translucent materials is higher than that of opaque ones, and we can see that here.
- After enabling the Shader Complexity viewmode, you should be looking at something like this:
This is a more visually appealing way of looking at the shader complexity. However, it is one that is not 100% accurate, as Unreal only takes the instruction count as a reference to calculate the gradient you are seeing in the preceding image and not the complexity of the material's nodes themselves.
You might see similar values for two different materials that are really not equal in terms of their complexity—for instance, a material that is made out of several textures versus one that uses simple constants might show a similar complexity in this viewmode when in reality using the first is more demanding on the Graphics Processing Unit (GPU).
Now that we've seen one of the optimization viewmodes, why stop with just that one? All of them are useful for understanding how our scene is working from different technical points of view. Let's go over them in a quick way to see how they can help us.
- The first of these modes is called Light Complexity. It can help us analyze how expensive the different lights in our scene are. Toggle it on and let's see how our scene looks.
- At first, you'll see that the whole scene is rendered in black. This is because we are using a static HDRi type of light—and as the lighting has already been calculated, there's no light complexity at this stage at all. You can only see the objects I've selected being outlined in yellow for reference purposes:
- If we place a new point light, you'll be able to see how the scene turns blue. This is to indicate that there's some complexity to the scene, but this is just as cheap a lighting method as they come:
- Placing more lights will change how your scene looks—getting away from its original blue color. That means that lighting is becoming more complex and costly for our hardware to compute, so keep that in mind! The following screenshot is what our scene looks like in that viewmode with seven different lights:
- Another viewmode related to the previous one is Stationary Light Overlap. If we have multiple stationary lights, it tells us how expensive our scene is to render by showing where those lights overlap—the more overlap, the higher the cost.
- Finally, the Lightmap Density viewmode shows you how dense the lightmaps are for the objects that occupy your scene. Using static or stationary lights means that static objects will have their shadows baked, and this is the viewmode that lets us see whether the settings we've chosen for our models are evenly distributed. Let's take a look at the two following examples.
- In this first set of images, we've set the lightmap resolution for both wood planes to a high value of 1,024. As you can see in the first image, that means the shadows look correct across both surfaces:
- In the next set of images, we've lowered the resolution, but much more dramatically for the vertical wood plane. As a result, the vertical plane has much lower-quality shadows—to the point where they are barely visible—even though they are still present on the horizontal plane:
There are a couple more viewmodes that we haven't talked about, but they deal with the number of polygons that a model comprises and are not related to the materials we are using. You can take a look at them in the same panel we saw before—they are called Quad Overdraw and Shader Complexity & Quads. They can be very useful for diagnosing our scenes, especially when we have many high-poly meshes or semi-transparent models, so keep them on your radar in case you ever need them!
As we've seen in previous recipes, materials are not homogeneous entities—and we are not even talking about the ones in real life, but about the ones we have created within Unreal. The mathematics and functions used to describe the different shading and blend modes carry a cost that varies from one type to the next. Knowing exactly how heavy each of them is can be a complicated task, but having an overall idea is key to running a well-oiled application.
In the previous pages, we've taken a look at some examples, which included an opaque material and a translucent one—examples that we've worked on in the past. However, we need to keep in mind that there are more types we can—and will—encounter in the future. Unreal includes the following different shading models, which I will list now in order of how costly they are to render:
- Default Lit
- Preintegrated Skin
- Clear Coat
- Subsurface Profile
Of course, the actual cost of a material depends on how complex we make the graph for each of them, but that previous order applies to base materials with nothing more applied to them. On top of that, there are options within each type that can make them more or less expensive to render: making a material two-sided or using a particular type of translucency can increase the cost to the GPU, for example.
On top of this, there are other things to be considered in terms of efficiency that we might want to keep in mind. Epic has created some performance guidelines for artists that highlight where we should be focusing our attention in order to keep our applications running well. You can take a look at them at the following link: https://docs.unrealengine.com/en-us/Engine/Performance/Guidelines.
We've used this recipe to take a look at how fast Unreal can process different types of shaders. We've done so by comparing an opaque material against a translucent one, which gives us a good idea about how instruction counts vary and how efficient some shaders are compared to others. Not only that, we've also had the opportunity to see what optimization tools are available for anyone using the engine. All in all, there is a wide variety of options that give the user control over how well their application runs, and now we are in a position in which we know how to use them.