Introduction to HLSL language


Distance/Height-based fog

Distance/Height-based fog approximates the fog you would normally see outdoors. Even on the clearest of days, you should be able to see some fog far in the distance. The main benefit of adding the fog effect is that it helps the viewer estimate how far away different elements in the scene are, based on the amount of fog covering them. In addition to the realism this effect adds, it has the further benefit of hiding the end of the visible range. Without fog to cover the far plane, it becomes easy to notice when distant scene elements are clipped by the camera's far plane.

By tuning the height of the fog you can also add a darker atmosphere to your scene as demonstrated by the following image:

This recipe will demonstrate how distance/height-based fog can be added to our deferred directional light calculation. See the How it works… section for details about adding the effect to other elements of your rendering code.

Getting ready

We will be passing additional fog-specific parameters to the directional light's pixel shader through a new constant buffer. The reason for separating the fog values into their own constant buffer is to allow the same parameters to be used by any other shader that takes fog into account. To create the new constant buffer, use the following buffer descriptor:

Constant buffer descriptor parameters
The rest of the descriptor fields should be set to zero.

All the fog calculations will be handled in the deferred directional light pixel shader.

How to do it...

Our new fog constant buffer is declared in the pixel shader as follows:

cbuffer cbFog : register( b2 )
{
    float3 FogColor : packoffset( c0 );
    float FogStartDist : packoffset( c0.w );
    float3 FogHighlightColor : packoffset( c1 );
    float FogGlobalDensity : packoffset( c1.w );
    float3 FogSunDir : packoffset( c2 );
    float FogHeightFalloff : packoffset( c2.w );
}

The helper function used for calculating the fog is as follows:

float3 ApplyFog(float3 originalColor, float eyePosY, float3 eyeToPixel)
{
    float pixelDist = length( eyeToPixel );
    float3 eyeToPixelNorm = eyeToPixel / pixelDist;
    // Find the distance from the fog starting distance to the pixel
    float fogDist = max(pixelDist - FogStartDist, 0.0);
    // Distance-based fog intensity
    float fogHeightDensityAtViewer = exp( -FogHeightFalloff * eyePosY );
    float fogDistInt = fogDist * fogHeightDensityAtViewer;
    // Height-based fog intensity
    float eyeToPixelY = eyeToPixel.y * ( fogDist / pixelDist );
    float t = FogHeightFalloff * eyeToPixelY;
    const float thresholdT = 0.01;
    float fogHeightInt = abs( t ) > thresholdT ?
        ( 1.0 - exp( -t ) ) / t : 1.0;
    // Combine both factors to get the final factor
    float fogFinalFactor = exp( -FogGlobalDensity * fogDistInt *
        fogHeightInt );
    // Find the sun highlight and use it to blend the fog color
    float sunHighlightFactor = saturate(dot(eyeToPixelNorm, FogSunDir));
    sunHighlightFactor = pow(sunHighlightFactor, 8.0);
    float3 fogFinalColor = lerp(FogColor, FogHighlightColor,
        sunHighlightFactor);
    return lerp(fogFinalColor, originalColor, fogFinalFactor);
}
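To make the fog math easier to experiment with offline, here is a CPU-side Python sketch of the fog-factor portion of ApplyFog. The constant values are hypothetical tuning values chosen for illustration, not values taken from the recipe:

```python
import math

# Hypothetical tuning values standing in for the cbFog constants
FOG_START_DIST = 20.0
FOG_GLOBAL_DENSITY = 0.02
FOG_HEIGHT_FALLOFF = 0.05

def fog_final_factor(eye_pos_y, eye_to_pixel):
    """CPU-side port of the distance/height fog factor from ApplyFog.

    Returns a value in (0, 1]: 1.0 means no fog at all, values near 0
    mean the pixel is almost fully covered by fog.
    """
    pixel_dist = math.sqrt(sum(c * c for c in eye_to_pixel))
    # Distance traveled inside the fog (zero before the start distance)
    fog_dist = max(pixel_dist - FOG_START_DIST, 0.0)
    # Distance-based intensity, scaled by the fog density at eye height
    fog_height_density_at_viewer = math.exp(-FOG_HEIGHT_FALLOFF * eye_pos_y)
    fog_dist_int = fog_dist * fog_height_density_at_viewer
    # Height-based intensity over the vertical drop inside the fog
    eye_to_pixel_y = eye_to_pixel[1] * (fog_dist / pixel_dist)
    t = FOG_HEIGHT_FALLOFF * eye_to_pixel_y
    fog_height_int = (1.0 - math.exp(-t)) / t if abs(t) > 0.01 else 1.0
    return math.exp(-FOG_GLOBAL_DENSITY * fog_dist_int * fog_height_int)
```

With these values, a pixel a few hundred units away at eye level keeps only a small fraction of its original color, while anything closer than FogStartDist is left untouched.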

The ApplyFog function takes the color without fog, the camera height, and the vector from the camera to the pixel the color belongs to, and returns the pixel color with fog applied. To add fog to the deferred directional light, change the directional light entry point to the following code:

float4 DirLightPS(VS_OUTPUT In) : SV_TARGET
{
    // Unpack the GBuffer
    float2 uv = In.Position.xy;
    SURFACE_DATA gbd = UnpackGBuffer_Loc(int3(uv, 0));
    // Convert the data into the material structure
    Material mat;
    MaterialFromGBuffer(gbd, mat);
    // Reconstruct the world position
    float2 cpPos = In.UV.xy * float2(2.0, -2.0) - float2(1.0, -1.0);
    float3 position = CalcWorldPos(cpPos, gbd.LinearDepth);
    // Get the AO value
    float ao = AOTexture.Sample(LinearSampler, In.UV).x;
    // Calculate the light contribution
    float4 finalColor;
    finalColor.rgb = CalcAmbient(mat.normal, mat.diffuseColor.rgb) * ao;
    finalColor.rgb += CalcDirectional(position, mat);
    finalColor.w = 1.0;
    // Apply the fog to the final color
    float3 eyeToPixel = position - EyePosition;
    finalColor.rgb = ApplyFog(finalColor.rgb, EyePosition.y, eyeToPixel);
    return finalColor;
}

With this change, we apply the fog on top of the lit pixel's color and return the result to the light accumulation buffer.

How it works…

Fog is probably the first volumetric effect implemented using a programmable pixel shader, once those became commonly supported by GPUs. Originally, fog was implemented in hardware (the fixed-function pipeline) and only took distance into account. As GPUs became more powerful, hardware distance-based fog was replaced by a programmable version that also takes things such as height and a sun highlight into account.

In reality, fog is just particles in the air that absorb and reflect light. A ray of light traveling from a position in the scene to the camera interacts with the fog particles and gets changed by those interactions. The further the ray has to travel before it reaches the camera, the greater the chance that it gets either partially or fully absorbed. In addition to absorption, a ray traveling in a different direction may get reflected towards the camera and add to the intensity of the original ray. Depending on the number of particles in the air and the distance a ray has to travel, the light reaching the camera may contain more reflected light and less of the original ray, which leads to the homogeneous color we perceive as fog.

The parameters used in the fog calculation are:

  • FogColor: The fog's base color (this color's brightness should match the overall scene intensity so it won't get blown out by the bloom)

  • FogStartDist: The distance from the camera at which the fog starts to blend in

  • FogHighlightColor: The color used to highlight pixels whose camera-to-pixel vector is close to parallel with the camera-to-sun vector

  • FogGlobalDensity: Density factor for the fog (the higher this value, the denser the fog)

  • FogSunDir: Normalized sun direction

  • FogHeightFalloff: Height falloff value (the higher this value, the lower the height at which the fog disappears)

When tuning the fog values, make sure the ambient colors match the fog. This type of fog is designed for outdoor environments, so you should probably disable it when lighting interiors.

You may have noticed that the fog requires the sun direction. We already store the inverted sun direction for the directional light calculation. You can remove that value from the directional light constant buffer and use the fog vector instead, to avoid duplicated values.

This recipe implements the fog using the exponential function. The reason for using the exponential function is the asymptote on the negative side of its graph. Our fog implementation uses that asymptote to blend the fog in from the start distance. As a reminder, the graph of the exponential function is as follows:

The ApplyFog function starts off by finding the distance our ray travels inside the fog (fogDist). In order to take the fog's height into account, we also find how far the ray travels vertically inside the fog (eyeToPixelY). Both distance values are negated and scaled by the falloff and density values to be used as exponents. The reason we negate the distance values is that it is more convenient to use the negative side of the exponential function's graph, where its values are limited to the range 0 to 1. As the function equals 1 when the exponent is 0, we have to invert the results (as in fogHeightInt).

At this point we have one factor for the height which gets larger the further the ray travels vertically into the fog and a factor that gets larger the further the ray travels in the fog in any direction. By multiplying both factors with each other we get the combined fog effect on the ray: the higher the result is, the more the original ray got absorbed and light got reflected towards the camera in its direction (this is stored in fogFinalFactor).
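Putting the two factors together, the final factor computed by ApplyFog can be written in closed form. Using d for the distance traveled inside the fog, Δy for the vertical drop over that distance, y_e for the camera height, G for FogGlobalDensity, and k for FogHeightFalloff:

```latex
F = \exp\left( -G \cdot d \cdot e^{-k\,y_e} \cdot \frac{1 - e^{-k\,\Delta y}}{k\,\Delta y} \right)
```

This matches integrating an exponentially height-decaying density G·e^{-k·y} along the ray segment inside the fog: F close to 1 means the ray passed through almost no fog, and F close to 0 means it was almost fully absorbed.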

Before we can compute the final color value, we need to find the fog's color based on the camera and sun directions. We assume that the sun's intensity is high enough that more of its light gets reflected towards the camera when the camera-to-pixel direction and the sun direction are close to parallel. We use the dot product between the two to determine the angle, and narrow the result by raising it to the power of 8 (the result is stored in sunHighlightFactor). The result is used to lerp between the fog's base color and the fog color highlighted by the sun.
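The narrowing effect of the power function can be sanity-checked on the CPU; this short Python sketch mirrors the two highlight lines of ApplyFog:

```python
def sun_highlight(eye_to_pixel_norm, fog_sun_dir):
    """saturate(dot(E, S)) raised to the 8th power, as in ApplyFog."""
    d = sum(a * b for a, b in zip(eye_to_pixel_norm, fog_sun_dir))
    # Clamp to [0, 1] (saturate), then narrow the lobe with pow(..., 8)
    return max(min(d, 1.0), 0.0) ** 8.0
```

Looking straight at the sun gives a factor of 1.0, while at 60 degrees off the sun axis the dot product is 0.5 and the factor drops to 0.5^8 ≈ 0.004, so the highlight stays tightly concentrated around the sun direction.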

Finally, we use the fog factor to linearly interpolate between the input color and the fog color. The resulting color is then returned from the helper function and stored into the light accumulation buffer.

As you can see, the changes to the directional light entry point are very minor as most of the work is handled inside the helper function ApplyFog. Adding the fog calculation to the rest of the deferred and forward light sources should be pretty straightforward. One thing to take into consideration is that fog also has to be applied to scene elements that don't get lit, like the sky or emissive elements. Again, all you have to do is call ApplyFog to get the final color with the fog effect.


In this article, we learned how to apply a fog effect to add atmosphere to our scenes.

You've been reading an excerpt of:

HLSL Development Cookbook