Mastering CryENGINE

By Sascha Gundlach , Michelle K. Martin

About this book

CryENGINE is one of the most powerful real-time 3D engines available today; its rendering and real-time capabilities are unmatched. Mastering CryENGINE will take your CryENGINE skills to the next level, allowing you to unleash the full power of the engine. Whether you are working on a small mod or a large-scale production, the user-friendly steps and illustrations in this book will help you master the wide range of features of the CryENGINE toolkit.

The book begins with the setting up of a CryENGINE-ready production pipeline, further introducing you to the advanced features of CryENGINE. You will learn advanced techniques, best practices, and various methods utilized in production by CryENGINE veterans. You will also explore how to use the Lua scripting language to build more sophisticated gameplay elements.

Publication date:
April 2014


Chapter 1. Setting Up the Perfect Pipeline

Before the actual work on any new project can begin, you as a developer have to think about your production pipeline. Time spent on designing a robust pipeline is always time well invested. The larger the project ahead of you, the more important it is to set up a stable pipeline. In this chapter, we will discuss the following topics:

  • Production pipeline setup for CryENGINE projects

  • Using version control in CryENGINE projects

  • Setting up automated builds and build scripts

  • Integration of CryENGINE builds and versions

The goal of this chapter is to provide you with information and best practices on building a stable and flexible CryENGINE production pipeline.


What is a production pipeline?

In simple words, a CryENGINE production pipeline could be described as a series of operations you perform with the engine in order to create your product. Exporting a 3D asset or compiling C++ code, for example, are parts of a typical CryENGINE pipeline. Your production pipeline also defines how and to what standards you perform all project-related tasks.

When working with CryENGINE, those project-related tasks can include:

  • Exporting a 3D asset

  • Compiling code

  • Creating automatic builds

  • Processing bug reports

  • Checking files into your version control system

A pipeline is basically a number of rules and guidelines you set for yourself and your team to work on your project. If those rules make sense and fit your project, your life will become a lot easier. If they do not fit your project or if they do not exist yet, you will have a higher chance of running into all kinds of problems.


Importance of a strong pipeline

No matter what CryENGINE project you are about to start, you will have weeks or possibly months of work ahead of you. Being as prepared as you can for this should be your goal. Planning and preparing your pipeline will help you save time and work more efficiently during the lifespan of your whole project.

A well thought out pipeline, which is standardized and enforced within your team, will increase production speed significantly. In this chapter, we will discuss the most important aspects of a CryENGINE production pipeline.

Overview of the production pipeline


Version control for CryENGINE projects

The decision of which version control system (VCS) to use is one of the most important pipeline decisions to make when you are planning your CryENGINE project.

It is also one of the first things that should be discussed, since a lot of other aspects of your production will depend on it.

What version control does for you

Version control is incredibly useful in any area of game production. What a VCS basically does is keep track of changes made to your files and allows you to review those changes and even helps you revert to the older versions of your files.

This means if you make a mistake or delete or lose a file, it is generally very easy to recover whatever you have lost. In addition to this, using a VCS makes it a lot easier to collaborate since you will always have an overview of what changes your team members made to the project files.

Production without version control

Even today, where VCSs such as SVN, Perforce, or Git have become very affordable, or even free, there are still teams out there working without the safety net of a version control system.


Not using any version control whatsoever is always a bad decision for larger projects.

Working from a shared folder

One method often used by less experienced mod teams is to work out of a shared folder that is accessible to everybody on the network. While it might seem simple and easy to work this way, with everybody just copying their files into the shared folder, a lot of things can go wrong:

  • Files can get overwritten accidentally

  • Code and script conflicts cannot be caught and resolved easily

  • Tracing back older changes becomes extremely difficult

With low cost version control systems being widely available today, there is no reason even for small teams to work this way. Setting up and maintaining a version control solution will of course consume a certain amount of time, but it is always time well invested.

Let's have a look at a real-life example. You are working on a CryENGINE game project and you discover a game-breaking bug. Let's say someone on your team submitted something that broke the game. Now it is up to you to identify and fix the issue. Having no access to the file history or to the changes made to individual files will make it very difficult for you to solve the issue. In a project environment with a VCS set up, however, you could simply step backwards through the submitted changes to identify the file that broke your game.
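With a monotonic "was it broken at this change?" test, stepping backwards can even be automated as a binary search over changelist numbers. The sketch below is purely illustrative and VCS-agnostic; the is_broken predicate stands in for syncing to a change and running your game's smoke test:

```python
def find_breaking_change(changelists, is_broken):
    """Binary-search a sorted list of changelist numbers for the first
    change at which is_broken(change) becomes True.

    Assumes the project was healthy before changelists[0] and that once
    broken, it stays broken (is_broken is monotonic over the range).
    Returns the offending changelist number.
    """
    lo, hi = 0, len(changelists) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if is_broken(changelists[mid]):
            hi = mid          # breakage happened at mid or earlier
        else:
            lo = mid + 1      # breakage came after mid
    return changelists[lo]

# Hypothetical usage: changes 101..120 were submitted; 113 broke the game.
changes = list(range(101, 121))
culprit = find_breaking_change(changes, lambda c: c >= 113)
print(culprit)  # -> 113
```

Instead of testing every submitted change one by one, this narrows a day's worth of submissions down to the culprit in a handful of sync-and-test cycles.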

Selecting a VCS for CryENGINE projects

When it comes to deciding which VCS to use for your CryENGINE project, your decision will be determined by your budget, the scale of your project, and possibly your personal preference.

You will have to choose between a centralized and distributed VCS. While a centralized VCS keeps all files on a central server, a distributed VCS mirrors the whole repository on each client. Both systems come with different upsides and downsides, but for CryENGINE, it makes no difference which type of system is used.

There are many VCSs available today, and they come in many flavors. Most commonly used VCSs for CryENGINE projects are as follows:

  • Perforce: This is sometimes also called P4 and is a professional centralized VCS, which is usually the tool of choice for professional and larger game teams. Perforce licenses are generally not free, but there are various license options that allow indie and mod teams to make use of the software without spending much money. CryENGINE has native support for Perforce and allows you to check files in and out directly from Sandbox.

  • SVN: This is also called Subversion and is a free, open source centralized VCS. It is widely used by smaller teams without a big budget, since it can be used without any cost.

  • Git: This is also a free to use open source VCS. It differs from Perforce and SVN by using a distributed architecture. In direct comparison to Perforce and SVN, Git can be quite difficult to use, especially for developers with a nontechnical background.

Setting up version control for CryENGINE

Once you have made your decision regarding which version control system to use for your project, it is time to set up your CryENGINE environment. Depending on your role in your game's production, certain aspects of this setup might be more or less interesting to you. For example, if you are a programmer, you might be less interested in learning how to set up Photoshop or 3ds Max and prefer to skip ahead to the relevant coding topics.


Being able to check out levels, layers, or material files directly from Sandbox without switching to your version control client is very convenient and will speed up your workflow considerably.

Support for Perforce version control is integrated into the Sandbox editor. Sandbox will automatically check out the corresponding files when they are being modified.

When using SVN, Git, or any other system, files cannot be directly checked in/out from Sandbox. In this case, no further setup is necessary.

Perforce setup

The first step to setting up Sandbox to work with Perforce is to enable version control. This is done in the Sandbox preferences as follows:

  1. Open the Preferences window from the Tools menu.

  2. In the Preferences menu, go to General Settings | General.

  3. Make sure the Enable Source Control checkbox is checked.

This will tell Sandbox to use version control for file operations. CryENGINE will get the Perforce server information from the Windows environment settings, so you don't need to set them up manually. To review these environment settings in Perforce, click on Environment Settings in the Connection menu as seen in the following screenshot:

You can change the Server and Workspace settings here as seen in the following screenshot:

Once all your version control options are set up for CryENGINE, you will see a window as seen in the following screenshot when saving your level:

This is CryENGINE telling you that the level you are trying to save is write protected and probably version controlled. At this point, you have the choice to either overwrite your level files or check them out from Perforce.

Exploring digital content creation tools such as Photoshop and 3ds Max

When creating content for CryENGINE projects, you will most likely be using tools such as Photoshop, 3ds Max, or Maya to create 3D assets, animations, and textures.

While those tools are not directly connected to CryENGINE, it still makes sense to set them up to connect to the same version control system that is storing your code and level files.

There are a lot of plugins that exist for the various VCSs, which will allow you to check out files directly from Photoshop, Max, Maya, or XSI.

Perforce, for example, offers quite a comprehensive free set of plugins for all major tools on their website called Perforce Graphical Tools Plug-in (P4GT). This is basically a big collection of plugins for all kinds of tools and systems. You can find it at

For SVN and Git, there are several plugins available on the internet as well.

Visual Studio

If you will be modifying the source code of CryENGINE for your project, you will most likely be using Visual Studio to implement and compile your changes.

There is a Perforce add-on for all recent Visual Studio versions available. This plugin will automatically check out source files when modified, and add newly created files to the repository. This can save a lot of time when working with the source code.

The plugin can be installed from within the GUI of Visual Studio. To do so, open the Extension Manager option from the Tools menu. The plugin is called P4VS. The fastest way to find it is to search for the term P4 in the online gallery.

After installation, you have to activate the plugin in the Visual Studio options. In the menu Tools, choose Options. Then navigate to the section called Source Control. In the subsection Plug-in Selection, open the drop-down box and select P4VS. Then you can configure your connection settings.


All pictures are taken from the Visual Studio Version 2010 and might differ slightly from other versions.


When editing text files, such as XML or Lua files, you will likely use a text editor such as Notepad++ or Microsoft XML Notepad 2007. We recommend Notepad++, as it has syntax highlighting and a range of helpful plugins, such as an XML syntax checker. There is also a plugin available for Perforce. When installed, this plugin will automatically check out files when you edit them and add them to the default change list in Perforce.

The plugin can be installed through the Plugin Manager in Notepad++. It can be opened via the Plugins menu. Choose the plugin called Perforce actions from the list of available plugins and select Install.

Just like CryENGINE, the plugin will work with the Perforce environment settings, which can be set up via the GUI as described in the section Setting up version control for CryENGINE.

Identifying CryENGINE project files to be excluded from version control

Just because you are making use of a VCS does not mean you need to add every single file of your CryENGINE project to the depot. Checking in files that should not be version controlled can actually be counterproductive to your workflow. It is important to know which types of files to keep out of your VCS. Generally, files created during the build process should never be checked in.


To keep your build clean, you should avoid checking files into your VCS that are automatically generated during the build process.

The files which are usually never checked into a VCS are as follows:

  • .DDS files: These are generated automatically by the CryENGINE resource compiler. You should not create .DDS files manually and check them into your VCS. The engine will take any existing .TIF file found in your project and compile an optimized .DDS texture, which is used at runtime. When those .DDS files are locked and write protected by a VCS, they cannot be overwritten by the resource compiler. When this happens, your build will keep using outdated .DDS files, which can cause rendering problems and graphical artifacts.

  • Binaries: Usually, CryENGINE executable files such as the GameSDK.exe, Editor.exe, and the various CryENGINE .DLL files are not checked into your VCS since they are regularly built by your project's coders, locally on their machines. While all source code files should always be checked in, those executables will normally be rebuilt every day and do not need to be checked in.

  • .GFX files: When Scaleform is used in production, .GFX files are created by the gfx exporter. These files should not be checked in but ought to be generated automatically during the build process. For certain files, the gfx exporter also generates .DDS files as part of the .GFX export, which is another reason to run this step as part of the automated build.
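As an illustration, the exclusions above can be captured in an ignore file for your VCS (shown here in .gitignore-style syntax; Perforce supports similar patterns via P4IGNORE). The exact paths are examples only and depend on your project layout:

```
# Compiled textures -- regenerated from .TIF sources by the resource compiler
*.dds
# Scaleform output -- regenerated by the gfx exporter during the build
*.gfx
# Engine and game binaries -- rebuilt locally by programmers
Bin32/*.exe
Bin32/*.dll
Bin64/*.exe
Bin64/*.dll
```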


Automated builds and build scripts

Having a system in place to create CryENGINE project builds automatically on a regular schedule can be very useful. Having the ability to switch back and forth between older and newer builds can be vital when hunting down difficult-to-fix bugs and other problems.

Although CryENGINE builds can be created manually and then copied to a storage location, it is much easier to set up a system to automate these tasks.

Creating nightly builds

Professional game development teams often have a procedure in place to create so-called nightly builds. These are builds created by a build server every night and distributed to the team the next morning. Level designers, artists, and other developers who do not work directly with C++ code or scripts are very comfortable working with nightly builds.

Those team members can just copy a fresh build every morning and can rely on the fact that everything has been properly compiled with all the latest code changes, and all .DDS files are properly created by the resource compiler. These automated builds can also serve as release candidates for your project and be used for QA testing.

Setting up a build server

In order to create automatic builds, you need to set up and configure a build server first. Your build server can be anything from a regular PC to a specifically designated workstation. Unless you are working on a really large scale project, any regular PC will do the job. You do not need to buy expensive server hardware if you are just intending to make a couple of builds per day.

In larger production environments where one build server has to handle builds for multiple large projects, stronger hardware will be needed, while for a single project with only one or two automatically generated builds per day, a normal desktop PC will do the job. Build servers for larger teams usually compile the code several times a day in order to catch bad check-ins as fast as possible. The more coders work on the codebase, the more frequently the server should run the automatic compilation. Larger professional teams will usually have dedicated personnel, called build engineers, responsible for setting up the build servers and creating and maintaining the build scripts.

Your build server should not be the same PC you are working on, since the process of compiling a build will of course slow your machine down considerably. Using a dedicated build server rather than a local workstation also eliminates the risk of local changes ending up in an automated build.

Operating systems

Since the build process requires you to run the CryENGINE resource compiler as part of the asset compilation process, the build server should be running a version of Microsoft Windows. The CryENGINE resource compiler currently runs only on Microsoft Windows operating systems. Using Microsoft Windows also has the advantage that you can use the Windows scheduler to have your build scripts run automatically every night.

What build scripts should do

Once you have your build server hardware set up, it is time to create a set of build scripts, which will take care of automatically producing builds for you. Your goal will be to create a build script which takes all the latest changes done to your project by all team members and compile a new and clean build from them. Your build script should perform the following tasks:

  • Gathering all the latest C++ code and Lua scripts

  • Generating .DDS files for all existing .TIF files

  • Distributing the completed build to a central network location

Depending on your project, there could be several optional tasks you might want your build scripts to perform, such as the following:

  • Performing automated performance tests

  • Creating automated level benchmarks

  • Exporting all your game's levels

  • Uploading an archived version of the build to an FTP location

  • Creating change logs containing all changes done

A typical build process looks like this:

Overview of the build process
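If you prefer a higher-level driver over plain batch, the same flow can be sketched in Python. The stage names and commands below are placeholders for illustration, not the actual calls used later in this chapter:

```python
import subprocess
import sys

def run_build(stages, runner=subprocess.call):
    """Run each (name, command) stage in order; return the name of the
    first failing stage, or None if every stage succeeded. The runner
    is injectable so the flow can be tested without the real tools."""
    for name, cmd in stages:
        print(f"=== {name} ===")
        if runner(cmd) != 0:
            print(f"Build failed during stage: {name}", file=sys.stderr)
            return name
    return None

# Ordered stages; every command here is a placeholder for the real call.
STAGES = [
    ("sync",            ["p4", "sync", "//PROJECTNAME/...#head"]),
    ("compile code",    ["devenv.com", "CryEngine.sln", "/rebuild", "Profile|x64"]),
    ("compile assets",  [r"Bin32\RC\rc.exe", r"Game\Textures\*", "/p=PC"]),
    ("copy to network", ["robocopy", r"G:\BuildWork", r"\\server\Builds"]),
]

# run_build(STAGES) would execute the real pipeline on the build server.
```

Stopping at the first failing stage mirrors what the batch script in the following sections achieves, and returning the failed stage name makes it easy to report where the nightly build broke.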

Creating your custom build script

Now it is time to get your hands dirty and create your own build script. But don't worry, you won't have to write it all on your own, since you will be provided with a sample build script, which you can use as a base and modify to suit your needs. There is a variety of scripting languages available you could use to create your build scripts. In the following example, we are using simple batch files. Of course, you are free to use a more sophisticated scripting language, such as Python.

For this example build script, we are using Perforce and some additional software, specifically:

  • Beyond Compare 3

  • WinRAR

  • Visual Studio 2010

  • Python 3.0 or higher


There is no need to go out and buy all this software if you just want to test the build scripts. You can just use the trial versions of the software and then decide later whether you would like to keep on using them.

Before starting with the actual scripts, the computer that will function as the build server needs to be set up properly. The following steps need to be done:

  1. Install the following software:

    • Beyond Compare 3

    • WinRAR

    • Perforce

    • Visual Studio 2010

    • Python 3.0 or higher

  2. Set up a Perforce user for use in the build script, called a buildbot.

  3. Create a folder for the build scripts.

  4. Create a folder for the builds to be stored.

  5. Download and extract the build script bundle from <webadress>.

  6. Modify the settings inside CreateFullBuild.bat with your data.

The example build scripts provided in this book require the use of a so-called diff tool. In the examples provided here, we use Beyond Compare 3. The software can be downloaded from

Of course, different types of software can be used and you will just need to adjust the appropriate portions of the build script. In order to follow the examples provided in this book, the free trial version of Beyond Compare 3 will be sufficient.

This section assumes that you have a physical build server and a VCS set up already. We will be using Perforce in our example build scripts, but you can easily adjust the scripts to use SVN, Git, or any other VCS instead.

Let's have a look at the tasks that the build bot needs to perform:

  • Getting the latest code and assets from Perforce

  • Copying relevant data to a work folder

  • Compiling the code in 32- and 64-bit

  • Compiling the assets

  • Creating the PAK files

  • Packing up the code

  • Moving build to the target folder

The actual build script is rather long, so let's break it down into smaller sections and look at each section individually. To make it easier for you, the places in the script that you need to change and replace with your own folder names and project dependent settings will be pointed out in detail.

In your production environment, you can use the files from this script bundle or write your own.

Writing your own script

To write your own script, start by creating a new text file inside your build scripts folder and call it CreateBuild.bat. Then edit it with a text editing program, such as Notepad++ and copy or type the script lines from this book. To make things easier, a text editor with syntax highlighting should be used.

The purpose of this first part of the script is just to print out some status information as follows:

@echo off
cls
echo Creating a new full project build
echo New build started at: %date% %time%



These first lines simply clear the console and print the current time and date onto the screen (or into a logfile). Nothing is actually happening yet. You can safely delete or modify these lines as you like.

Printing out console output is always helpful as it gives some feedback as to where in the script the process currently is. A full build on a regular PC can take over an hour, depending on the amount of assets that need to be compiled. Knowing the current stage of the build process can be helpful to you.

These outputs will also give you insight into where your build failed if it was unsuccessful.

The next lines of the script focus on a general folder setup as follows:

G:
set BuildBotPath=G:\BuildScripts
set BuildSourcePath=G:\BuildSource
set BuildWorkPath=G:\BuildWork
set BuildTargetPath=G:\BuildArchive\ProjectName_%date:~10,4%_%date:~4,2%_%date:~7,2%
set VisualStudioPath=C:\Program Files\Microsoft Visual Studio 2010\Common7\IDE
set BeyondComparePath=C:\Program Files\Beyond Compare 3
set WinRarPath=C:\Program Files\WinRAR\Winrar.exe

The first command in this block of code sets the drive that the build scripts and work in progress folder for the build will be located in; in our case, this is the G drive. You will need to adjust this to the drive that your temporary build folder is located in (as set in BuildWorkPath).

The next four lines set the folder paths that are relevant for the build server. You will need to adjust each of these to point to your own folders.

BuildBotPath should point to the folder that contains the build scripts, the preceding script included. The script will need to call helper scripts and expects these to be located in this folder. Specifying a script path at the beginning of the script makes it easy to create custom versions of the build scripts for different projects on the same server.

BuildSourcePath is the path to where the local repository of the project is located on the hard disk of the build server. In case the build server is also used as a work machine, this should not be the path to the local work directory. The build bot should have its own login and local work directory for the versioning system.

BuildWorkPath is a temporary folder that is only used for copying and compiling the build. After it is done, it will be moved to a final target folder. This folder is usually shared on a network drive. The move to the final folder doesn't happen until the entire build script is done, to avoid users copying down unfinished builds.

BuildTargetPath is the final target folder for the build. This will be autogenerated from the project name and the current date. If you are planning on having more than one build per day, you could consider adding a time stamp to the folder. This autogenerated folder name will make sorting the builds by date very simple.

Lastly, you will need to provide the installation locations for Visual Studio, Beyond Compare 3, and WinRAR in the variables VisualStudioPath, BeyondComparePath, and WinRarPath respectively.
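One caveat with the batch approach: the %date:~10,4% substring tricks depend on the Windows locale's date format. If you later move parts of your build script to Python, as suggested earlier, a locale-independent sketch of the same folder naming could look like this (the paths and project name are placeholders):

```python
from datetime import datetime

def build_target_path(archive_root, project, when=None, with_time=False):
    """Return a date-stamped build folder name such as
    G:\\BuildArchive\\ProjectName_2014_04_28. Unlike the batch
    %date:~10,4% substring trick, strftime does not depend on the
    Windows locale's date format. Set with_time=True if you produce
    more than one build per day."""
    when = when or datetime.now()
    stamp = when.strftime("%Y_%m_%d_%H%M" if with_time else "%Y_%m_%d")
    return f"{archive_root}\\{project}_{stamp}"

print(build_target_path(r"G:\BuildArchive", "ProjectName",
                        datetime(2014, 4, 28)))
# -> G:\BuildArchive\ProjectName_2014_04_28
```

As in the batch version, the year-first ordering keeps the archive folder sorted chronologically by name.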

Getting the latest files from your version control

Now, it's time to update the code and assets from Perforce as follows:

echo Retrieving the latest from the version control
p4 sync -q //PROJECTNAME/...#head
echo Done.

In this part of the script, you will need to replace everything printed in CAPITAL letters with your own version control data. The //PROJECTNAME/ term in the p4 sync line is the depot path of the repository that you want to retrieve. If you have chosen SVN or Git as your version control software, you will need to replace the appropriate calls.

If you are using your PC as a workstation and build server simultaneously and have more than one workspace mapped to the same machine, you might need to change the line to the following, since the environment variables will probably not be set to the build server's user and workspace. This also applies if you don't have a password set up for your build bot user:

p4 -c WORKSPACE -u USERNAME sync -q //PROJECTNAME/...#head
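Note that a plain batch file continues past a failed command unless you check %ERRORLEVEL% after each call. If you build your script in Python instead, a small helper can make every step fail fast; the helper name is our own, and the p4 call in the comment is the same one shown above:

```python
import subprocess
import sys

def run_step(description, cmd):
    """Run one build step and abort the whole build if it fails.
    Failing fast avoids producing a half-broken build that team
    members might copy down the next morning."""
    print(f"--- {description} ---")
    result = subprocess.run(cmd)
    if result.returncode != 0:
        sys.exit(f"Step failed ({result.returncode}): {description}")
    return result.returncode

# Hypothetical usage -- replace with your real version control call:
# run_step("Retrieving the latest from version control",
#          ["p4", "-c", "WORKSPACE", "-u", "USERNAME",
#           "sync", "-q", "//PROJECTNAME/...#head"])
```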

Now it is time to copy all the relevant data into the temporary work folder and compile it. This folder might or might not exist yet. If it does, it needs to be cleared first so that no old data mixes with the new clean build as follows:

if exist "%BuildWorkPath%" (
echo Clearing out temporary Work Folder
echo %BuildWorkPath%
rmdir /S /Q "%BuildWorkPath%"
)

Next, the data can be copied from the local repository folder to the work folder as follows:

echo Copy build relevant data
echo to %BuildWorkPath%
mkdir "%BuildWorkPath%"
cd %BuildBotPath%
"%BeyondComparePath%\BCompare.exe" "@CopyBuildScript.txt" /closescript

This bit of the script calls upon Beyond Compare, a folder diff tool, to copy the build into the work folder. This is done because not all of the files from the repository are required for the build. Source assets, for example, Photoshop, 3ds Max, or ZBrush files need to be removed. These files are commonly rather large and would unnecessarily bloat up the build size. Also, the build server's task is to create release candidates, and source assets are usually not shipped.

Beyond Compare 3 can filter out all files that are not desired in the final released build. The script bundle available for download with this book includes a script that filters out the most common source asset types. The script is called CopyBuildScript.txt and is called upon in the preceding script block.

If Beyond Compare 3 is installed in a different location on your computer, you will need to adjust the path to it in the script. If you are using a different folder diff tool, you can replace this part completely with the appropriate calls to your tool.

Another function the script performs is to remove the read-only flag on all copied files. Many version control systems such as Perforce set a read-only flag on all files in the local repository unless they are checked out, and the script takes care of this. The files need to be writable so that build script can compile, pack, and delete files later.
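If your diff tool does not handle the read-only flags for you, the same cleanup can be sketched in a few lines of Python (a standalone illustration, not part of the book's script bundle):

```python
import os
import stat

def clear_readonly(root):
    """Recursively remove the read-only attribute from every file under
    root, so that later build steps can compile, pack, and delete them.
    Perforce marks workspace files read-only unless they are checked
    out, which is why a fresh copy of the repository needs this pass.
    Returns the number of files that were changed."""
    cleared = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            mode = os.stat(path).st_mode
            if not mode & stat.S_IWRITE:
                os.chmod(path, mode | stat.S_IWRITE)
                cleared += 1
    return cleared
```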

Compiling the code

Now the code can finally be compiled as follows:

mkdir "%BuildWorkPath%\Logfiles"
cd %BuildWorkPath%\Code\Solutions

echo Compiling 64 Bit
"%VisualStudioPath%\devenv.com" CryEngine.sln /rebuild "Profile|x64" > "%BuildWorkPath%\Logfiles\Log_64Bit.txt"
echo Compiling 32 Bit
"%VisualStudioPath%\devenv.com" CryEngine.sln /rebuild "Profile|Win32" > "%BuildWorkPath%\Logfiles\Log_32Bit.txt"
echo Done.

The first half of this script block creates a subfolder within the build folder to store the logfiles in. This is optional but can be very useful if there are any compilation errors for either code or assets. It is a good first place to start looking in when a build fails.

Next, the command-line version of the Visual Studio compiler is called upon to compile first the 64-bit and then the 32-bit version of the project.

The path to the Visual Studio installation was set to the default installation location at the beginning of the script. If your Visual Studio is installed in a different directory, you will need to adjust that path.

You will also need to replace the CryEngine.sln filename with your own solution filename should you change it.

Logfiles will be saved to the Logfiles subfolder. The FreeSDK release solution file is usually named differently than the full source release.


At the time of writing this book, the default solution file for the CryENGINE release requires the Visual Studio 2010 compilers. Future releases of CryENGINE might require Visual Studio 2012 or up. You will need to adjust the folder path in this case.

Compiling the code will create temporary files, such as *.pdb files containing debug information. These files need to be removed, as they should not be part of a release candidate: they would increase the build size and could potentially be used to reverse engineer your code.

The following part of our build script will remove these files and folders:

echo Removing temporary files from Code Build
cd %BuildWorkPath%\Bin32
del *.lib
del *.pdb
del *.exp
cd %BuildWorkPath%\Bin64
del *.lib
del *.pdb
del *.exp
cd %BuildWorkPath%
rmdir /S /Q BinTemp

These instructions are for a full source build of CryENGINE. If only the game code is compiled, the lines concerning CryAction can be removed.


Instead of deleting the map and pdb files, you can also choose to archive them somewhere on your server. This will allow you to track down crashes that are reported from your end users, if they submit you a callstack.
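A hedged sketch of the archiving variant mentioned in the tip above, in Python (the function name and layout are our own; it zips the symbol files from one binary folder and removes them from the build):

```python
import os
import zipfile

def archive_debug_symbols(bin_dir, archive_path, extensions=(".pdb", ".map")):
    """Move debug-symbol files out of the build and into a zip archive
    instead of deleting them, so that end-user crash callstacks can
    still be resolved later. Returns the number of files archived."""
    count = 0
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for name in sorted(os.listdir(bin_dir)):
            if name.lower().endswith(extensions):
                full = os.path.join(bin_dir, name)
                zf.write(full, arcname=name)  # keep the symbols
                os.remove(full)               # but not in the build
                count += 1
    return count
```

Archiving the symbols per build (for example into the dated BuildTargetPath folder) keeps every release candidate debuggable without shipping the .pdb files.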

Compiling the assets

The code is compiled now, so it is time to take care of the assets as follows:

echo Compile Assets
cd %BuildWorkPath%
echo Compiling Objects...
.\Bin32\RC\rc.exe .\Game\Objects\* /p=PC /ext_dds /ext_cba /recursive /threads=cores /processes=cores > ".\Logfiles\AssetCompilationLog_Objects.txt"
echo Compiling Libs...
.\Bin32\RC\rc.exe .\Game\Libs\* /p=PC /ext_dds /ext_cba /recursive /threads=cores /processes=cores > ".\Logfiles\AssetCompilationLog_Libs.txt"
echo Compiling Textures...
.\Bin32\RC\rc.exe .\Game\Textures\* /p=PC /ext_dds /ext_cba /recursive /threads=cores /processes=cores > ".\Logfiles\AssetCompilationLog_Textures.txt"
echo Compiling Materials...
.\Bin32\RC\rc.exe .\Game\Materials\* /p=PC /ext_dds /ext_cba /recursive /threads=cores /processes=cores > ".\Logfiles\AssetCompilationLog_Materials.txt"
echo Compiling Levels...
.\Bin32\RC\rc.exe .\Game\Levels\* /p=PC /ext_dds /ext_cba /recursive /threads=cores /processes=cores > ".\Logfiles\AssetCompilationLog_Levels.txt"
echo Done.

The preceding lines make several calls to the resource compiler, which is located by default in the RC folder under Bin32 and ships with every version of CryENGINE, to compile the assets. The resource compiler's output is piped into a separate text file inside the Logfiles folder. Asset compilation will run over all data files (geometry, images, and so on) and process them. For images, this means compiling and converting them into platform-specific, optimized .DDS files.

The XML file that contains the job description is provided with the CryENGINE build. However, if you have changed your default game folder from GameSDK to a folder of a different name, you will need to open the RCJob_Build_SDK_no_scripts.xml file in a text editor and adjust the folder name in the default properties at the top of the file.

This step in the build process usually takes the longest. Using a build server with multiple CPU cores can significantly speed up the process. While giving a definitive speed advantage, multiple threads will create a less detailed log output. To get a log entry for every asset that was processed and error codes, remove the parameters /threads=cores /processes=cores from the call in the preceding script.


To get a list of all the console parameters for the resource compiler in a text file, type .\Bin32\RC\rc.exe /help > RCCommands.txt in a console in the build's root folder. This creates a file called RCCommands.txt in the root, containing documentation for each parameter.

For projects that create automated builds several times a day, it is sensible to create a separate build script that will only compile the code and copy the already compiled assets from the last build into it, for a faster turnaround time.
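The asset-copying half of such a faster script could be sketched in Python. The PAK names below match those created later in the full build script; the function itself is our own illustration, not part of the official build scripts:

```python
import shutil
from pathlib import Path

# PAK files produced by the full build script; copying them over is much
# faster than recompiling all assets for every code-only build.
PAK_NAMES = ["Objects.pak", "Animations.pak", "GameData.pak",
             "Sounds.pak", "Textures.pak"]

def copy_compiled_assets(last_build: Path, new_build: Path) -> list:
    """Copy the already-compiled asset PAKs from the most recent build
    into the new build folder, returning the names that were copied."""
    copied = []
    src_game = last_build / "Game"
    dst_game = new_build / "Game"
    dst_game.mkdir(parents=True, exist_ok=True)
    for name in PAK_NAMES:
        src = src_game / name
        if src.is_file():
            shutil.copy2(src, dst_game / name)
            copied.append(name)
    return copied
```

A code-only build script would then run only the compilation steps and call a helper like this instead of the asset-compilation and packing stages.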

After all assets are processed by the resource compiler, the now redundant .TIF files can be removed as follows:

echo Remove redundant tif files
cd %BuildWorkPath%\Game
copy "%BuildBotPath%\" .\.

After processing the assets, the resource compiler will have created a .DDS file for every .TIF file it could find in the directories. The .TIF files are considered source files and should not be shipped to end users; they are also usually rather large and would unnecessarily bloat the build. The Python script traverses the game folder and its subfolders and removes every .TIF file that has a corresponding .DDS file. The script is included in the script bundle available for download and requires Python to run.
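The bundled script's exact contents are not reproduced here, but its logic could be sketched as follows (the function name and return value are our own):

```python
from pathlib import Path

def remove_redundant_tifs(game_folder: str) -> int:
    """Delete every .tif file that has a compiled .dds next to it.

    The .tif sources are only needed by the resource compiler; once a
    .dds exists, shipping the .tif would just bloat the build. Returns
    the number of files removed.
    """
    removed = 0
    for tif in Path(game_folder).rglob("*.tif"):
        if tif.with_suffix(".dds").exists():
            tif.unlink()
            removed += 1
    return removed
```

Note that a .tif without a matching .dds is deliberately left alone, since it was apparently never compiled and deleting it would lose data.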

After this clean-up step, the assets need to be compressed into PAK files:

echo creating PAK files and deleting original folders
cd %BuildWorkPath%\Game

echo Creating Objects.pak
"%WinRarPath%" a -afzip -r Objects.zip .\Objects
rename Objects.zip Objects.pak
rmdir /S /Q Objects

echo Creating Animations.pak
"%WinRarPath%" a -afzip -r Animations.zip .\Animations
rename Animations.zip Animations.pak
rmdir /S /Q Animations

echo Creating GameData.pak
"%WinRarPath%" a -afzip -r GameData.zip .\Entities .\Libs .\Scripts .\Prefabs .\Fonts
rename GameData.zip GameData.pak
rmdir /S /Q Entities
rmdir /S /Q Libs
rmdir /S /Q Scripts
rmdir /S /Q Prefabs
rmdir /S /Q Fonts

echo Creating Sounds.pak
"%WinRarPath%" a -afzip -r Sounds.zip .\Music .\Sounds .\Languages .\Localized
rename Sounds.zip Sounds.pak
rmdir /S /Q Music
rmdir /S /Q Sounds
rmdir /S /Q Languages
rmdir /S /Q Localized

echo Creating Textures.pak
"%WinRarPath%" a -afzip -r Textures.zip .\Materials .\Textures
rename Textures.zip Textures.pak
rmdir /S /Q Materials
rmdir /S /Q Textures

PAK files are simply ZIP files with a different file extension. Common compression software, such as WinRAR and 7-Zip, can open and create these files. In this example, we use WinRAR to create the PAK files; the -afzip switch forces it to produce ZIP archives, which are then renamed to .pak.

After each PAK file is created, the source folders are deleted with the rmdir commands. This setup will create five different PAK files, combining the various folders into easily shippable containers.


It is possible to let the resource compiler compile the assets and create the PAK files automatically by using a job XML configuration file. Most CryENGINE releases ship with one or more example job XML files inside the RC folder under Bin32.

These example files usually need to be heavily modified before they will work for custom builds. Their names and contents change with each release, so they are not used in this build script example, to prevent version conflicts.

After this has finished, the source code can be either packed or removed as follows:

echo -- Zip the Code folder (and delete) --
cd %BuildWorkPath%
"%WinRarPath%" a -r .\Code
rmdir /S /Q Code

The zipped code should of course not be shipped to end users. However, it can be useful to keep the code that was used for compilation as part of the build inside the internal build archive. It sometimes becomes necessary during production to use an older build, for example, for demonstrations or for bug hunting. In these cases, it can be extremely useful to be able to access the code used for the creation of the build quickly and without version control hassle.

After this last step, the build is finished. It can now be moved to the target location, usually on a shared network drive, as follows:

echo -- Rename Build folder --
rem Step out of the work folder first; Windows cannot move the current directory
cd /D %BuildBotPath%
move "%BuildWorkPath%" "%BuildTargetPath%"

echo Build Finished: %date% %time%
echo ====================
echo Build Done.

The build script developed in this chapter is by no means the single and only way to create CryENGINE builds. This should serve as base line to get you started. During the development of your project, you will most likely want to extend and customize the build scripts to better suit your needs.

Wrapping it up

The preceding build script outputs status messages to the console it was started from. For builds started manually, this works fine. If the console output should be collected as well, a simple wrapper script can call the build script and pipe its entire output into a logfile.

The following code snippet is an example of such a wrapper script. It is included in the scripts bundle and the file is called BuildWithLogFile.bat:

This script will create a logfile called BuildLog.txt and move it into the Logfiles folder of the newly created build after it is finished. The usage of such a wrapper script is recommended for automated builds.

@echo off

echo Starting new Build
echo ======================================
echo Logfile will be saved to BuildLog.txt
echo Build Started: %date% %time%

call CreateBuild.bat > "BuildLog.txt"

move "%BuildBotPath%\BuildLog.txt" "%BuildTargetPath%\Logfiles\BuildLog.txt"
echo ======================================
echo Build done.

Scheduling automated builds

To create builds at a fixed time during the day, we need to start the build script automatically at specified times. This can be done with the Windows Task Scheduler. This feature allows you to run certain programs or scripts repeatedly at predefined dates and times.

The task scheduler can be accessed via the Windows Control Panel. Go to System and Security and then find the task scheduler under the section called Administrative Tools.

Create a new task by selecting Create Basic Task from the Actions bar on the right. Enter a name and a brief description for the new task, for example, Daily Overnight Build.

Next, you will need to specify a frequency at which to run the build script. For full-time development projects with a dedicated build server, we recommend creating a build at least daily. For hobby projects or projects with smaller teams, weekly builds might be enough. Creating builds too frequently, with very few changes between them, will only waste hard disk space, so choose according to your project's needs.


Choosing the weekly frequency gives you the most customization options. After selecting this setting, you can choose the individual days of the week on which the task should run. For hobby and mod teams working mainly on the weekend, daily builds from Friday through Monday might not make much sense. Development teams working full-time will probably want one build daily during the work week, but none on the weekends.

The task creation wizard seen in the following screenshot makes it easy to schedule automatic builds:

In many development studios, the build server will also host the version control system, internal documentation, and bug tracking services and potentially serve as a network drive.

To prevent the automated builds from causing a performance hit on the server during work hours, it is sensible to schedule the task at a time when it is most likely that no one needs the server for other purposes, such as during the night or the early morning hours.

After the frequency has been set, we can define what action should be performed when the task is executed. Choose Start a Program from the list and click on Next.

The next page of the wizard will ask you to provide the program or script that should be run. Use the Browse button to select the BuildWithLogFile.bat script. Also, set the path that contains the build scripts in the edit field labeled Start in.

Hit Next and Finish to finalize the process. If you want to test your task, you can either set the frequency to a one-time run a few minutes from the current time, or manually select the task from the Task Scheduler Library and then Run from the right-click menu.

A console window should open showing the log output from the build script.

The automated task will create clean builds of your project at the set frequency for as long as you keep the task active in the task scheduler. The builds will be collected in the build archive folder you specified in the script. From time to time, you might want to go through that folder and delete older builds or archive them to another storage location.


Don't delete all of your old builds, as they can be extremely handy at the end of development when tracking down persistent and hard-to-find bugs. Keep at least one build from every month of development to help you narrow down the timeframe in which a certain bug first appeared. This makes finding the culprit changelist a lot faster.
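The keep-one-per-month rule lends itself to automation. The following sketch assumes build folders are named with an ISO date prefix such as 2014-03-28_Build; that naming convention is our own assumption, not something the build script enforces:

```python
def builds_to_delete(build_names: list) -> list:
    """Given build folder names starting with YYYY-MM-DD, return the
    builds that can be deleted while keeping the newest build of each
    month as an archive for later bug hunting."""
    keep = {}
    for name in sorted(build_names):  # ISO dates sort chronologically
        month = name[:7]              # "YYYY-MM"
        keep[month] = name            # later builds overwrite earlier ones
    keepers = set(keep.values())
    return [n for n in build_names if n not in keepers]
```

The actual deletion (or archiving to another storage location) would then be a separate step that walks over the returned list.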


Automated performance tests

Besides creating the actual CryENGINE builds on a regular schedule, it can be quite useful to create automatic performance benchmarks.

CryENGINE is a high performance 3D engine, which is able to render beautiful scenes at a high frame rate. However, there is always a chance that certain changes or additions made to your game project will affect the performance negatively. A line of badly written code or a few un-optimized assets might lower your frame rate considerably.

Being able to catch those performance issues early on is very important. The sooner a performance issue can be identified, the sooner it can be fixed.

Automated performance tests can help you spot those issues as soon as they appear.

Performance file generated using the savelevelstats command

Using level statistics to profile the performance of a level

CryENGINE has the ability to generate so-called level statistics files, which contain a lot of information about the performance situation of the level. By reviewing and comparing those files, it is easy to identify any existing performance problems. The level statistics file for a level can be generated manually using the savelevelstats console command. Let's give it a try and generate a level statistics file for your level.

  1. Open the level you want to profile in Sandbox.

  2. Type savelevelstats into the console.

Depending on your computer and the size of your level, it might take several minutes to generate the level statistics file.

The savelevelstats command collects detailed information about every object in your level, so the higher the object count, the longer the process will take.

Once the statistics file has been generated, let's open it up and have a look at the details.

  1. Go to the …/TestResults folder.

  2. Open levelname.xml.

You can see that the statistics files have been generated as .xml files. You will need MS Excel to view them; if you do not have access to MS Excel, you can download Microsoft's free Excel viewer instead.

Now, let's have a look at what has been generated. Take a moment and browse through the information contained in the files. You will see that there is a lot of relevant information contained in the documents. Everything from the overall loading time of your level up to the polygon count of individual assets can be found. You can use all this information to create a comprehensive picture of the performance situation of your build.

A more detailed description of the individual entries in the level statistics file can be found in the official CryENGINE documentation.

You might have noticed that the level statistics files we just generated, although very comprehensive, only show a snapshot of the performance situation of our level. They provide performance data from just the point in time at which they were created.

Only collecting performance data over a longer period of time will allow you to see certain trends and developments.

Once your build scripts are set up to generate the level statistics automatically, you could also combine the gathered information in one easy-to-read document.
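Such a combining step could look like the following sketch. The XML layout of the statistics file varies between engine versions, so the element and attribute names passed in are placeholders you would adapt after inspecting your own levelname.xml:

```python
import xml.etree.ElementTree as ET
from pathlib import Path

def collect_stat(stats_files: list, tag: str, attr: str) -> dict:
    """Pull one attribute (for example a load-time figure) out of each
    level statistics XML file and map it to the file it came from.

    tag/attr are placeholders -- check which elements and attributes
    your engine version actually writes before using them.
    """
    results = {}
    for path in stats_files:
        root = ET.parse(path).getroot()
        node = root.find(tag)
        if node is not None and attr in node.attrib:
            results[Path(path).name] = node.attrib[attr]
    return results
```

Running this over the statistics files of every nightly build and writing the result to a single CSV would give you exactly the kind of trend document described above.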


Build integration

One of the bigger challenges when working with the CryENGINE codebase is integrating new versions of the CryENGINE SDK. Crytek's engine team is constantly changing and updating the engine. New features get added while others get optimized or updated. New versions of the SDK are released multiple times per year.

Although there is no need to integrate every new version into your project's codebase, you will usually want to upgrade sooner or later, since the updates contain many new features and optimizations that will benefit your game.

Depending on the size and scope of your project, integrating new versions of the SDK can be anything from easy to very difficult.

In the next section, we will discuss the best way to integrate new versions of the CryENGINE SDK.

Integrating a new version of CryENGINE

When incorporating updates into your build, you will need to merge the changes made by Crytek with the changes made by your development team. This process is called integration. Doing this manually is no small task even in a small production environment, and becomes nearly impossible in larger teams due to the sheer number of changes made, especially to the source code.

A version control system can make this process a lot easier, as it comes equipped with integration and merging tools. If versioning was used from the start of a project, the software will have tracked a history of all changes and know how to merge them together. We will briefly describe how to set up your repository to make integration of a new CryENGINE release as easy as possible.

The engine depot

When starting with a new project, the first repository you should create is one for the unmodified CryENGINE version, the so-called vanilla build. Unless you are planning to use the assets shipped with the SDK, this repository should only contain the Entities, Scripts, and Libs folders from the release.

Do not check in any of your project changes into this repository. This depot will only serve to track the changes between the individual releases of CryENGINE.


If you are not working with a full source license, you will be using most of the DLL files provided with the build release. In this case, it is recommended to check in all DLL files except the GameDLL file into the versioning system as well.

Project branch

Create a new branch from the engine depot that you created. The steps to create a new branch are different in every version control system. In Perforce, you can create new branches directly via the GUI. Right-click on the depot folder containing the engine and choose Branch Files.... Then specify a name for the new target branch and confirm by clicking on Branch. You can now start making your project-specific changes and also check in your project's assets.


If you have already started on a project and don't have a version history, you can still follow this workflow. Check in the vanilla build and create a project branch as described. Then delete the contents of the newly created local project folder and copy your current project into the same folder. Then choose Reconcile Offline Work in the Perforce GUI to create the version history.


When a new version of CryENGINE is released, you will need to perform two steps. First you need to update the engine branch in your depot and then you need to integrate and merge those changes into your project's branch.

The cleanest way to update your engine branch is to delete the local folder and replace it with the new release, then let the version control software check for differences. The exact steps differ between systems; in Perforce, you can right-click on the engine depot folder in the GUI and choose Reconcile Offline Work. This will generate a changelist with the engine update.

Once this changelist is checked in, you need to integrate it into your project's repository. With Perforce, this is done by right-clicking on the engine branch folder and choosing Merge/Integrate. Choose your project's depot as the target folder and select Merge. This will generate a new change list with the relevant changes. Right-click on the changelist and select Resolve Files to merge the changes between your project and the engine update.

Before you can check in this changelist, you will need to resolve any conflicts. A conflict happens when the version control software doesn't know how to merge a file that has been changed by both your project team and the CryENGINE update; this can happen, for example, when the same line of code in a source file has been changed on both sides. In this case, you need to merge the file manually. If there are any conflicts in your changelist, a message window will prompt you to resolve them after the preceding step.


Always compile the entire build and test it before you check in an engine update.


Quality assurance processes

Having quality assurance (QA) procedures in place for your game project is important and will help to keep an overview of the problems and issues in your game. No matter how thorough your work is and how prepared you are, there will always be bugs. The quality assurance part of your production pipeline is not supposed to make sure there are no bugs, but rather help you decide how to manage the bugs you will encounter in the course of your project.

The QA workflow of your production pipeline should be tailored to the size of your project. A small two-man team working on a CryENGINE mod will require a different approach than a 100-man team working on an AAA title. Depending on your team and project size, your solution could be anything from simply writing bugs down on a piece of paper to entering bugs into an online bug tracking system.

QA pipeline in larger scale teams

First, let's have a look at how professional large scale teams set up their QA pipeline. In those types of production environments, found in most AAA studios today, a bug will go through many stages before it is finally fixed.

QA pipeline in larger scale teams

You can see that this process is rather complex, involving many stages and possibly several people.

A bug, also commonly called an issue, gets identified, confirmed, fixed, and then finally closed. Usually a bug-tracking software such as JIRA or Bugzilla is used to manage the issues.

The benefit of this process, often employed by larger teams, is that it is very thorough. Working this way ensures that almost every bug gets caught, documented, and fixed.

This is a good approach if you work with a development team of a larger size and need to handle larger amounts of bugs. The bug tracking system you decide to use will depend on your specific needs and your available budget.

QA pipeline in smaller teams

If you are working in a smaller team, maybe with just one or two other people, your QA approach will be a bit different. It might be sufficient for you to just write down the issue in an issue tracking system and then revisit and fix the bug at a later time.

QA pipeline in smaller teams

Working without a QA pipeline

If you decide to work on your project without setting up any QA pipeline at all, you will sooner or later run into problems, since even small projects can produce a lot of bugs and issues.

Simply saying "I'll remember this bug and will fix it later" might sound easy but is not practical. No matter how thorough you are, if you do not track your issues properly, you will eventually end up shipping your game with them.


Understanding issue tracking in CryENGINE

When you create reports about new issues and bugs for your project, it is important to include as much relevant information as possible. This will make life a lot easier for the person who has to fix the issue.

A badly written bug report can cause confusion and cost additional time. When you enter new bug reports into your bug tracking system, make sure you include all the information that is relevant in order to identify and fix the issue.

Relevant information for a CryENGINE issue includes the following:

  • CryENGINE build and version number

  • System specifications (hardware, DirectX version, and so on)

  • The crash log and call stack if available

  • The level name related to the problem (if applicable)

In addition to this, the issue should also be described as accurately as possible. The more information included, the easier it will be to fix the issue.
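The checklist above can even be enforced with a small helper that validates a report before it is submitted. This is a sketch; the field names are our own and not tied to JIRA, Bugzilla, or any particular tracker:

```python
# Required fields, mirroring the checklist for a CryENGINE bug report.
REQUIRED_FIELDS = [
    "engine_version",   # CryENGINE build and version number
    "system_specs",     # hardware, DirectX version, and so on
    "description",      # accurate description and reproduction steps
]
# Only relevant for some issues, so not enforced.
OPTIONAL_FIELDS = ["call_stack", "level_name"]

def missing_fields(report: dict) -> list:
    """Return the required fields that are absent or empty in a report."""
    return [f for f in REQUIRED_FIELDS if not report.get(f)]
```

A submission form or import script could refuse any report for which missing_fields() returns a non-empty list, catching incomplete bug reports before they waste a developer's time.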



Summary

In this chapter, we focused on the different aspects of the CryENGINE production pipeline. You learned about the importance of planning and setting up essential elements of your pipeline, including version control systems, automated builds, and quality assurance processes.

Following the guidelines and techniques detailed in this chapter will make it a lot easier for you to set up and maintain your project. You will be ready to scale your project with your team size, integrate engine updates, and stay ahead of any potential problems.

Remember that a stable and robust production pipeline is one of the most underestimated, yet most important components of your project.

Now, that all this preparation work regarding your pipeline is taken care of, things will get a bit more hands-on in the next chapter. You will get an introduction to the CryENGINE input system and learn how to modify and set up your game controls and input methods.

About the Authors

  • Sascha Gundlach

    Sascha Gundlach has been working in the games industry for over a decade and started his career as a script programmer in a small game studio in the early 2000s. He worked for Crytek for eight years, working on games such as Crysis, Crysis: Warhead, and Crysis 2.

    He is a CryENGINE expert and has provided countless training sessions and individual training to CryENGINE licensees in the past years.

    In 2013, he founded his own game development company, MetalPop Games, together with his partner and Crytek veteran Michelle K. Martin in Orlando, Florida.

    He spends his days working on video game projects and provides consulting work for other game projects.

    Browse publications by this author
  • Michelle K. Martin

    Michelle K. Martin is a software engineer in the game industry, specializing in animation systems. She started her career with the German developer, Crytek, working on projects such as Crysis and Crysis 2. During her career, Michelle has helped develop and improve CryENGINE's animation system with several features. Being an expert in CryENGINE, she has provided a lot of support and training to CryENGINE licensees over the years, helping their team to get the most out of the engine.

    In 2013, she founded MetalPop Games together with her partner and Crytek veteran Sascha Gundlach. It is an indie game development studio and they are currently working on their first title.

    When she's not in front of the computer programming, she is most likely to be in front of the computer playing games.

More about Sascha and Michelle's company, MetalPop Games, can be found on the company's website.

    Browse publications by this author