Before the actual work on any new project can begin, you as a developer have to think about your production pipeline. Time spent on designing a robust pipeline is always time well invested. The larger the project ahead of you, the more important it is to set up a stable pipeline. In this chapter, we will discuss the following topics:
Production pipeline setup for CryENGINE projects
Using version control in CryENGINE projects
Setting up automated builds and build scripts
Integration of CryENGINE builds and versions
The goal of this chapter is to provide you with information and best practices on building a stable and flexible CryENGINE production pipeline.
In simple terms, a CryENGINE production pipeline can be described as the series of operations you perform with the engine in order to create your product. Exporting a 3D asset or compiling C++ code, for example, are parts of a typical CryENGINE pipeline. Your production pipeline also defines how, and to what standards, you perform all project-related tasks, such as the following:
Exporting a 3D asset
Creating automatic builds
Processing bug reports
Checking files into your version control system
A pipeline is basically a number of rules and guidelines you set for yourself and your team to work on your project. If those rules make sense and fit your project, your life will become a lot easier. If they do not fit your project or if they do not exist yet, you will have a higher chance of running into all kinds of problems.
No matter what CryENGINE project you are about to start, you will have weeks or possibly months of work ahead of you. Being as prepared as you can for this should be your goal. Planning and preparing your pipeline will help you save time and work more efficiently during the lifespan of your whole project.
A well thought out pipeline, which is standardized and enforced within your team, will increase production speed significantly. In this chapter, we will discuss the most important aspects of a CryENGINE production pipeline.
Version control is incredibly useful in any area of game production. What a VCS basically does is keep track of changes made to your files and allows you to review those changes and even helps you revert to the older versions of your files.
This means if you make a mistake or delete or lose a file, it is generally very easy to recover whatever you have lost. In addition to this, using a VCS makes it a lot easier to collaborate since you will always have an overview of what changes your team members made to the project files.
Even today, when VCSs such as SVN, Perforce, or Git have become very affordable or even free, there are still teams out there working without the safety net of a version control system.
One method often used by less experienced mod teams is to work out of a shared folder that is accessible to everybody on the network. While it might seem simple and easy to work this way, with everybody just copying their files into the shared folder, a lot of things can go wrong, such as the following:
Files can get overwritten accidentally
Code and script conflicts cannot be caught and resolved easily
Tracing back older changes becomes extremely difficult
With low cost version control systems being widely available today, there is no reason even for small teams to work this way. Setting up and maintaining a version control solution will of course consume a certain amount of time, but it is always time well invested.
Let's have a look at a real-life example. You are working on a CryENGINE game project and you discover a game breaking bug. Let's say someone on your team submitted something which broke the game. Now it is up to you to identify and fix the issue. Having no access to either the file history or changes done to the individual files will make it very difficult for you to solve the issue. However, in a project environment with a VCS setup, you could simply step backwards through the submitted changes to identify the file which was responsible for breaking your game.
You will have to choose between a centralized and distributed VCS. While a centralized VCS keeps all files on a central server, a distributed VCS mirrors the whole repository on each client. Both systems come with different upsides and downsides, but for CryENGINE, it makes no difference which type of system is used.
There are many VCSs available today, and they come in many flavors. The VCSs most commonly used for CryENGINE projects are as follows:
Perforce: This is sometimes also called P4 and is a professional centralized VCS, which is usually the tool of choice for professional and larger game teams. Perforce licenses are generally not free, but there are various license options that allow indie and mod teams to make use of the software without spending much money. CryENGINE has native support for Perforce and allows you to check files in/out directly from Sandbox.
Git: This is a free-to-use, open source VCS. It differs from Perforce and SVN by using a distributed architecture. Compared directly to Perforce and SVN, Git can be quite difficult to use, especially for developers from a nontechnical background.
Once you have made your decision regarding which version control system to use for your project, it is time to set up your CryENGINE environment. Depending on your role in your game's production, certain aspects of this setup might be more or less interesting to you. For example, if you are a programmer, you might be less interested in learning about setting up Photoshop or 3ds Max and can skip ahead to the relevant coding topics.
Support for Perforce version control is integrated into the Sandbox editor. Sandbox will automatically check out the corresponding files when they are being modified.
When using SVN, Git, or any other system, files cannot be directly checked in/out from Sandbox. In this case, no further setup is necessary.
This will tell Sandbox to use version control for file operations. CryENGINE gets the Perforce server information from the Windows environment settings, so you don't need to set them up manually. To review these environment settings in Perforce, click on Environment Settings in the Connection menu, as seen in the following screenshot:
You can change the Server and Workspace settings here as seen in the following screenshot:
This is CryENGINE telling you that the level you are trying to save is write protected and probably version controlled. At this point, you have the choice to either overwrite your level files or check them out from Perforce.
While those tools are not directly connected to CryENGINE, it still makes sense to set them up to connect to the same version control system that is storing your code and level files.
There are a lot of plugins that exist for the various VCSs, which will allow you to check out files directly from Photoshop, Max, Maya, or XSI.
Perforce, for example, offers quite a comprehensive free set of plugins for all major tools on their website called Perforce Graphical Tools Plug-in (P4GT). This is basically a big collection of plugins for all kinds of tools and systems. You can find it at http://www.perforce.com/product/components/perforce-plugin-graphical-tools.
There is a Perforce add-on for all recent Visual Studio versions available. This plugin will automatically check out source files when modified, and add newly created files to the repository. This can save a lot of time when working with the source code.
The plugin can be installed from within the GUI of Visual Studio. To do so, open the Extension Manager option from the Tools menu. The plugin is called P4VS. The fastest way to find it is to search for the term P4 in the online gallery.
After installation, you have to activate the plugin in the Visual Studio options. In the menu Tools, choose Options. Then navigate to the section called Source Control. In the subsection Plug-in Selection, open the drop-down box and select P4VS. Then you can configure your connection settings.
When editing text files, such as XML or Lua files, you will likely use a text editor such as Notepad++ or Microsoft XML Notepad 2007. We recommend Notepad++, as it has syntax highlighting and a range of helpful plugins, such as an XML syntax checker. There is also a Perforce plugin available for it. When installed, this plugin will automatically check out files when you edit them and add them to the default changelist in Perforce.
The plugin can be installed through the Plugin Manager in Notepad++. It can be opened via the Plugins menu. Choose the plugin called Perforce actions from the list of available plugins and select Install.
Just because you are making use of a VCS does not mean you need to add every single file of your CryENGINE project to the depot. Checking in files that should not be version controlled can actually be counterproductive to your workflow. It is important to know which types of files to keep out of your VCS. Generally, all files created during the build process should never be checked in.
To keep your build clean, you should avoid checking files into your VCS that are automatically generated during the build process.
The files which are usually never checked into a VCS are as follows:
.DDS files: These are generated automatically by the CryENGINE resource compiler. You should not create .DDS files manually and check them into your VCS. The engine will take any existing .TIF file found in your project and compile an optimized .DDS texture, which will be used during runtime. When those .DDS files are locked and write protected by a VCS, they cannot be overwritten by the resource compiler. When this happens, your build will keep using the old .DDS files, which can cause rendering problems and graphical artifacts.
Binaries: Usually, CryENGINE executable files such as Editor.exe and the various CryENGINE .DLL files are not checked into your VCS, since they are regularly built by your project's coders locally on their machines. While all source code files should always be checked in, these executables will normally be rebuilt every day and do not need to be checked in.
.GFX files: When Scaleform is being used in production, .GFX files will be created by the gfx exporter. These files should not be checked in but ought to be generated automatically during the build process. The gfx exporter might generate .DDS files as part of the .GFX export process for certain files, hence this step should also be done as part of the automated build process.
Having a system in place to create CryENGINE project builds automatically on a regular schedule can be very useful. Having the ability to switch back and forth between older and newer builds can be vital when hunting down difficult-to-fix bugs and other problems.
Although CryENGINE builds can be created manually and then copied to a storage location, it is much easier to set up a system to automate these tasks.
Professional game development teams often have a procedure in place to create so-called nightly builds. These are builds created by a build server every night and distributed to the team the next morning. Level designers, artists, and other developers who do not directly work with C++ code or scripts are very comfortable working with nightly builds.
Those team members can just copy a fresh build every morning and can rely on the fact that everything has been properly compiled with all the latest code changes, and all
.DDS files are properly created by the resource compiler. These automated builds can also serve as release candidates for your project and be used for QA testing.
In order to create automatic builds, you need to set up and configure a build server first. Your build server can be anything from a regular PC to a specifically designated workstation. Unless you are working on a really large scale project, any regular PC will do the job. You do not need to buy expensive server hardware if you are just intending to make a couple of builds per day.
In larger production environments, where one build server has to handle builds for multiple large projects, stronger hardware will be needed, while for a single project with only one or two automatically generated builds per day, a normal desktop PC will do the job. Build servers for larger teams usually compile the code several times a day in order to catch bad check-ins as fast as possible. The more coders work on the codebase, the more frequently the server should run the auto compilation. Professional teams of larger size will usually have dedicated personnel, called build engineers, responsible for setting up the build servers and creating and maintaining build scripts.
Your build server should not be the same PC you are working on, since the process of compiling a build will of course slow your machine down considerably. Using a dedicated build server rather than a local workstation also eliminates the risk of local changes ending up in an automated build.
Since the build process requires you to run the CryENGINE resource compiler as part of the asset compilation process, the build server should be running a version of Microsoft Windows. The CryENGINE resource compiler currently runs only on Microsoft Windows operating systems. Using Microsoft Windows also has the advantage that you can use the Windows scheduler to have your build scripts run automatically every night.
Once you have your build server hardware set up, it is time to create a set of build scripts, which will take care of automatically producing builds for you. Your goal will be to create a build script that takes all the latest changes made to your project by all team members and compiles a new and clean build from them. Your build script should perform the following tasks:
Gathering all the latest C++ code and Lua scripts
Compiling .DDS files for all existing .TIF textures
Distributing the completed build to a central network location
Depending on your project, there could be several optional tasks you might want your build scripts to perform, such as the following:
Performing automated performance tests
Creating automated level benchmarks
Exporting all your game's levels
Uploading an archived version of the build to an FTP location
Creating change logs containing all changes done
Now it is time to get your hands dirty and create your own build script. But don't worry, you won't have to write it all on your own, since you will be provided with a sample build script, which you can use as a base and modify to suit your needs. There are a variety of scripting languages you could use to create your build scripts. In the following example, we are using simple batch files. Of course, you are free to use a more sophisticated scripting language, such as Python.
For this example build script, we are using Perforce and some additional software, specifically:
Beyond Compare 3
Visual Studio 2010
Python 3.0 or higher
There is no need to go out and buy all this software if you just want to test the build scripts. You can just use the trial versions of the software and then decide later whether you would like to keep on using them.
Before starting with the actual scripts, the computer that will function as the build server needs to be set up properly. The following steps need to be done:
Install the following software:
Beyond Compare 3
Visual Studio 2010
Python 3.0 or higher
Create a folder for the build scripts.
Create a folder for the builds to be stored.
Download and extract the build script bundle from <webadress>.
Modify the settings inside
CreateFullBuild.batwith your data.
The example build scripts provided in this book require the use of a so-called Diff software. In the examples provided here, we use the software Beyond Compare 3. The software can be downloaded from http://www.scootersoftware.com/download.php.
Of course, different types of software can be used and you will just need to adjust the appropriate portions of the build script. In order to follow the examples provided in this book, the free trial version of Beyond Compare 3 will be sufficient.
This section assumes that you have a physical build server and a VCS set up already. We will be using Perforce in our example build scripts, but you can easily adjust the scripts to use SVN, Git, or any other VCS instead.
Let's have a look at the tasks that the build bot needs to perform:
Getting the latest code and assets from Perforce
Copying relevant data to a work folder
Compiling the code in 32- and 64-bit
Compiling the assets
Packing up the code
Moving the build to the target folder
The actual build script is rather long, so let's break it down into smaller sections and look at each section individually. To make it easier for you, the places in the script that you need to change and replace with your own folder names and project dependent settings will be pointed out in detail.
To write your own script, start by creating a new text file inside your build scripts folder and call it
CreateBuild.bat. Then edit it with a text editing program, such as Notepad++ and copy or type the script lines from this book. To make things easier, a text editor with syntax highlighting should be used.
The purpose of this first part of the script is just to print out some status information as follows:
@echo off
cls
echo.
echo Creating a new full project build
echo ------------------------------------
echo New build started at: %date% %time%
Downloading the example code
You can download the example code files for all Packt books you have purchased from your account at http://www.packtpub.com. If you purchased this book elsewhere, you can visit http://www.packtpub.com/support and register to have the files e-mailed directly to you.
These first six lines are simply clearing the console and printing the current time and date onto the screen (or into a logfile). Nothing is actually happening yet. You can safely delete and modify these lines as you like.
Printing out console output is always helpful as it gives some feedback as to where in the script the process currently is. A full build on a regular PC can take over an hour, depending on the amount of assets that need to be compiled. Knowing the current stage of the build process can be helpful to you.
G:
set BuildBotPath=G:\p4\Build_Server\BuildScripts
set BuildWorkPath=G:\BuildArchive\Build_InProgress
set BuildSourcePath=G:\p4\Build_Server\ProjectName
set BuildTargetPath=G:\BuildArchive\ProjectName_%date:~10,4%_%date:~4,2%_%date:~7,2%
set VisualStudioPath=C:\Program Files\Microsoft Visual Studio 2010\Common7\IDE
set BeyondComparePath=C:\Program Files\Beyond Compare 3\BComp.com
set WinRarPath=C:\Program Files\WinRAR\Winrar.exe
The first command in this block of code sets the drive on which the build scripts and the work-in-progress folder for the build are located; in our case, this is the G drive. You will need to adjust this to the drive your temporary build folder is located on (as set in BuildWorkPath).
The next four lines set the folder paths that are relevant for the build server. You will need to adjust each of these to point to your own folders.
BuildBotPath should point to the folder that contains the build scripts, the preceding script included. The script will need to call helper scripts and expects these to be located in this folder. Specifying a script path at the beginning of the script makes it easy to create custom versions of the build scripts for different projects on the same server.
BuildSourcePath is the path to where the local repository of the project is located on the hard disk of the build server. In case the build server is also used as a work machine, this should not be the path to the local work directory. The build bot should have its own login and local work directory for the versioning system.
BuildWorkPath is a temporary folder that is only used for copying and compiling the build. After it is done, it will be moved to a final target folder. This folder is usually shared on a network drive. The move to the final folder doesn't happen until the entire build script is done, to avoid users copying down unfinished builds.
BuildTargetPath is the final target folder for the build. This will be autogenerated from the project name and the current date. If you are planning on having more than one build per day, you could consider adding a time stamp to the folder. This autogenerated folder name will make sorting the builds by date very simple.
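If you script your build in Python instead, the same dated folder name can be produced without the locale-dependent %date% substring slicing used above. This is only a sketch; the function name and example paths are made up for illustration:

```python
from datetime import datetime

def build_target_path(archive_root, project_name, now=None):
    """Return a dated build folder name, e.g. ...\\ProjectName_2013_07_25.

    Using strftime-style formatting avoids the locale problems of
    slicing the %date% string in a batch file.
    """
    now = now or datetime.now()
    return "{0}\\{1}_{2:%Y_%m_%d}".format(archive_root, project_name, now)

print(build_target_path("G:\\BuildArchive", "ProjectName",
                        datetime(2013, 7, 25)))
# prints G:\BuildArchive\ProjectName_2013_07_25
```

If you plan on more than one build per day, extending the format string with a time stamp (for example, `%H%M`) keeps folder names unique.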
echo.
echo Retrieving the latest from the version control
p4 -c WORKSPACE -p PERFORCE_SERVER:1666 -P PASSWORD -u USERNAME login
p4 sync -q //PROJECTNAME/...#head
echo Done.
In this part of the script, you will need to replace everything printed in CAPITAL letters with your own version control data. The //PROJECTNAME/ term in the fourth line is the depot path of the repository you want to retrieve. If you have chosen SVN or Git as your version control software, you will need to replace the appropriate calls.
If you are using your PC as a work station and build server simultaneously, and have more than one workspace mapped to the same machine, you might need to change the lines to the following, since the environment variables will probably not be set to the build server's user and workspace. This also applies if you don't have a password set up for your build bot user, as follows:
p4 -c WORKSPACE -u USERNAME sync -q //PROJECTNAME/...#head
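If you drive Perforce from a Python build script instead of a batch file, it is safer to build the sync call as an argument list rather than a shell string. This sketch only constructs and prints the command; the workspace, user, and depot names are placeholders, and the actual subprocess.run call is left commented out since it requires an installed p4 client:

```python
import subprocess

def p4_sync_command(workspace, user, depot):
    """Build the argument list for 'p4 sync' against a named workspace.

    Passing a list instead of a shell string avoids quoting problems
    with paths and revision specifiers such as #head.
    """
    return ["p4", "-c", workspace, "-u", user,
            "sync", "-q", "//{0}/...#head".format(depot)]

cmd = p4_sync_command("WORKSPACE", "USERNAME", "PROJECTNAME")
print(" ".join(cmd))
# On a machine with the p4 client installed, you would run:
# subprocess.run(cmd, check=True)
```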
Now it is time to copy all the relevant data into the temporary work folder and compile it. This folder might or might not exist yet. If it does, it needs to be cleared first so that no old data mixes with the new clean build as follows:
if exist "%BuildWorkPath%" (
echo.
echo Clearing out temporary Work Folder
echo %BuildWorkPath%
rmdir /S /Q "%BuildWorkPath%"
)
Next, the data can be copied from the local repository folder to the work folder as follows:
echo.
echo Copy build relevant data
echo to %BuildWorkPath%
mkdir "%BuildWorkPath%"
cd %BuildBotPath%
"%BeyondComparePath%" /closescript "@CopyBuildScript.txt"
echo Done.
This bit of the script calls upon Beyond Compare, a folder diff tool, to copy the build into the work folder. This is done because not all of the files from the repository are required for the build. Source assets, such as Photoshop, 3ds Max, or ZBrush files, need to be removed. These files are commonly rather large and would unnecessarily bloat the build size. Also, the build server's task is to create release candidates, and source assets are usually not shipped.
Beyond Compare 3 can filter out all files that are not desired in the final released build. The script bundle available for download with this book includes a script that filters out the most common source asset types. The script is called CopyBuildScript.txt and is called upon in the preceding script block.
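If you prefer not to depend on Beyond Compare, a similar source-asset filter can be sketched with Python's standard shutil module. The extension list below is an assumption based on the asset types named above, not the contents of CopyBuildScript.txt:

```python
import shutil

# Source formats that should not end up in a shippable build.
# Note: .tif files are deliberately NOT filtered here, because the
# resource compiler still needs them to generate the .DDS textures;
# they are removed in a later clean-up step.
SOURCE_ASSET_PATTERNS = ("*.psd", "*.max", "*.ma", "*.mb", "*.ztl")

def copy_build_data(source, destination):
    """Copy the repository into the work folder, skipping source assets."""
    shutil.copytree(source, destination,
                    ignore=shutil.ignore_patterns(*SOURCE_ASSET_PATTERNS))
```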
If Beyond Compare 3 is installed in a different location on your computer, you will need to adjust the path to BComp.com in the script. If you are using a different folder diff tool, you can replace this part completely with the appropriate calls to your tool.
Another function the script performs is removing the read-only flag on all copied files. Many version control systems, such as Perforce, set a read-only flag on all files in the local repository unless they are checked out; the script takes care of this. The files need to be writable so that the build script can compile, pack, and delete files later.
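In Python, the same read-only clean-up could look like the following sketch (the function name is hypothetical; the downloadable batch bundle handles this step differently):

```python
import os
import stat

def make_tree_writable(root):
    """Recursively add the owner-write bit to every file under root,
    so later build steps can modify, pack, and delete the copied files."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            mode = os.stat(path).st_mode
            os.chmod(path, mode | stat.S_IWRITE)
```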
echo.
mkdir "%BuildWorkPath%\Logfiles"
cd %BuildWorkPath%\Code\Solutions
echo Compiling 64 Bit
"%VisualStudioPath%\devenv.com" CryEngine.sln /rebuild "Profile|x64" > "%BuildWorkPath%\Logfiles\Log_64Bit.txt"
echo Compiling 32 Bit
"%VisualStudioPath%\devenv.com" CryEngine.sln /rebuild "Profile|Win32" > "%BuildWorkPath%\Logfiles\Log_32Bit.txt"
echo Done.
The first half of this script block creates a subfolder within the build folder to store the logfiles in. This is optional but can be very useful if there are any compilation errors for either code or assets. It is a good first place to start looking in when a build fails.
Next, the command-line version of the Visual Studio compiler is called upon to compile first the 64-bit and then the 32-bit version of the project.
The path to the Visual Studio installation was set at the top of the script to the default installation location. If your Visual Studio is installed in a different directory, you will need to adjust this path.
You will also need to replace the CryEngine.sln filename with your own solution filename should you change it. Logfiles will be saved to the Logfiles subfolder. Note that the FreeSDK release solution file is usually named differently than the one in the full source release.
At the time of writing this book, the default solution file for the CryENGINE release requires the Visual Studio 2010 compilers. Future releases of CryENGINE might require Visual Studio 2012 or up. You will need to adjust the folder path in this case.
Compiling the code will create temporary files, such as *.pdb files containing debug information. These files need to be removed, as they should not be part of a release candidate. They would increase the build size and could potentially be used to reverse engineer your code.
echo.
echo Removing temporary files from Code Build
cd %BuildWorkPath%\Bin32
del *.lib
del *.pdb
del *.exp
del CryAction.map
cd %BuildWorkPath%\Bin64
del *.lib
del *.pdb
del *.exp
del CryAction.map
cd %BuildWorkPath%
rmdir /S /Q BinTemp
These instructions are for a full source build of CryENGINE. If only the game code is compiled, the lines concerning CryAction.map can be removed.
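The block of del commands above could also be expressed as a small Python clean-up step. This sketch mirrors the batch logic, with the patterns taken from the script above:

```python
import glob
import os

# Intermediate linker and debug files that must not ship in a
# release candidate (same list as the batch script's del commands)
TEMP_PATTERNS = ("*.lib", "*.pdb", "*.exp", "CryAction.map")

def clean_binary_folder(folder):
    """Delete compiler leftovers from a Bin32 or Bin64 folder."""
    for pattern in TEMP_PATTERNS:
        for path in glob.glob(os.path.join(folder, pattern)):
            os.remove(path)
```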
echo.
echo Compile Assets
cd %BuildWorkPath%
echo Compiling Objects...
.\Bin32\RC\rc.exe .\Game\Objects\* /p=PC /ext_dds /ext_cba /recursive /threads=cores /processes=cores > ".\Logfiles\AssetCompilationLog_Objects.txt"
echo Compiling Libs...
.\Bin32\RC\rc.exe .\Game\Libs\* /p=PC /ext_dds /ext_cba /recursive /threads=cores /processes=cores > ".\Logfiles\AssetCompilationLog_Libs.txt"
echo Compiling Textures...
.\Bin32\RC\rc.exe .\Game\Textures\* /p=PC /ext_dds /ext_cba /recursive /threads=cores /processes=cores > ".\Logfiles\AssetCompilationLog_Textures.txt"
echo Compiling Materials...
.\Bin32\RC\rc.exe .\Game\Materials\* /p=PC /ext_dds /ext_cba /recursive /threads=cores /processes=cores > ".\Logfiles\AssetCompilationLog_Materials.txt"
echo Compiling Levels...
.\Bin32\RC\rc.exe .\Game\Levels\* /p=PC /ext_dds /ext_cba /recursive /threads=cores /processes=cores > ".\Logfiles\AssetCompilationLog_Levels.txt"
echo Done.
The preceding script makes several calls to the resource compiler to compile the assets; the compiler is located by default in the RC folder under Bin32 and ships with every version of CryENGINE. The resource compiler's output is piped into a separate text file inside the Logfiles folder for each asset category. Asset compilation runs over all data files (geometry, images, and so on) and processes them. For images, this means compiling and converting them into platform-specific, optimized .DDS textures.
The XML file that contains the job description is provided with the CryENGINE build. However, if you have changed your default game folder from GameSDK to a folder of a different name, you will need to open the RCJob_Build_SDK_no_scripts.xml file in a text editor and adjust the folder name in the default properties at the top of the file.
This step in the build process usually takes the longest. Using a build server with multiple CPU cores can speed up the process significantly. While giving a definite speed advantage, multiple threads will create a less detailed log output. To get a log entry and error codes for every asset that was processed, remove the parameters /threads=cores /processes=cores from the calls in the preceding script.
To get a list of all the console parameters for the resource compiler in a text file, type ./Bin32/RC/rc.exe /help > RCCommands.txt in a console in the build's root folder. It will create a file called RCCommands.txt in the root, containing documentation for each parameter.
For projects that create automated builds several times a day, it is sensible to create a separate build script that will only compile the code and copy the already compiled assets from the last build into it, for a faster turnaround time.
echo.
echo Remove redundant tif files
cd %BuildWorkPath%\Game
copy "%BuildBotPath%\delete_redundant_tifs.py" .\.
delete_redundant_tifs.py
del delete_redundant_tifs.py
After the processing of the assets, the resource compiler will have created a .DDS file for every .TIF file it could find in the directories. The .TIF files are considered source files and should not be shipped to end users. They are also usually rather large and would unnecessarily bloat the build. The Python script delete_redundant_tifs.py traverses the game folder and its subfolders and removes all .TIF files that have a corresponding .DDS file. This script is included in the script bundle available for download and requires Python to run.
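The core idea of delete_redundant_tifs.py can be sketched in a few lines; the bundled script may differ in detail, so treat this as an illustration only:

```python
import os

def delete_redundant_tifs(game_folder):
    """Remove every .tif that has a compiled .dds sitting next to it.

    A .tif without a matching .dds is kept, since deleting it would
    lose a source texture that was never compiled.
    """
    removed = []
    for dirpath, _dirnames, filenames in os.walk(game_folder):
        lowered = {name.lower() for name in filenames}
        for name in filenames:
            stem, ext = os.path.splitext(name)
            if ext.lower() == ".tif" and (stem.lower() + ".dds") in lowered:
                os.remove(os.path.join(dirpath, name))
                removed.append(name)
    return removed
```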
After this clean-up step, the assets need to be compressed into .PAK files as follows:
echo.
echo creating PAK files and deleting original folders
cd %BuildWorkPath%\Game
echo.
echo Creating Objects.pak
"%WinRarPath%" a -r Objects.zip .\Objects
rename Objects.zip Objects.pak
rmdir /S /Q Objects
echo Creating Animations.pak
"%WinRarPath%" a -r Animations.zip .\Animations
rename Animations.zip Animations.pak
rmdir /S /Q Animations
echo Creating GameData.pak
"%WinRarPath%" a -r GameData.zip .\Entities .\Libs .\Scripts .\Prefabs .\Fonts
rename GameData.zip GameData.pak
rmdir /S /Q Entities
rmdir /S /Q Libs
rmdir /S /Q Scripts
rmdir /S /Q Prefabs
rmdir /S /Q Fonts
echo Creating Sounds.pak
"%WinRarPath%" a -r Sounds.zip .\Music .\Sounds .\Languages .\Localized
rename Sounds.zip Sounds.pak
rmdir /S /Q Music
rmdir /S /Q Sounds
rmdir /S /Q Languages
rmdir /S /Q Localized
echo Creating Textures.pak
"%WinRarPath%" a -r Textures.zip .\Materials .\Textures
rename Textures.zip Textures.pak
rmdir /S /Q Materials
rmdir /S /Q Textures
.PAK files are simply ZIP files with a different extension. Common compression software, such as WinRAR and 7-Zip, can open and create these files. In this example, we use WinRAR to create the .PAK files. After each .PAK file is created, the source folders are deleted with the rmdir commands. This setup will create five different .PAK files, combining the various folders into easily shippable containers.
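Because a .PAK is just a ZIP with a different extension, Python's standard zipfile module can create one directly, without WinRAR. This sketch is an alternative to the batch approach above, not what the book's script bundle does:

```python
import os
import zipfile

def create_pak(folder, pak_path):
    """Zip a folder into a .pak, storing paths relative to the
    folder's parent so the archive unpacks with the folder name intact."""
    root = os.path.dirname(os.path.abspath(folder))
    with zipfile.ZipFile(pak_path, "w", zipfile.ZIP_DEFLATED) as pak:
        for dirpath, _dirnames, filenames in os.walk(folder):
            for name in filenames:
                full = os.path.join(dirpath, name)
                pak.write(full, os.path.relpath(full, root))
```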
It is possible to let the resource compiler compile the assets and create the .PAK files automatically by using a job XML configuration file. Most CryENGINE releases ship with one or more example job XML files inside the RC folder under Bin32.
These example files usually need to be heavily modified before they will work for custom builds. The names and contents of these files change with each release, so they are not used in this build script example, to prevent version conflicts.
After this has finished, the source code can be either packed or removed as follows:
echo.
echo -- Zip the Code folder (and delete) --
cd %BuildWorkPath%
"%WinRarPath%" a -r Code.zip .\Code
rmdir /S /Q Code
The zipped code should of course not be shipped to end users. However, it can be useful to keep the code that was used for compilation as part of the build inside the internal build archive. It sometimes becomes necessary during production to use an older build, for example, for demonstrations or for bug hunting. In these cases, it can be extremely useful to be able to access the code used for the creation of the build quickly and without version control hassle.
After this last step, the build is finished. It can now be moved to the target location, usually on a shared network drive, as follows:
echo.
echo -- Rename Build folder --
cd %BuildWorkPath%
cd ..
move "%BuildWorkPath%" "%BuildTargetPath%"
echo.
echo Build Finished: %date% %time%
echo ====================
echo Build Done.
echo.
The build script developed in this chapter is by no means the only way to create CryENGINE builds. It should serve as a baseline to get you started. During the development of your project, you will most likely want to extend and customize the build scripts to better suit your needs.
The preceding build script will output some status messages onto the console it was started from. For builds started manually, this works fine. To collect the console output as well, a simple wrapper script can call the build script and pipe its entire output into a logfile.
This script will create a logfile called BuildLog.txt and move it into the Logfiles folder of the newly created build after it is finished. The use of such a wrapper script is recommended for automated builds.
@echo off
cls
echo Starting new Build
echo ======================================
echo Logfile will be saved to BuildLog.txt
echo Build Started: %date% %time%
call CreateBuild.bat > "BuildLog.txt"
move "%BuildBotPath%\BuildLog.txt" "%BuildTargetPath%\Logfiles\BuildLog.txt"
echo ======================================
echo Build done.
To create builds at a fixed time during the day, we need to start the build script automatically at specified times. This can be done with the Windows Task Scheduler. This feature allows you to run certain programs or scripts repeatedly at predefined dates and times.
The task scheduler can be accessed via the Windows Control Panel. Go to System and Security and then find the task scheduler under the section called Administrative Tools.
Create a new task by selecting Create Basic Task from the Actions bar on the right. Enter a name and a brief description for the new task, for example, Daily Overnight Build.
Next, you will need to specify a frequency at which to run the build script. For full-time development projects with a dedicated build server, we recommend creating a build at least daily. For hobby projects, or projects with a smaller team size, creating weekly builds might be enough. Creating builds too frequently, with very few changes in between them, is only a waste of hard disk space, so choose according to your project's needs.
Choosing the weekly frequency will give you the most customization options. After selecting this setting, you can choose the individual days of the week on which the task should run. For hobby and mod teams working mainly on the weekend, daily builds from Monday through Friday might not make much sense. Development teams working full-time will probably want to choose a setting of one build daily during the work week, but none on the weekends.
In many development studios, the build server will also host the version control system, internal documentation, and bug tracking services and potentially serve as a network drive.
To prevent the automated builds from causing a performance hit on the server during work hours, it is sensible to schedule the task at a time when it is most likely that no one needs the server for other purposes, such as during the night or the early morning hours.
After the frequency has been set, we can define what action should be performed when the task is executed. Choose Start a Program from the list and click on Next.
The next page of the wizard will ask you to provide the program or script that should be run. Use the Browse button to select the BuildWithLogFile.bat script. Also, set the path that contains the build scripts in the edit field labeled Start in.
Hit Next and Finish to finalize the process. If you want to test your task, you can either set the frequency to a one-time run a few minutes from the current time, or manually select the task from the Task Scheduler Library and then Run from the right-click menu.
The automated task will create clean builds of your project at the set frequency for as long as you keep the task active in the task scheduler. The builds will be collected in the build archive folder you specified in the script. From time to time, you might want to go through that folder and delete older builds or archive them to another storage location.
Don't delete all of your old builds as they can be extremely handy at the end of your development when tracking down persistent and hard-to-find bugs. Keep at least one build from every month of development to help you narrow down the timeframe in which a certain bug first appeared. This makes the process of finding the culprit changelist a lot faster.
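This pruning policy can be automated. The following Python sketch selects which builds in an archive folder are worth keeping: the most recent builds plus the first build of every month. The Build_YYYY-MM-DD_NNN naming convention is a made-up assumption for the example, not a CryENGINE standard:

```python
import re

# Assumed (hypothetical) build folder naming convention: Build_YYYY-MM-DD_NNN
BUILD_NAME = re.compile(r"Build_(\d{4})-(\d{2})-(\d{2})_\d+")

def builds_to_keep(build_folders, keep_latest=5):
    """Return the subset of build folder names worth keeping:
    the N most recent builds plus the first build of every month."""
    dated = sorted(f for f in build_folders if BUILD_NAME.match(f))
    keep = set(dated[-keep_latest:])       # most recent builds
    first_of_month = {}
    for folder in dated:
        year, month, _day = BUILD_NAME.match(folder).groups()
        first_of_month.setdefault((year, month), folder)
    keep.update(first_of_month.values())   # one build per month
    return sorted(keep)

builds = ["Build_2014-01-03_001", "Build_2014-01-17_002",
          "Build_2014-02-05_003", "Build_2014-02-20_004",
          "Build_2014-03-01_005"]
print(builds_to_keep(builds, keep_latest=2))
```

Anything not returned by this function is a candidate for deletion or for archival to another storage location.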
CryENGINE is a high-performance 3D engine, which is able to render beautiful scenes at a high frame rate. However, there is always a chance that certain changes or additions made to your game project will affect performance negatively. A line of badly written code or a few unoptimized assets might lower your frame rate considerably.
Being able to catch those performance issues early on is very important. The sooner a performance issue can be identified, the sooner it can be fixed.
Automated performance tests can help you spot those issues as soon as they appear.
CryENGINE has the ability to generate so-called level statistics files, which contain a lot of information about the performance situation of the level. By reviewing and comparing those files, it is easy to identify any existing performance problems. The level statistics file for a level can be generated manually using the savelevelstats console command. Let's give it a try and generate a level statistics file for your level.
Open the level you want to profile in Sandbox.
Type savelevelstats into the console.
Depending on your computer and the size of your level, it might take several minutes to generate the level statistics file.
The savelevelstats command will collect detailed information about every object in your level; hence, the higher the object count, the longer the process will take.
Once the statistics file has been generated, let's open it up and have a look at the details.
Go to the …/TestResults folder.
You can see that the statistics files have been generated as .xml files. You will need to use MS Excel to view the files. If you do not have access to MS Excel, you can just download the free Excel file viewer from http://www.microsoft.com/en-us/download/details.aspx?id=10.
Now, let's have a look at what has been generated. Take a moment and browse through the information contained in the files. You will see that there is a lot of relevant information contained in the documents. Everything from the overall loading time of your level up to the polygon count of individual assets can be found. You can use all this information to create a comprehensive picture of the performance situation of your build.
A more detailed description of the individual entries in the level statistics file can be found in the official CryENGINE documentation found at http://freesdk.crydev.net.
You might have noticed that the level statistics files we just generated, although very comprehensive, only show a snapshot of the performance situation of our level. They provide us with performance data from just the point in time at which they were created.
Only collecting performance data over a longer period of time will allow you to see certain trends and developments.
Once your build scripts are set up to generate the level statistics automatically, you could also combine the gathered information in one easy-to-read document.
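As a sketch of such a combined document, the following Python snippet parses a set of level statistics snapshots and prints a simple trend table. Note that the XML tags and attributes used here (LevelStats, Summary, loadTimeSeconds, drawCalls) are placeholders for illustration only; the real savelevelstats schema differs between engine versions, so adapt the element names to the files your build actually produces:

```python
import xml.etree.ElementTree as ET

# Placeholder schema for illustration; not the real savelevelstats format.
SAMPLE = """<LevelStats build="{build}">
  <Summary loadTimeSeconds="{load}" drawCalls="{calls}"/>
</LevelStats>"""

def summarize(stats_xml: str) -> dict:
    """Extract a few headline numbers from one statistics snapshot."""
    root = ET.fromstring(stats_xml)
    summary = root.find("Summary")
    return {
        "build": root.get("build"),
        "load_time_s": float(summary.get("loadTimeSeconds")),
        "draw_calls": int(summary.get("drawCalls")),
    }

# Combine snapshots from several builds into one trend table
snapshots = [SAMPLE.format(build=b, load=t, calls=c)
             for b, t, c in [("1001", 42.5, 2100),
                             ("1002", 44.1, 2350),
                             ("1003", 51.8, 3100)]]
trend = [summarize(s) for s in snapshots]
for row in trend:
    print(f"{row['build']}: {row['load_time_s']}s load, {row['draw_calls']} draw calls")
```

A jump between two consecutive rows of such a table points you directly at the build, and therefore the changelists, that introduced a performance regression.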
One of the bigger challenges when working with the CryENGINE codebase is integrating new versions of the CryENGINE SDK. Crytek's engine team is constantly changing and updating the engine. New features get added while others get optimized or updated. New versions of the SDK are released multiple times per year.
Although there is no need to upgrade and integrate those new versions into your project's codebase, you usually want to upgrade sooner or later, since the updates contain many new features and optimizations that will benefit your game.
Depending on the size and scope of your project, integrating new versions of the SDK can be anything from easy to very difficult.
In the next section, we will discuss the best possible way to integrate new versions of the CryENGINE SDK.
When incorporating updates into your build, you will need to merge the changes made by Crytek with the changes made by your development team. This process is called integration. Doing this manually is no small task even in a small production environment, and becomes nearly impossible in larger teams due to the sheer number of changes made, especially to the source code.
A version control system can make this process a lot easier, as it comes equipped with integration and merging tools. If versioning was used from the start of a project, the software will have tracked a history of all changes and know how to merge them together. We will briefly describe how to set up your repository to make integration of a new CryENGINE release as easy as possible.
When starting with a new project, the first repository you should create is one for the unmodified CryENGINE version, the so-called vanilla build. Unless you are planning to use the assets shipped with the SDK, this repository should only contain the Libs folders from the release.
Do not check in any of your project changes into this repository. This depot will only serve to track the changes between the individual releases of CryENGINE.
Create a new branch from the engine depot that you created. The steps to create a new branch are different in every version control system. In Perforce, you can create new branches directly via the GUI. Right-click on the depot folder containing the engine and choose Branch Files.... Then specify a name for the new target branch and confirm by clicking on Branch. You can now start making your project-specific changes and also check in your project's assets.
If you have already started on a project and don't have a version history, you can still follow this workflow. Check in the vanilla build and create a project branch as described. Then delete the contents of the newly created local project folder and copy your current project into the same folder. Then choose Reconcile Offline Work in the Perforce GUI to create the version history.
When a new version of CryENGINE is released, you will need to perform two steps. First you need to update the engine branch in your depot and then you need to integrate and merge those changes into your project's branch.
The cleanest way to update your engine branch is to delete the local folder and replace it with the new release. Then let the version control software run a check on the differences. This works differently in each version control system. In Perforce, you can right-click on the engine depot folder in the GUI and choose Reconcile Offline Work. This will generate a changelist with the engine update.
Once this changelist is checked in, you need to integrate it into your project's repository. With Perforce, this is done by right-clicking on the engine branch folder and choosing Merge/Integrate. Choose your project's depot as the target folder and select Merge. This will generate a new changelist with the relevant changes. Right-click on the changelist and select Resolve Files to merge the changes between your project and the engine update.
Before you can check in this changelist, you will need to resolve any conflicts. A conflict happens when the version control software doesn't know how to merge a file that has been changed both by your project team and by the CryENGINE update. This can happen if the same line of code in a source file has been changed by both, for example. In this case, you need to merge the file manually. If there are any conflicts in your changelist, you will be notified after the preceding step so that you can resolve them.
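To illustrate when a merge succeeds automatically and when it conflicts, here is a deliberately naive line-based three-way merge in Python. Real tools such as Perforce's resolve operate on diff hunks rather than individual lines, but the decision logic is the same in spirit: take whichever side changed a line, and flag a conflict only when both sides changed it differently:

```python
def three_way_merge(base, ours, theirs):
    """Naive line-based three-way merge of three equal-length line lists.
    Returns the merged lines (None where unresolved) and conflict indices."""
    merged, conflicts = [], []
    for i, (b, o, t) in enumerate(zip(base, ours, theirs)):
        if o == t:               # both sides agree (changed or unchanged)
            merged.append(o)
        elif o == b:             # only their side (the engine update) changed it
            merged.append(t)
        elif t == b:             # only our side (the project) changed it
            merged.append(o)
        else:                    # both changed the same line differently
            merged.append(None)
            conflicts.append(i)
    return merged, conflicts

base   = ["int health = 100;", "float speed = 1.0f;"]
ours   = ["int health = 150;", "float speed = 1.0f;"]   # project change
theirs = ["int health = 100;", "float speed = 2.0f;"]   # engine update
merged, conflicts = three_way_merge(base, ours, theirs)
print(merged, conflicts)  # both edits merge cleanly; no conflicts
```

In this example each side touched a different line, so the merge resolves automatically; had both sides edited the same line, that line would be reported as a conflict requiring a manual decision.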
Having quality assurance (QA) procedures in place for your game project is important and will help to keep an overview of the problems and issues in your game. No matter how thorough your work is and how prepared you are, there will always be bugs. The quality assurance part of your production pipeline is not supposed to make sure there are no bugs, but rather help you decide how to manage the bugs you will encounter in the course of your project.
The QA workflow of your production pipeline should be tailored to the size of your project. A small two-man team working on a CryENGINE mod will require a different approach than a 100-man team working on an AAA title. Depending on your team and project size, your solution could be anything from simply writing bugs down on a piece of paper to entering bugs into an online bug tracking system.
First, let's have a look at how professional large scale teams set up their QA pipeline. In those types of production environments, found in most AAA studios today, a bug will go through many stages before it is finally fixed.
You can see that this process is rather complex, involving many stages and possibly several different people.
The benefit of this process, often employed by larger teams, is that it is very thorough. Working this way will make sure that almost every bug gets caught, documented, and fixed.
This is a good approach if you work with a development team of a larger size and need to handle larger amounts of bugs. The bug tracking system you decide to use will depend on your specific needs and your available budget.
If you are working in a smaller team, maybe with just one or two other people, your QA approach will be a bit different. It might be sufficient for you to just write down the issue in an issue tracking system and then revisit and fix the bug at a later time.
Simply saying "I'll remember this bug and will fix it later" might sound easy but is not practical. No matter how thorough you are, if you do not track your issues properly, you will eventually end up shipping your game with them.
When you create reports about new issues and bugs for your project, it is important to include as much relevant information as necessary. This will make life a lot easier for the person who has to fix the issue.
A badly written bug report can cause confusion and cost additional time. When you enter new bug reports into your bug tracking system, make sure you include all the information that is relevant in order to identify and fix the issue.
Relevant information for a CryENGINE issue includes the following:
CryENGINE build and version number
System specifications (hardware, DirectX version, and so on)
The crash log and call stack if available
The level name related to the problem (if applicable)
In addition to this, the issue should also be described as accurately as possible. The more information included, the easier it will be to fix the issue.
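This checklist can also be enforced programmatically before a report is accepted into the tracker. The following Python sketch validates that a report contains the required information; the field names are made up for the example and are not tied to any specific bug tracking system:

```python
# Hypothetical field names modeled on the checklist above
REQUIRED_FIELDS = ["build_version", "system_specs", "steps_to_reproduce",
                   "observed_result", "expected_result"]
OPTIONAL_FIELDS = ["crash_log", "call_stack", "level_name"]

def missing_fields(report: dict) -> list:
    """Return the required fields that are absent or empty."""
    return [f for f in REQUIRED_FIELDS if not report.get(f)]

report = {
    "build_version": "3.5.8 (Build 2316)",
    "system_specs": "Win7 x64, GTX 680, DirectX 11",
    "steps_to_reproduce": "Load the level and fire the shotgun twice.",
    "observed_result": "Editor crashes.",
    # "expected_result" intentionally left out
}
print(missing_fields(report))  # ['expected_result']
```

A check like this, run by a report form or a submission script, catches incomplete reports at entry time instead of during triage.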
In this chapter, we focused on the different aspects of the CryENGINE production pipeline. You learned about the importance of planning and setting up essential elements of your pipeline including version control systems, automated builds, and quality assurance processes.
Following the guidelines and techniques detailed in this chapter will make it a lot easier for you to set up and maintain your project. You will be ready to scale your project with your team size, integrate engine updates, and stay ahead of any potential problems.
Remember that a stable and robust production pipeline is one of the most underestimated, yet most important components of your project.
Now, that all this preparation work regarding your pipeline is taken care of, things will get a bit more hands-on in the next chapter. You will get an introduction to the CryENGINE input system and learn how to modify and set up your game controls and input methods.