ExtroForge has a fairly healthy team size (for a hybrid FPS/RTS/building/exploration PC game).  We have engineers, artists, modelers, writers, testers, social media folks, etc.

We decided on a critical mantra early on: ALL team members (not just engineers) would have access to the latest build at all times.  It is simply critical for the various tasks they have to perform that they can see, experience, and capture the latest gameplay elements, visuals, and mechanics.  As ExtroForge is (above all else) a multiplayer experience, we also wanted to ensure that a standalone server running the latest code was always available for testing.

As we were fairly new to the Unreal engine (and there is sparse information out there about any ‘recommended’ techniques), we fell back on processes we were familiar with from other engines and environments.  With several experienced engineers on the team, we reached into our toolbox to create a process that provides a good amount of automation as well as maintainability.

We hope this information has value for other teams out there!

Source Control

ExtroForge currently uses Perforce (http://www.perforce.com/) for source control.  While we had started out with Git (on a locally hosted GitLab instance), we ran into issues unrelated to Unreal at one point and took the opportunity to jump over to Perforce – recognized as the primary repository of choice for the Epic team.  We were also (possibly irrationally) concerned about the binary nature of most Unreal assets (like Blueprints) and felt that Perforce would be the best solution for us.  We took advantage of Perforce’s free 20-user / 20-workspace licensing and installed it on a hosted Windows instance.
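
As an aside, a common recommendation for Unreal projects on Perforce is to set up a typemap so the binary asset types are exclusively locked (avoiding unmergeable concurrent edits).  A minimal excerpt of the sort of entries people add via “p4 typemap” (our exact typemap may differ) looks like:

binary+l //....uasset
binary+l //....umap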

Custom Unreal Engine

ExtroForge has taken advantage of Unreal’s awesome licensing model that allows complete access to the engine source.  We started with a fork of the 4.8 Epic release (https://github.com/EpicGames/UnrealEngine) and have added some ExtroForge-specific changes (currently limited to the procedural mesh elements).  We have successfully merged the 4.9.2 release changes into our fork with no issues thus far.  We keep our engine source on GitHub, and its integration into our automated build process is currently manual – as engine changes are few and far between.  The engine source is pulled from Git into a local directory on the build server and compiled manually via Visual Studio 2013.  We have not made enough engine changes to warrant automating this process yet.
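
For reference, that manual refresh amounts to something like the following, run from the engine directory (the branch name is illustrative; Setup.bat and GenerateProjectFiles.bat are the standard steps for source builds of the engine):

REM update our engine fork, then regenerate the Visual Studio solution
git pull origin release
call Setup.bat
call GenerateProjectFiles.bat
REM then build the "Development Editor | Win64" configuration of UE4.sln in Visual Studio 2013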

Continuous Integration

ExtroForge uses Jenkins (https://jenkins-ci.org/) as our continuous integration mechanism.  Having used it extensively at previous companies/projects, we are well aware of the capabilities (and pitfalls) of Jenkins.  Jenkins polls our Perforce repository for new commits and kicks off a chain of events that results in a standalone client build and a standalone server build – and also launches a fresh server process so team members can connect to a server running the latest codebase.  Currently, our Jenkins process runs on the same box (Windows Server) as the standalone server process, making the deployment and stop/start of the server process easier (although we are no strangers to Jenkins slave nodes).  Note that we have not yet mastered producing a Linux server build from Windows, but the plan is eventually to have the server process run on a ‘nix host.

[Diagram: the Jenkins job chain – polling Perforce, the client build, the downstream copy/sync step, and the server build and relaunch]

Essentially, the process starts with the client build node.  A key setting for us was “Use custom workspace” (under Advanced Project options in Jenkins).  Normally, Jenkins would do its work in a deep directory tree that was often at odds with Unreal for a fairly silly reason: long path names.  Early on, we would often see build failures because our deep project path, combined with long filenames, exceeded an internal Unreal path-length limit (rooted in Windows’ 260-character MAX_PATH).  Using a very short custom workspace path was critical for bypassing this issue.

[Screenshot: the “Use custom workspace” setting]
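
For illustration (the job name here is hypothetical), compare a default Jenkins workspace path with the short custom one that appears in the UAT calls below:

C:\Program Files (x86)\Jenkins\jobs\extroforge-client\workspace\depot\EFPrototype\...   (default workspace: deep)
C:\EF-Depot\depot\EFPrototype\...                                                       (custom workspace: short)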

Using the Perforce Plugin (https://wiki.jenkins-ci.org/display/JENKINS/Perforce+Plugin), we have a few key settings that have made our process smoother:

[Screenshot: Perforce plugin workspace settings]

Specifically: “Let Jenkins Create Workspace”, “Let Jenkins Manage Workspace View” and “Full Wipe”.  The first two are simply because it was easier for us to let Jenkins do all that work instead of establishing the workspace separately using P4 tools.  The “Full Wipe” option was recommended to us by another Unreal team; while it increases build times, it supposedly cuts down on oddball build artifacts (and failed builds) that can otherwise arise.

Our client build node is set to poll our repository every 5 minutes for changes and kick off a build if something new has been committed.

[Screenshot: the SCM polling schedule]
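
The polling schedule uses Jenkins’ cron-style syntax; an every-five-minutes poll looks like this (the H token lets Jenkins spread polling load across jobs):

H/5 * * * *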

And finally – we come to the most critical aspect of the client build node: the actual call to the UAT (Unreal Automation Tool) for the compile/packaging itself.  We use the “Execute Windows batch command” option in Jenkins, and here is what it looks like:


"C:\UnrealEngine-release\Engine\Build\BatchFiles\RunUAT.bat" BuildCookRun -project="C:\EF-Depot\depot\EFPrototype\EFPrototype.uproject" -noP4 -platform=Win64 -allmaps  -build -clientconfig=Development -cook -stage -archive -archivedirectory=C:\Client-Build

Note the reference to the RunUAT.bat file in our own custom engine source directory.  Also note that the resulting files (executables and cooked asset files) are placed in the directory denoted by “-archivedirectory”.  Finally, note the lack of a “-pak” option.  We currently choose to have our resulting client assets in separate files – as opposed to a single large .pak file – for quicker syncs for our team members (see the Standalone Client Sync section below).

Our client build originally took about 30 minutes.  (UPDATE: we recently turned off the “Clean workspace before each build” option, and our builds, which had crept up to 45 minutes, now take about 7 minutes.  We remain ever-alert for issues – but so far, so good!)

Our team is a heavy user of Slack (https://slack.com), and we also have the Slack Notification Plugin (https://wiki.jenkins-ci.org/display/JENKINS/Slack+Plugin) configured to alert us in Slack of build starts, successes, and failures:

[Screenshot: Slack build notifications]

Standalone Client Sync

We want that resulting client build available to our team members at all times.  Because a new build COULD run every 30 minutes or so, the files in the build directory are constantly in a state of flux (and are referenced by the server build as well).  So, as part of the client build, a downstream build node is triggered to copy those files to a safe and stable location for further action.

[Screenshot: the downstream copy step]

Essentially, we delete the contents of that stable client build folder and copy everything over from the most recent client build.
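
A minimal sketch of that step, assuming a stable folder named C:\Stable-Client-Build (a hypothetical name; ours differs) and the WindowsNoEditor folder that UAT stages for a Win64 client build:

REM wipe the stable folder, then copy the freshly archived client build into it
rmdir /S /Q "C:\Stable-Client-Build"
xcopy "C:\Client-Build\WindowsNoEditor" "C:\Stable-Client-Build" /S /I /Y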

Our team members are then able to sync that client build folder to their local PCs.

Because our build server runs Windows, we couldn’t use a traditional rsync process – but we got as close as we could.  On the build server, we run a nifty little tool called DeltaCopy (http://www.aboutmyip.com/AboutMyXApp/DeltaCopy.jsp).  This essentially acts as an rsync service that exposes the stable client folder to our team members.  We’ve packaged up a little tool (which uses Cygwin and rsync) that they can use to sync that folder to their local boxes.  Before we created this process, each team member had to copy down 2.5 GB (the current client size) to their local PC – taking huge amounts of time and bandwidth.  Now, thanks to the slick options rsync provides, only changed files are synced – with binary patching as appropriate.  Our team members can now sync the latest version in mere minutes.

The batch file used looks something like this (read the DeltaCopy and rsync docs for further information on how to configure the server side and what these values mean):

REM destination folder on the team member's PC (Cygwin-style path)
set CLIENT_DIR=/cygdrive/c/latest-extroforge-client
REM password for the rsync daemon module exposed by DeltaCopy
set RSYNC_PASSWORD=syncpassword
REM -avzP: archive, verbose, compress, show progress; --checksum: detect changes by content
rsync.exe --checksum -avzP "username@buildserver::ClientBuild/" %CLIENT_DIR%

We have wrapped the tool up in a single batch file that they keep a desktop shortcut to – any time they want the latest client (or are notified via a Slack update), they can double-click and be on their way to Sorteria in mere minutes!

The Standalone Server

As a team-based multiplayer game, ExtroForge demands that we constantly test and play together on a single server.  Our automated process therefore also builds a standalone server version of the game and launches it for all team members to access.

The server build node in Jenkins is pretty vanilla (no Perforce integration required) – it is simply kicked off after a successful client build.

Here is the meat and potatoes of the server build node:


"C:\UnrealEngine-release\Engine\Build\BatchFiles\RunUAT.bat" BuildCookRun -project="C:\EF-Depot\depot\EFPrototype\EFPrototype.uproject" -noP4 -platform=Win64 -build -clientconfig=Development -serverconfig=Development -cook -stage -pak -server -serverplatform=Win64 -noclient -archive -archivedirectory="C:\Server-Build"

Again, it runs the UAT tool from our custom engine folder – this time with the -server, -serverplatform=Win64 and -noclient flags so that a dedicated server target is cooked and staged (and here we do use -pak, since these files are only copied locally).  The final steps copy the server files to a stable directory and stop/start the game server process:

REM kill any running server instance (harmless if none is running)
start Taskkill /F /IM EFPrototypeServer.exe
REM copy the freshly built server files over the live install
XCOPY "C:\Server-Build\WindowsServer" "C:\Unreal Server" /S /y
REM relaunch the dedicated server, detached from the Jenkins job
start "" "c:\Unreal Server\EFPrototype\Binaries\Win64\EFPrototypeServer.exe"

Conclusion

We didn’t find a lot of information out there about the Unreal build process (outside of the editor) and had to learn much of this by trial and error.  We are glad to give back what we have learned to the community and would be happy to answer any questions.

Our process may not be perfect, and it is evolving every day.  Eventually we will probably create a Gradle or Ant script to encompass all these tasks and have Jenkins use it.  We are also working on parameterized builds, build number integration, etc. – but the process currently suffices for our needs as we grow our team and our project and get closer and closer to a public release.