In this blog post we are doing something a bit different. Our tech team will offer some insights into how we tackled texturing issues alongside Graphine in order to make sure we managed to run the game on a wider array of graphics hardware while still maintaining visual quality. While this is primarily a post targeted at other developers, it might be interesting to a wider audience to get a glimpse into the development process and challenges.
In Conan Exiles we use a large number of high resolution textures (4K, some 2K, even a couple of 8K ones). We have a pretty large open world, and numerous items, placeables, building pieces and characters, which all adds up to a significant amount of memory. Disk space is not a very critical resource, but VRAM (or shared memory on consoles) is in very short supply, so we knew early on that we were going to have issues getting this to fit in memory. As expected, a few months from our Early Access release we ended up with blurry textures on machines with 2GB of VRAM, and on 1GB machines we would either fail to allocate GPU resources entirely or suffer massive stalls as memory was swapped in and out of system memory. Even on 4GB setups we would still notice the odd blurry texture.
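To give a rough sense of the arithmetic involved (the texture counts and compression rate below are illustrative round numbers, not our actual asset list), here is a small sketch of how quickly texture memory adds up:

```python
def texture_size_mb(resolution, bytes_per_texel=1.0, mips=True):
    """Approximate VRAM footprint of a square compressed texture.

    bytes_per_texel=1.0 corresponds to a BC7/DXT5-class block format;
    a full mip chain adds roughly one third on top of the base level.
    """
    size = resolution * resolution * bytes_per_texel
    if mips:
        size *= 4.0 / 3.0  # geometric series: 1 + 1/4 + 1/16 + ...
    return size / (1024 * 1024)

# Illustrative budget: a few hundred unique 4K and 2K textures already
# approach the capacity of a 2GB card, before render targets, meshes
# and everything else the frame needs are even accounted for.
total = 150 * texture_size_mb(4096) + 400 * texture_size_mb(2048)
print(f"~{total:.0f} MB of texture data")
```

With a full mip chain, a single compressed 4K texture is on the order of 21 MB, so the totals climb fast.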
Above: [Blurry textures on 2GB VRAM machines]
Our options were:
On paper, Granite sounded like exactly what we needed: not only would it give us better visual quality on lower end PCs, it would also let us limit our texture pool size significantly, which is extremely valuable on consoles, where you only have around 5GB of shared system and video memory at your disposal. In practice, there was a major blocker: the majority of our textures and materials were already created and set up in Unreal, while Granite required data to be set up in a different way, through their tool, which would convert the textures into their streaming tiled format.
To resolve this issue, the Graphine engineers accelerated the development of a new import workflow for Unreal, which uses material graph nodes in materials / master materials to find all relevant source textures and convert them into their format. This made it possible for us to get Granite integrated and working with only limited asset work.
The new process involved replacing the regular texture sampling nodes in our Unreal materials with these new Granite texture streaming nodes, which take as input the same source textures as the original sampling nodes. This does not make the material use Granite immediately (there is no generated Granite data yet), but it allows the material to be “baked”, which in turn makes the shader use the Granite VT sampling code and converts the relevant textures to the Granite format.
The Granite streaming nodes can take multiple textures as input (up to 4) as long as they use the same UVs. During the bake process, each of these texture stacks (or Granite nodes) is tiled and the tiles are compressed, before being stored in an intermediate file (GTEX), which can be cached to accelerate future bakes of these textures. This data is then used to generate page files (GTP), each of which contains a number of nodes. The size and setup of these pages is chosen to get the most optimal streaming performance.
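As a toy illustration of the tile-and-page idea described above (the tile and page sizes below are made up for the example, not Granite's actual values), this sketch counts how many tiles a full mip chain of one 4K texture stack produces and how they might be grouped into page files:

```python
import math

TILE = 128        # texels per tile side (illustrative, not Granite's value)
PAGE_TILES = 64   # tiles grouped into one page file (illustrative)

def tile_grid(width, height):
    """Number of tiles needed to cover one mip level of a texture stack."""
    return math.ceil(width / TILE) * math.ceil(height / TILE)

def total_tiles(width, height):
    """Tiles across the whole mip chain, down to a single tile."""
    count = 0
    while True:
        count += tile_grid(width, height)
        if width == 1 and height == 1:
            break
        width, height = max(1, width // 2), max(1, height // 2)
    return count

tiles = total_tiles(4096, 4096)
pages = math.ceil(tiles / PAGE_TILES)
print(tiles, "tiles packed into", pages, "page files")
```

The point of paging is that the streamer can then fetch a handful of nearby tiles with one read instead of thousands of tiny ones, which is why the page size and layout matter for streaming performance.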
The bake process also modifies the material UAssets to use the Granite VT data that was generated.
Luckily for us, given our material instance count, the system supports editing only the master materials and baking all instances in one operation. There are, however, a couple of caveats:
At this point we need to very briefly go over our process for making versions of our game:
Source code for our game is stored in a Perforce server. Data is stored in a custom source control system we call “dataset system”.
We have a build system comprised of a bunch of machines which will, on demand, compile our game, run an Unreal cook of the data, and sign the binaries (among a few other steps) before submitting the resulting packaged game to Perforce and labelling it with the version number. We then have some scripts which fetch this build and push it to Steam for testing and eventual release.
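The pipeline above can be sketched roughly as follows; the step names and commands are illustrative placeholders, not our actual build tooling:

```python
def build_steps(version):
    """Ordered steps our build machines run for one version.

    Commands shown are illustrative placeholders for the real
    tooling invocations, which are more involved.
    """
    return [
        ("compile", ["msbuild", "Game.sln", "/p:Configuration=Shipping"]),
        ("cook",    ["UE4Editor-Cmd.exe", "Game.uproject", "-run=Cook"]),
        ("sign",    ["signtool", "sign", "Game.exe"]),
        ("submit",  ["p4", "submit", "-d", f"Packaged {version}"]),
        ("label",   ["p4", "label", f"build-{version}"]),
    ]

def dry_run(version):
    """Print the pipeline without executing anything."""
    for name, cmd in build_steps(version):
        print(f"[{version}] {name}: {' '.join(cmd)}")

dry_run("1.0.0")
```

Keeping the steps as data rather than a monolithic script makes it easy to run a subset on a given build machine, which matters once a bake step has to be slotted into the sequence.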
The original idea for Granite was to have the artists bake the materials they edit and, if changed, submit the intermediate GTEX files to source control, but not the actual bake result data. When packaging a version, the build system would then fetch those GTEX files and bake all assets, using the resulting Granite data when cooking the content. The hope was that the GTEX intermediate files would make the bake process fast enough to run every time we publish a version.
Unfortunately, this did not work: artists did not want to deal with baking materials locally, as it could take minutes and was an error-prone process. On top of that, the Bake All process was still much too slow to run for every publish, even with the GTEX files available.
Our main goals were to add no overhead for the art team and no significant time increase to our build process. To achieve that, after some back and forth, we came up with the following workflow:
Figuring out the best way to handle this took one of our engineers and our tech artists a couple of months, but it allowed us to integrate Granite extremely late in development, largely transparently to the bulk of the development team, and to solve our issues with texturing. There is of course a lot more to cover on the topic, especially regarding how we updated our materials to get the best balance of visual fidelity, performance, and memory usage. We might try to cover that in a subsequent post. In the meantime, if you have any questions, feel free to ask!