Cyberpunk 2077: A World Full of Substance

CD PROJEKT RED shares the construction of their texturing pipeline

One of 2020’s most anticipated games, Cyberpunk 2077, released just a few days ago. Needless to say, within the first few minutes of playing the game you’ll see, quite literally, what the hype is all about. Night City, set in a troublesome future, is visually breathtaking, and the sheer size of the map makes us wonder how exactly CD PROJEKT RED managed to depict a dystopian future while reinventing the cyberpunk genre. We were very fortunate to talk to Technical Art Director Krzysztof Krzyścin, who will take us through the integration of Substance Painter and Substance Designer in the studio’s material creation pipeline.

Krzysztof Krzyścin: Hello community! As a Technical Art Director, I am focused mostly on research and development of art-related pipelines and rendering techniques, both within our game engine, the REDengine, and in all external content creation programs we use in our studios. I am interested not only in giving artists the possibility to push the quality of game visuals but also in making content creation effective and streamlined. For Cyberpunk 2077 we’ve been developing a quite long list of features. Naming a few: multilayered shaders pipeline, procedural geometry systems, weather (with volumetric clouds!), a material library, destruction – including vehicle and foliage, long-distance crowds and vehicles, proxy systems, the world map, almost all simulations, and in-game shaders.

We’ve also developed custom pipelines and tools for multiple content creation programs, some — like Substance Painter and Substance Designer — are deeply integrated into our content creation pipeline, allowing artists to create amazing details for characters, vehicles, and environments.

Cyberpunk 2077 is one of the most anticipated games of the year. How does it feel to work on such a project? 

I like to think about it as a journey, kind of like mountain climbing. In the beginning, you need to prepare, gather the right equipment, and last but not least — choose the right company. Of course, it’s usually rainy and cold, and the path is long and difficult, but it’s very rewarding at the same time. Especially near the end, when you look back and see how it all started to work together. And the positive reception so far (both from journalists and users) gives us a huge confidence boost that the game nails the definition of both Cyberpunk and the cyberpunk genre quite well, and that everybody who’s excited about the game will soon see that excitement realized when they get their hands on it.

What makes Cyberpunk 2077 different than previous games you have worked on? 

The scope of the technical features we needed to add to our engine to handle such a densely populated — but still freely explorable — vertical city was a big leap forward for us. We had to pack far more detail per square meter, not only due to the first-person camera but also because of the many dynamic scenes now happening all around you. We call it seamless gameplay because you — playing as an up-and-coming mercenary named V — can interact with the world while a scene plays. For example, you can enter a car and have a conversation with a fixer — someone who will provide you with certain jobs — while riding through Night City itself and seeing the city go by in real time. Another thing we had to add was player customization. The sheer number of customization assets — all the clothing, weapons, and mods, combined with unique hairstyles, tattoos, cyberware, and vehicles — creates a practically infinite number of appearances for players to choose from.

Building environments also changed drastically for us. The Witcher 3 was full of natural landscapes with a few bigger cities — Novigrad being the largest. Night City is vastly denser, with many more buildings in different architectural styles. With Cyberpunk 2077, we needed to completely change the way we assemble worlds, assets, and communities. We needed to develop a road system, traffic and crowd management, a brand-new hierarchical prefab pipeline, global illumination that handles both outdoors and indoors, a first-person camera, and much more! We knew making this game was going to be a huge undertaking — but we’re all up for a challenge here.

Tell us more about the art direction of the game. What is the visual story you want to convey? 

Cyberpunk 2077 takes place in an alternate, very troublesome future. It’s a future where the world is facing ongoing threats from environmental extinction, global warming, overpopulation, and raging inequality that promotes the rich but forgets the poor. But humanity is good at adapting; evolving. People embraced cybernetic technology to adjust to the dangers around them, and society is still alive and kicking.

From the very beginning of production, we wanted to touch on the visual style of the classic cyberpunk genre, but really modernize it and push the envelope a little further. Everlasting rain at night filled with neon lights is an extremely iconic image, and one strongly associated with the cyberpunk aesthetic, but we wanted to create memorable moments of our own: mesmerizing sunrises, thick and polluted fog covering the tops of brutalist buildings, the scorching sun of the desert, the night skyline of the city — and then the districts themselves, each with a unique palette of shapes and colors, from glittering neons to the harsh glow of abusive advertisements. Our plan was to make Night City itself THE character in this game.

Why did you decide to use Substance? How did it help you serve art direction? 

From the early concepts, it became apparent that the scale of the project would push us to develop new ways of creating materials. Quick estimates of texture memory and approximate disk size came out unacceptable for any reasonable hardware. It was a radical move, but we decided to apply layering concepts to almost all our models — from FPP items like weapons, through characters and environmental objects, even to terrain nodes. The idea is that we don’t store any unique textures, but rather a library of common materials that are blended at runtime based on a special mask. That mask is still unique for each object, but since most of the time it’s very sparse, we’re able to pack it into a much smaller data footprint. This improves the efficiency of the art pipeline, as artists now only need to paint these material layers, rather than compose and flatten multiple textures together as they used to. This also changes how the meshes are rendered, as each pixel needs to compute and flatten its layers.
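
The core idea above — a shared material library plus a sparse per-asset blend mask instead of unique textures — can be sketched in a few lines. This is a simplified illustration with hypothetical names and data, not CD PROJEKT RED’s actual implementation:

```python
# Sketch of runtime material layering: each asset stores only a sparse mask
# that references shared library materials, blended at lookup time.
# Material ids, colors, and the API shape here are all illustrative.

# A shared material library: material id -> albedo color (RGB, 0..1).
MATERIAL_LIBRARY = {
    "concrete": (0.55, 0.55, 0.52),
    "rust":     (0.45, 0.20, 0.08),
    "steel":    (0.60, 0.62, 0.65),
}

def resolve_pixel(layers):
    """Blend library materials for one pixel.

    `layers` is the per-pixel slice of the sparse mask: a list of
    (material_id, weight) pairs. Most pixels reference only one or two
    materials, which is what makes the mask compress so well.
    """
    total = sum(w for _, w in layers)
    r = g = b = 0.0
    for mat_id, w in layers:
        cr, cg, cb = MATERIAL_LIBRARY[mat_id]
        r += cr * w / total
        g += cg * w / total
        b += cb * w / total
    return (r, g, b)

# A mostly-concrete pixel with a touch of rust:
pixel = resolve_pixel([("concrete", 0.8), ("rust", 0.2)])
```

Because the mask stores only references and weights, adding a new material to the library costs nothing per asset — which is what lets the template library be iterated on without touching the assets that use it.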

Of course, this process needs to be extremely efficient. Layering gives us the possibility to achieve higher texel density at a fraction of the original disk size, effectively packing very high-resolution details into surfaces. After a few iterations, both with artists and with our rendering team, we gained confidence that this would work, and we started to build our material library. We integrated our subset of tools as a plugin into Substance Painter and rolled it out to a few artists — and they instantly liked the approach. By the time we presented the first gameplay video, thousands of assets were already using multi-layered shaders, contributing greatly to the game’s visual quality.

Can you give us a quick overview of the different tools used in your pipeline? 

We use Helix/Perforce as our data repository and Visual Studio for code and scripts — all managed by our own build system. We use Houdini for procedural content, simulations, and special effects. Artists work mostly in Maya and 3ds Max, and our layered pipeline goes through Substance Painter and the Remote Browser. Material library textures are made in Substance Designer, and all the material templates are edited using an internal app (the Material Library Manager). Most of the artist tools are also exposed via custom shelves or plugins, so they are accessible from different programs. Game resources are created in REDengine and can be edited by a handful of different editors. And last but not least, for project management we use Jira.

How do you create your library of materials? Can you describe your use of Substance Designer? 

The material library was created almost entirely inside Substance Designer (either from scratch or using scanned data as a starting point). Each material template consists of a graph that outputs albedo, normal, roughness, and metalness textures, all tileable in both axes. In addition to creating the materials in Substance Designer, we also produced our various microblend textures there. These comprise a heightmap alpha and a normal map used to mask the materials. To support the painting pipeline, Mark Foreman (our Senior Environmental Artist responsible for the material library) created a set of custom generators and filters. Because the masks we produce for our assets are generally smaller than standard asset textures — and because the detail needed for blending between materials is added through the microblends — it was useful to have custom generators that produced very simple, essentially noiseless information to work with inside Substance Painter.

We started from very basic materials like concrete, plastics, and brick walls. As production ramped up, more materials were requested to be added to the library. While we do have some simple surface variations in the material templates themselves, we don’t need — for example — dozens of variations of rusted metal or paisley fabric with color combinations baked in. Instead, we have clean metal and pure rust, or one fabric material with multiple tint options. Combining these with different microblend masks gives us all the variety we need. The bonus here is that the original templates could be iterated on constantly throughout the whole production process without needing to change any assets that actually use them.
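
The combinatorial payoff described above — clean base materials plus tint options instead of baked-in variations — can be illustrated with a toy tint function. The lerp-style tint below is a hypothetical simplification, not the studio’s actual override math:

```python
# Illustrative sketch: N base materials x M tints x K microblend masks
# multiply into far more looks than could ever be stored as unique textures.
# The blend formula and values here are assumptions for illustration only.

def tinted_albedo(base_rgb, tint_rgb, tint_strength):
    """Lerp a base material's albedo toward a tint color (0 = untinted)."""
    return tuple(
        b * (1.0 - tint_strength) + t * tint_strength
        for b, t in zip(base_rgb, tint_rgb)
    )

fabric = (0.5, 0.5, 0.5)          # one neutral fabric template...
red_fabric = tinted_albedo(fabric, (0.8, 0.1, 0.1), 0.7)  # ...many tints
```

Because the tint is applied at runtime, changing a template later propagates to every asset that references it, which is exactly the iteration benefit the answer above points out.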

Can you go into more detail about Substance Painter and the multi-layer shader system you developed?

Our layering solution differs from the one included in Substance Painter: it adds extra masks (microblends) that include a dedicated normal map, allowing for custom in-between layer blending. Thanks to that, we can greatly reduce mask sizes while maintaining sharp details. The system also links its layers to a special material override that pre-defines material variations associated with a given material template. For example, the concrete material has multiple overrides that define a range of environmental weathering, material wetness, and other visual appearances that might affect a generic concrete surface. This allows for a high level of uniqueness in our models while keeping assets consistent across the world.
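
One way to picture how a microblend keeps transitions sharp despite a small mask is a height-thresholded blend: the low-resolution mask shifts a threshold, and a tiling high-frequency heightmap decides, per texel, which side of the threshold each pixel falls on. The following is a hedged sketch of that general technique, with made-up parameter names, not the shipped shader:

```python
# Height-based blend weight for one pixel (illustrative only).
# mask_value comes from the small per-asset mask; height comes from the
# tiling microblend heightmap; contrast hardens the transition edge.

def microblend_weight(mask_value, height, contrast=8.0):
    """Return the blend weight of the top layer for one pixel (0..1)."""
    # The mask shifts the threshold; the heightmap decides, per texel,
    # which side of the threshold the pixel lands on.
    w = (mask_value - (1.0 - height)) * contrast + 0.5
    return min(1.0, max(0.0, w))  # clamp to [0, 1]
```

With a pure grayscale mask the transition would be as blurry as the mask resolution; here the detail of the edge comes from the heightmap, so the mask itself can stay small.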

The custom OpenGL shader made for Substance Painter allows artists to preview models as they paint. This shader mimics the one used in our engine, linking the same textures to the appropriate material layers, so you see the model as you would in-game. We’ve even made a custom tone-mapping solution that matches our in-engine tone mapper, to keep parity between Substance Painter and the engine. The engine shader uses similar logic for layer blending but works differently, since it uses compute shaders to bake visible layers into a runtime texture. And instead of grayscale masks, it uses a special version of what we call data banks — compressed data tiles packed based on their surface coverage area.
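
The “data banks” idea — storing only the tiles of a sparse mask that actually have coverage — can be sketched in miniature. This 1-D toy version uses invented names and a trivially simple layout; the engine’s real format is not public:

```python
# Pack a sparse mask into fixed-size tiles, keeping only covered tiles
# (illustrative sketch; tile size and dict layout are assumptions).

def pack_sparse(mask, tile=4):
    """Return {tile_index: tile_values} for tiles with any coverage."""
    banks = {}
    for i in range(0, len(mask), tile):
        chunk = mask[i:i + tile]
        if any(v > 0.0 for v in chunk):
            banks[i // tile] = chunk
    return banks

def unpack_sparse(banks, length, tile=4):
    """Reconstruct the full mask, filling uncovered tiles with zeros."""
    out = [0.0] * length
    for idx, chunk in banks.items():
        out[idx * tile:idx * tile + len(chunk)] = chunk
    return out
```

Since most of a typical mask is empty, the packed form stores a small fraction of the tiles while round-tripping losslessly — the same reason the real masks compress into a much smaller footprint.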

Tell us more about the Remote Browser tool you created. How does it streamline your workflow between Substance Painter and REDengine? 

Apart from the shaders, we’ve developed a standalone tool called Remote Browser that works as a bridge between our engine and the Substance Painter/Substance Designer plugins. It serves as an interactive layer-editing tool: it can load and unload different material layers and edit their parameters, like color and other overrides, or microblends and tiling. Each change updates the appearance of the opened model (both in our engine and in Substance Painter), tremendously speeding up the creation of model variations. Imagine swapping the bare concrete on a whole building for a painted, covered-with-graffiti version in a few clicks. The tool also manages mask export/import and layer re-assignments if artists decide to change materials later on.

What were the main challenges you faced on this project, and how did you overcome them? 

The sheer scale and complexity of the world was the biggest challenge. We wanted to build up a world that is filled with microscale details and far-reaching vistas. All that with intricate day-night lighting, genre-defining atmosphere, seamless gameplay, and no loading screens in-between. We had to build up the world in a way that we could quickly visualize it from any place, or any time of day, with a handful of different weather conditions. We needed to fill it with believable characters, vehicles (including flying AVs), props, interactive and dynamic scenes, and last but not least — fit all that into reasonable data sizes.

Early on, we decided we’d need to embed scalability into every possible sub-system of the engine. For the most expensive parts, we implemented a long-distance version of the original sub-system. For example, we have a long-distance vehicle-rendering technique (based on simple billboards) that replaces the original vehicle geometry, making it possible to see thousands of cars at once. We combined that approach with hierarchical simplification of all the underlying world geometry, which gave us the possibility to process full detail only within a close radius of the player. The first-person camera influenced many aspects of the game, including storytelling, scenes, general gameplay, and interaction. Of course, all that is just the tip of the iceberg. And we like icebergs; they are majestic 🙂
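
The distance-based swap between full geometry and a cheap billboard can be sketched as a simple selection function. The radius and representation names here are hypothetical placeholders, not the engine’s actual thresholds:

```python
# Minimal sketch of distance-based representation selection:
# full geometry near the player, a billboard far away (values illustrative).

from dataclasses import dataclass

@dataclass
class Vehicle:
    x: float
    y: float

def select_representation(vehicle, player_x, player_y, full_radius=150.0):
    """Pick how to render a vehicle based on its distance to the player."""
    # Compare squared distances to avoid a square root per vehicle.
    dist_sq = (vehicle.x - player_x) ** 2 + (vehicle.y - player_y) ** 2
    return "full_geometry" if dist_sq <= full_radius ** 2 else "billboard"
```

Running this selection over every vehicle each frame is cheap, which is what makes it feasible to keep thousands of distant cars visible at once.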

Are there any tips and tricks with Substance that you can share with the community? 

Sure, a tip from Marcin Klicki (Senior Character Artist): no matter if it’s a character, a weapon, or a vehicle, artists love to use patterns on them — carbon fiber, material threads, polished steel, anodized aluminum. Having a well-organized set of these dramatically reduces the time needed to finalize a model.

Another from Mark Foreman: first up, work clean and organize your graphs! The way production goes, you may need to revisit a material weeks or even months after you created it, and a well-annotated graph will save you time when deciphering decisions you made in the past. Make use of custom nodes: if you find yourself repeating node groups often, it might be time to turn them into a custom node you can reuse in future materials. In Substance Painter, don’t be afraid to make your own generators if you’re not getting the results you need from the default sets. Hand painting can make sense on individual assets, but once you start working across large sets of assets, having a generator or two that gets as close as possible to the result you want can save a lot of time — leaving you free to spend it on the hand-painted details that really count.

What can you say about your overall experience of the use of the Substance toolset on the production of Cyberpunk 2077? 

Looking at the in-game asset count, their diversity, and the details artists managed to squeeze in, I have to say that the overall experience exceeded our expectations. Artists are happy with the workflow and the possibilities it gives them. The pipeline was very stable, effective and bug-free — we had close contact with Adobe developers, and we are looking forward to working with a similar approach on future projects.

I want to thank everybody who contributed to the Cyberpunk 2077 layering pipeline, in CD PROJEKT RED, Adobe, and externally — the list of talented people who worked on it is really long! (possibly even longer than this interview! :))

Meet CD PROJEKT RED

CD PROJEKT RED is a Polish game development studio based in Warsaw. The studio is best known for its highly popular The Witcher series and for creating strong, story-driven games. On December 10, the studio released its latest AAA title, the highly anticipated Cyberpunk 2077.

“Here at CD PROJEKT RED, we are working hard to bring you the best role-playing games on the planet. Storytelling, open worlds, being fair with gamers — and obsessive attention to detail. These are our trademarks. We push for quality, not for quantity.”

Visit CD PROJEKT RED