Today we talk to Andrew and Jon from Red Storm Entertainment. They’ll give a detailed account of how the team used the Substance toolset and custom in-house scripting tools to create a rock-solid texturing workflow for Tom Clancy’s The Division 2.
Andrew: Hello all! I’m Andrew Dixon. I’ve been playing around with Substance Designer since Substance Designer 4, but never got heavy into it till about 2016. I wanted to say thank you to the community at large. It can be difficult to learn a new program for the first time but both the forums and Substance Share have been super helpful in my own experience in learning Substance Designer!
Just please, if you create a thread asking for help or have an issue, and you reply with “I fixed it,” please post the solution! You’ll save me and many others sleepless nights.
Jon: Hey everyone, I’m Jon Lauf. I’m a technical artist and have been working in the video game industry for 20 years. I started out at Treyarch and have been working at Red Storm for the past 7 years, most recently on The Division 2. I’ve been working in Substance Designer and Substance Painter for the last few years helping develop tools and pipelines where needed.
Roles on The Division 2
Andrew: My actual title is Senior Environment Artist, but for The Division 2 I was a full-time Material Artist! It was really exciting to be able to specialize in something and really grow in that field.
Jon: My title is Lead Technical Artist at Red Storm.
Substance in production
Andrew: As I said earlier, I didn’t dig into Substance Designer till about 2016—heck, I even have a few Substance textures in DLC for The Division 1. Man, it’s come a long way; I don’t want to go back to a life without Flood Fill. During our pre-production for The Division 2, one of our main pillars was to increase our generic texture fidelity and variety. We already had artists in multiple studios making the switch over from Photoshop texturing to using the Substance toolset, so the transition was actually pretty clean.
The challenging part was figuring out how to utilize the Substance ecosystem, and what workflows needed to be changed to take full advantage of these new programs. Last I counted, we had around 450 Substance materials. Some are wholly unique, while others are variants of existing ones. It’s hard to believe we created that many materials. I don’t think we could have made this much variety without the use of Substance Designer.
Jon: I’ve been using Substance Designer and Substance Painter for the past few years. I love the flexibility both programs provide. One of our main goals on The Division 2, from a technical perspective, was to keep the material library consistent and within memory constraints. Having a dedicated material team working solely in Substance Designer really allowed us to create a cohesive look across all materials and iterate quickly. Additionally, we built a Python pipeline around the Substance batch tools, which allowed us to keep our materials and custom node library clean and up to date.
Visual art direction
Andrew: With this being a Tom Clancy game, it is very much a photo-realistic art style. Though being in third person, we did have to exaggerate some features of a texture for it to read better. A lot of the help with Substance Designer came from the ability to quickly iterate on textures.
One of the pillars for our art direction was overgrowth and decay. We had something called “the state of the world.” The question posed to us was “What does the world look like after having not been maintained or used for over seven months?” This question dictated the amount of dirt, dust, grime, and overgrowth a texture had. We didn’t want to go as far as something like “I Am Legend,” but we still needed to tell a story.
It took us a good couple months to really drill down how much is too much, and how little is too little. Substance really helped with this process. We were able to quickly iterate on textures with near final results in a matter of minutes. This was changing things like the amount of plant overgrowth, or the density of dirt across a surface.
Our use of the Substance toolset
Andrew: I think we ended up using everything, even Substance Player! Substance Designer was used mostly for generic tiling textures, while Substance Painter was used by our prop artists.
Jon: As Andrew said, we pretty much used the entire Substance toolset. We used Substance Designer to create all of our tiling textures and Substance Painter to paint custom maps for props. We also had a dedicated pipeline for Substance Painter that allowed artists to paint masks in Substance Painter for some of our more elaborate custom shaders. I’ll go more into this pipeline in a bit.
Additionally, we used the Substance batch tools in conjunction with several Python scripts to help maintain our library. These scripts batch-updated our custom nodes to the latest version of Substance Designer whenever we upgraded, and powered a tool that verified that every texture and node a graph referenced was checked into Perforce, so no one would open a graph with missing dependencies.
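A pipeline like the one Jon describes could be sketched as follows. This is an illustration only: the `sbsupdater` flag names and the `<filepath v="…"/>` element shape are assumptions, not Red Storm's actual scripts or the exact .sbs schema.

```python
# Hypothetical sketch: batch-update .sbs files via the Substance batch tools
# and collect the external files each graph references, so that missing
# check-ins can be flagged before anyone opens a graph.
import subprocess
import xml.etree.ElementTree as ET
from pathlib import Path

def referenced_paths(sbs_text: str):
    """Collect external file references from a .sbs document.

    Assumes references appear as <filepath v="..."/> elements, which is a
    simplification of the real .sbs XML schema.
    """
    root = ET.fromstring(sbs_text)
    return [el.get("v") for el in root.iter("filepath") if el.get("v")]

def batch_update(sbs_files, sbsupdater="sbsupdater"):
    """Run the batch-tools updater over every graph (flag names assumed)."""
    for path in sbs_files:
        subprocess.run([sbsupdater, "--inputs", str(path)], check=True)

def missing_references(sbs_path: Path):
    """List references that do not exist on disk relative to the graph."""
    text = sbs_path.read_text(encoding="utf-8")
    return [p for p in referenced_paths(text)
            if not (sbs_path.parent / p).exists()]
```

In a real pipeline, `missing_references` would be followed by a Perforce query rather than a simple existence check.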
As Andrew mentioned, we also used Substance Player quite a bit. We created a custom “grime” Substance Designer graph for our modular building pieces, which would take several mesh maps (position, world space normal, etc.) and spit out a grime map for that geometry. We used that graph in Substance Player so artists could tweak the grime for their building pieces before exporting the final map. This sped up the production of these maps significantly.
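The idea behind that grime graph can be illustrated with a toy function (not the actual .sbs): combine a world-space normal and a baked position into a single grime value, with exposed parameters standing in for the sliders artists tweaked in Substance Player. All parameter names and the weighting here are invented.

```python
# Illustrative sketch of a per-texel grime value derived from baked mesh
# maps (world-space normal + position), the kind of inputs the grime
# graph consumed. Not Red Storm's actual graph logic.
def grime_value(ws_normal, position_y, up_dirt=1.0, ground_dirt=1.0,
                ground_height=0.2):
    """ws_normal: (x, y, z) unit vector with +y up; position_y: 0..1 baked height."""
    nx, ny, nz = ws_normal
    # Upward-facing surfaces collect dust.
    up_facing = max(0.0, ny)
    # Texels near the ground pick up splashed-up dirt.
    if ground_height > 0:
        near_ground = max(0.0, 1.0 - position_y / ground_height)
    else:
        near_ground = 0.0
    return min(1.0, up_dirt * up_facing + ground_dirt * near_ground)
```

Adjusting `up_dirt` and `ground_dirt` per building piece mimics the kind of tweaking artists did in Substance Player before exporting the final map.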
Andrew: The material I’d like to show is our Exposed Tile Floor. This is probably one of my favorites on a technical level.
This is our exposed tile material.
First I like to gather as much reference as I can. One of my coworkers was remodeling his back porch at the time and sent me a bunch of pictures. If you’re curious, I use PureRef to organize and view reference images.
This material is showing the exposed mortar on the substrate floor. In the reference you can clearly see where each tile was laid and how the person laying the mortar made different passes for each tile. To get this effect we need to start off by creating our initial tile floor layout.
I’ll start off by getting the basic information I’ll need later. Using a Tile Sampler, I’ll set the tile size to 0.99; this leaves a small gap between each tile (alternatively, you could make the Tile Sampler produce random grayscale tiles and use an Edge Detect instead). With that, I feed it into a Flood Fill to get individual information about each tile: several maps like a gradient map and a random grayscale color map. Finally, I feed those into a Distance node with the initial Tile Sampler output to fill in the gaps.
The reason I really like this material is that I found a use for the Reaction Diffusion node, which I initially passed off as a neat thing, but not really too useful. The trick is limiting the amount of information going into the Reaction Diffusion node. I’ll show you in the next few steps!
First I’ll get a vertical Gradient Linear 3 and pass that into a Quantize Grayscale node. This will limit the number of grayscale values coming from the gradient node. Here I also use Perlin Noise add/sub’d into the Gradient Linear 3 to give a warping effect.
I’ll then feed that result into the Reaction Diffusion node to get the mortar lines.
Lastly, I’ll use a node created by another artist on the team. It’s a rotational warp based on a grayscale input. I’ll feed my mortar lines into it along with a grayscale pattern of the tiles. This will give me the look of independent mortar lines per tile.
Before moving on, I repeat that whole process with a new random set of tiles. I’ll be using this as a sort of random grout that goes between tiles.
As a quick trick to get a mask for that random set of grout lines, I ended up feeding the Quantize Grayscale map from earlier into a rotational warp node. Using a Histogram Scan, I can make a mask that blends between the original set of grout lines and the new, more random set.
Now that I have a black and white outline of my grout, I can start to generate more information that will be used down the line. First up is a set of individual grayscale values per grout line. Using two Flood Fill nodes (one for the regular tiles, the other for the random set, blended together at the end), I’ll generate those random grayscale values and use a Distance node to fill the gaps between them.
Now that I have all the information I need, I can start making the height look good! I need to give some randomness to the thickness of the grout lines. Using the random grayscale colors from earlier, I’ll run them through a Levels and a Blur, then feed the result into a Slope Blur with the original height. The random grayscale values will act as a sort of per-line strength modifier for the Slope Blur node.
Almost done with the initial height! The next few steps are to blend the random grayscale values (from the Distance node) back into our black and white lines to give them some vertical variance, and then finish off by taking a random grayscale selection of our tiles and multiplying it into our height map. This gives the appearance of individual zones of grout lines instead of random ones across the whole surface.
The initial height map is now completed. Now to start roughing it up.
I’ll start by chipping away at the surface. In this pass I use a Crystal 1 node multiplied with a blurred Clouds noise, which acts as a breakup for the crystal. Using a Histogram Scan, I’ll force the previous result to be nearly pure black and white, which will then be subtracted from the grout height.
I’ll do another destruction pass but this time with larger shapes. This one is very similar except for one neat difference: Instead of just subtracting the resulting shapes away from the grouting, I’ll instead warp the subtracting shapes using an inverted height map. This will cause the shape to shift along with the height map, so when it’s finally subtracted it’ll fit more naturally into the surface.
Next, I do a similar pass but this time additive, to create a smear-like effect. To get the smear between two grout lines, I use another distance node on the current height to fill the gaps between grout lines. Using this new map, I’ll blend it into the previous height with the help of more noise marks acting as the opacity. Like before, this is done in both small and large sizes.
I’ve found so many uses for the following technique. In this case, I’m using it to create large chunks removed along the edges. This simulates the effect of the tiles being torn or pulled up from the surface, taking large sections where they met. This was done by taking the per-tile gradient information from earlier, passing it into a Highpass Grayscale, blending it with a noise, and then using a Histogram Scan to isolate the information to the edges of the highpass.
I’ll finish off the destruction pass by adding a layer of cracking across the surface. Getting good random cracks that adhere to the surface correctly can be difficult. This technique is one of my favorites: I start off with a noise, in this case Clouds 02, with a little bit of the gradient tile information blended in to give some direction to the cracks. I then use a Slope Blur Grayscale node along with two Blur HQ passes of the noise, where the top input uses a lower blur amount than the bottom input. Then I max out the values with an Auto Levels, follow with an HBAO and a Highpass Grayscale to even out the values, then an Invert Grayscale, and finally a Histogram Scan to isolate the cracks. After this, I pass it through a few warps and noises to really sell it!
Destruction completed! Now onto the medium to micro details.
The last few passes are pretty simple. I’ve added several passes of low-level noise across the surface, keeping in mind to break up that noise with either the height map or other existing information to keep it from looking too consistent. Last but not least is a debris pass with tiny pebbles and wood-chip-looking things: elements that help sell the environment this material has been in.
Final height map.
Onto the color! Color was the hardest thing for me to learn. The first thing I tell new Substance Designer artists is to not rely on a “one gradient to rule them all” mentality. I treat it like I treated Photoshop texturing, using multiple passes to build up a surface rather than trying to derive a single gradient from the resulting height map. I’ll start off by getting my random grayscale values, passing them into a Gradient Map, and mapping similar-value colors to those random grayscale values. In this same pass I’m also blurring my Gradient Map result, using my additive smears as a mask to blend between the blurred and unblurred Gradient Map. This gives the smear areas a washed-out, uniform color compared to the rest of the surface.
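The Gradient Map lookup Andrew keeps coming back to can be pictured as a simple interpolation between color keys. This toy helper is only an illustration of the concept, not a Substance API:

```python
# Toy version of a Gradient Map lookup: map a grayscale value to a color
# by linearly interpolating between sorted (position, color) keys.
def gradient_map(value, keys):
    """keys: sorted list of (position, (r, g, b)) with positions in 0..1."""
    if value <= keys[0][0]:
        return keys[0][1]
    for (p0, c0), (p1, c1) in zip(keys, keys[1:]):
        if value <= p1:
            t = (value - p0) / (p1 - p0)
            return tuple(a + (b - a) * t for a, b in zip(c0, c1))
    return keys[-1][1]

# e.g. map the random per-tile grayscale values to similar-value mortar
# colors (example key colors are invented):
mortar_keys = [(0.0, (0.55, 0.52, 0.48)), (1.0, (0.70, 0.67, 0.62))]
```

Keeping the key colors close in value, as Andrew suggests, is what avoids the harsh banding a naive height-to-color gradient produces.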
I actually don’t touch the color lane for the first half of the height creation. I come back to the color after I have started work on the micro noise/detail. At this point, I take my resulting noise and start creating new gradient nodes to blend those noises in as colors. Of these three blends, two are separate noise passes, while the other is a new noise for varying my hue and value.
To get that surface hue and value variation, I’ll start by using a noise and mapping that to the set of values and colors I want. Normally, the end result of that process would leave you with gradient node ‘rings’, as I like to think of them, around your input noise. Instead, I take a Non Uniform Blur Color and blur the ‘ringed’ gradient in against the original noise to help soften out those rings. Finally, I use a directional warp node with the height to warp my new color map in against the height to help it settle naturally once it’s blended back into the color chain.
Base grouting color.
Next is the substrate pass: the stuff that’s been revealed underneath the grouting. Again, I have multiple blends using the information I already had from my micro detail/noise passes. I’ll take that information, use them in a gradient along with that same noise as a mask to blend it into the color chain. The technique here that I love, and use in nearly every graph, is getting the difference between two points in a graph and using that as a mask.
Getting the difference between two nodes gives you the “changed” result: anything that has changed between the first node and the last node in the series will show up in the final result, including its relative strength. In this example, I’m using Max Lighten blending to mix a granular noise into a broader noise. The information I’d like to get is the areas where the granular noise manages to poke through the broader noise, due to how Max Lighten works. Getting the difference between the start and the finish results in the change between the two, giving me a mask of only the changed areas.
For a practical example, I’m using a Max Lighten to blend the small pebbles onto the surface of my substrate. This means that there will be areas where the pebbles have their strengths reduced depending on the surface below: if a pebble lies on top of an area that is a greater height than it, it will be cut off. When I go to create color for the pebbles, if I were to use the original generator as a mask, I’d have an inaccurate mask. If I use the difference mask instead, it will be accurate to the height result.
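The difference-mask trick is easy to demonstrate on tiny grayscale “images” (lists of floats in 0..1, standing in for the Blend and Difference nodes):

```python
# Minimal sketch of the difference-mask technique described above.
def max_lighten(a, b):
    """Per-pixel Max Lighten blend of two grayscale images."""
    return [max(x, y) for x, y in zip(a, b)]

def difference_mask(before, after):
    """Mask of where, and how strongly, the chain changed the image."""
    return [abs(x - y) for x, y in zip(before, after)]

broad = [0.75, 0.25, 0.5, 1.0]    # broad noise (e.g. substrate height)
pebbles = [0.25, 0.75, 0.5, 0.0]  # granular noise (the pebbles)

blended = max_lighten(broad, pebbles)
mask = difference_mask(broad, blended)
# mask == [0.0, 0.5, 0.0, 0.0]: non-zero only where the pebbles poked
# through the broad noise, with strength matching how far they poked.
```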
Now that the substrate is done, I can move on to more minor details like pebbles and a final dirt pass.
Next is a color pass for our pebbles and wood chips. These are done simply by grabbing the difference mask of our generators and the height lane, feeding those into a Flood Fill to get random grayscale colors, and then feeding that into a Gradient Map to map the required colors to them.
Last but not least, a dirt pass. I’ll feed the resulting normal map into a Facing Normal node to get a surface mask. Using a grungy noise, I’ll multiply that into the facing normal to get my final dirt mask. We finish off with a touch of curvature overlaid onto the surface, and we’re done with the color! The curvature sharpens and accents all of our height details in the color.
The roughness will be pretty quick and simple. As I’ve gone through both the Height and the Color, I have a ton of information to work off of, and little I need to actually create uniquely for the roughness. At the start, I have a mask for both the grouting and the substrate. Using these masks, I’ll take the relevant information for each, like the noise passes and damage passes, and blend them into each other.
Going forward, I’ll do the same thing I did in the previous step. I’m getting the relevant information for a few more unique noise passes, as well as the pebbles and wood chippings, adding or subtracting them from the roughness lane to get shinier or duller features.
Lastly, we add our dirt pass.
And we’re done!
Substance Painter Sidekick Creation
Jon: Substance Painter Sidekick is a standalone Python tool we created to streamline our proprietary workflow with Substance Painter. We had the following goals:
- Mesh setup: Allow artists to import into Substance Painter the same .fbx file they import into the game engine.
- Substance Painter scene setup: Quickly populate a Substance Painter file with a custom stack, custom shader, and mesh based on a couple options chosen by the artist.
- Texture exporting: Quickly export textures into the engine with a specific naming convention. Additionally, allow artists to automatically composite multiple Substance Painter texture sets together.
The Sidekick UI has two tabs: the first handles mesh and Substance Painter scene setup, and the second handles texture exporting.
Early on in production, we realized artists were exporting separate .fbx files for Substance Painter rather than using the ones they imported into the engine. Typically they did this to remove certain parts of the file, such as collision meshes. They would also swap or copy UV channels so the flattened mapping would be on UV1 for Substance Painter. We wanted to eliminate this extra step, so we utilized the Python FBX SDK (https://www.autodesk.com/developer-network/platform-technologies/fbx-sdk-2019-0).
The FBX SDK allowed us to remove meshes, manipulate UVs, and change material IDs based on artist input. From there we generated a “hidden” FBX file that would get sent to Substance Painter and deleted after the Substance Painter scene was successfully generated.
Painter scene setup
We utilize a lot of custom shaders in The Division 2 to give assets a very specific look. For instance, our “rust dirt scratch” shader takes in a simple RGB mask where each channel defines where rust, dirt, or scratches will appear on a material surface. This achieves multiple goals: It reduces texture memory, creates a cohesive look across the environment, and allows the artists to work quickly because the shader is doing some of the heavy lifting to create a refined look. In the first The Division game, artists would have to paint these masks in Substance Painter without knowing the final look until exporting the mask to engine. Our goal was to allow artists to see the final result of the custom shader while they painted the mask within Substance Painter.
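Conceptually, the shader side of that RGB mask might look like the following sketch. The layer order and the linear blending are assumptions for illustration, not the actual Division 2 shader:

```python
# Illustrative sketch of how a "rust dirt scratch" style shader could
# consume its RGB mask: each channel weights one overlay layer.
def lerp(a, b, t):
    """Linear interpolation between two (r, g, b) colors."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def apply_rds_mask(base, rust, dirt, scratch, mask):
    """base/rust/dirt/scratch: (r, g, b) albedos; mask: (r, g, b) weights."""
    r, g, b = mask
    out = lerp(base, rust, r)     # red channel drives rust
    out = lerp(out, dirt, g)      # green channel drives dirt
    out = lerp(out, scratch, b)   # blue channel drives scratches
    return out
```

The memory win in the real shader comes from the rust, dirt, and scratch layers being shared across many assets, so only the one RGB mask is unique per material.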
Because we wanted Sidekick to be a standalone tool, we needed to find a way to communicate with Substance Painter through Python. We did this by opening a JSON port in a Substance Painter plugin and communicating with that port by posting requests through an http connection. More information on the setup can be found in a forum post by “Froyok” here: https://forum.substance3d.com/index.php?topic=14253.0.
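A minimal sketch of the client side of that setup follows. The port number, endpoint, and payload shape here are assumptions based on the general approach, not Red Storm's actual tool:

```python
# Hypothetical sketch: send a snippet of Painter's scripting API to the
# JSON port opened by a Substance Painter plugin, over plain HTTP.
import json
import urllib.request

PAINTER_URL = "http://localhost:60041/run"  # port and endpoint assumed

def build_command(js_source: str) -> bytes:
    """Wrap a script snippet in the JSON payload the plugin expects
    (the payload shape is an assumption for this sketch)."""
    return json.dumps({"js": js_source}).encode("utf-8")

def send_command(js_source: str) -> str:
    """POST the command to the running Painter instance and return the reply."""
    req = urllib.request.Request(
        PAINTER_URL,
        data=build_command(js_source),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")
```

With Painter running and the plugin listening, a call like `send_command('alg.log.info("hello from Sidekick")')` would execute inside Painter.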
After establishing how we would communicate with Substance Painter through our standalone tool, we wrote a utilities class in Python that sent commands to Substance Painter using the Substance Painter plugin API. We wanted to be able to create new scenes in Substance Painter with our custom mesh and shader. This was easy enough using templates; however, we wanted to take it one step further and pre-populate the layer stack with an easy-to-understand structure for the artist. Templates do not support stack manipulation, but we found we could call ‘alg.project.create()’ and pass it a Substance Painter scene file as a parameter instead of a template to retain the information in the stack.
For texture exporting, our main goal was to allow artists to easily combine textures from multiple texture sets in Substance Painter. To accomplish this, we simply exported each texture set and its associated textures in a file format that supported alpha, such as .png. From there we composited the different textures together using the Python PIL image manipulation library.
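The production tool composited with PIL; as an illustration of the underlying “over” operation it relies on, here is a dependency-free sketch on single RGBA pixels (0 to 255 channels):

```python
# Pure-Python sketch of alpha "over" compositing, the operation used to
# flatten multiple exported texture sets into one image.
def over(fg, bg):
    """Composite a foreground RGBA pixel over a background RGBA pixel."""
    fr, fgn, fb, fa = [c / 255.0 for c in fg]
    br, bgn, bb, ba = [c / 255.0 for c in bg]
    out_a = fa + ba * (1.0 - fa)
    if out_a == 0.0:
        return (0, 0, 0, 0)
    def blend(f, b):
        return (f * fa + b * ba * (1.0 - fa)) / out_a
    return tuple(round(c * 255) for c in
                 (blend(fr, br), blend(fgn, bgn), blend(fb, bb), out_a))

def composite(fg_pixels, bg_pixels):
    """Composite two same-sized flattened 'texture sets' pixel by pixel."""
    return [over(f, b) for f, b in zip(fg_pixels, bg_pixels)]
```

In practice, a library call such as Pillow's `Image.alpha_composite` performs this same math across whole images far faster than a per-pixel loop.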
All in all, I would say the tool was quite successful. Artists felt like it improved their workflows, and, hopefully, that shows in the final product.
Are there any tips or tricks in Substance you can share with the community?
Andrew: Layer, layer, layer! One of the most difficult parts for me when it came to learning Substance Designer was the albedo. I kept trying to find the one-stop solution for creating the color, using something like the Gradient Map node to assign color values to height information using an image. This works maybe 1% of the time, and you’ll get an amazing result! But the other 99% of the time you’re just spinning your wheels trying to get it to work. The best solution I found for myself was to treat it like Photoshop: just keep layering on more and more color using a variety of masks and height information.
Also, create a scribbles folder. So many times I’ll be working on a graph and accidentally find a really cool combination of nodes. When I find these, I’ll save that portion of the graph to a separate SBS and put it in a scribbles folder so I don’t forget how I did that cool thing. I’ve had a number of cases now where I’ve managed to either solve a problem or come up with a unique look by going back and looking through my scribbles folder.
Any final words you would like to add?
Andrew: Not much, other than: keep on being an awesome community! I love it when I see artists putting out their full SBS files for anyone to download. It’s so helpful to be able to go through other artists’ graphs and figure out how they did something. I’d highly recommend it.
Jon: For all the tech artists out there: If you haven’t tried the Substance apps, you definitely should. The node-based workflow in Substance Designer is really intuitive, and some of the more technical nodes, like the pixel processor, let you manipulate textures like you would in a shader. Creating textures was never a strong suit of mine, but Substance Designer and Substance Painter have definitely changed that. I encourage everyone to dive in!
All images courtesy of Red Storm Entertainment.