When we launched this project, the only thing we knew was that we wanted to highlight the use of Substance in industrial design. We needed to figure out everything else. I had to answer a foundational question: what message did I want to convey through this project, and how could I do it?
When I need to crystallize an idea, I always begin with a piece of paper. A brainstorm session with post-its helped me decide to focus on the relationship between humans and technology.
I believe that connected devices change the way people think, and that both feelings and emotions can be influenced by technology. I wanted to turn the tide and push those feelings and emotions to the foreground of my design.
The association of feelings, emotions, and textures was obvious to me, because they are the principal link between Substance Painter and the artist.
With all those notions, I developed a brief to demonstrate Substance Painter’s power through an intuitive and non-intrusive product. I was highly inspired by raw materials and non-linear shapes because they symbolize nature and are the perfect illustration of what I had in mind. I spent several days sketching, going in many different directions, each with completely different ideas.
Sketches and NURBS
At the very beginning, I wanted a product with many natural materials to show the ability to make complex textures inside Substance Painter. After hours of sketching, I decided to refocus my attention onto something more essential: the relationship between the product and its environment. That’s how the watch concept and the exploration of textures changing in real time were born.
I ended up doing a lot of back-and-forth between sketching and modeling. I love drawing, and using NURBS or polygons to validate the true impact of a shape in 3D space is one of the strengths of sketch modeling.
My main 3D modeling tool is Autodesk Alias. NURBS is an obvious choice for me, because it allows me to validate or eliminate ideas very quickly, while keeping beautiful curvatures and reflections. I can easily play with curves and surfaces to create what I have in mind. It also allows me to easily communicate with industrial designers, exchange CAD data, and work with others all along the design process.
My goal was to keep the model as clean and as simple as possible, because I knew I wanted to create the details later in the textures.
Once I had a clean 3D model of the watch case, I switched to Autodesk Maya to make the watch bands. I always prefer using polygons for soft elements because I don’t need beautiful reflections, and it’s easier to texture, distort or animate that way.
The leather band was more complicated to build than the fabric one because I wanted to be able to deform it. To do that, I used a Non-linear Bend Deformer in Maya.
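For reference, a non-linear bend of this kind can be set up in a few lines with Maya’s Python API. This is a minimal sketch only: the mesh name and curvature values are placeholders, not the actual scene data.

```python
# Minimal sketch of a non-linear Bend deformer setup in Maya (runs inside Maya only).
# The mesh name and curvature value are hypothetical placeholders.
import maya.cmds as cmds

band = 'leatherBand_geo'  # hypothetical mesh name

# nonLinear returns [deformerNode, handleTransform]
bend, bend_handle = cmds.nonLinear(band, type='bend', lowBound=-1, highBound=1, curvature=0)

# Adjust (or animate) the curvature to wrap the band around a wrist
cmds.setAttr(bend + '.curvature', 90)

# Orient the deformer handle so the bend axis runs along the band's length
cmds.rotate(0, 0, 90, bend_handle)
```

Because the deformer is non-destructive, the flat band geometry stays intact and the curvature can be dialed in per shot.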
Once the two bands’ models were complete, I went back into Autodesk Alias to model the band clasp. I imported the Maya-created leather band into Alias to ensure a perfect match of the two elements.
File preparation and UV unwrapping
Working with NURBS can be very useful — but also very frustrating, especially when it’s time to export for CGI because of UV unwrapping. I didn’t find the perfect way to get it done. In any case, the very first step had to be this: orient the normals and stitch my 3D model in Alias in order to convert it to a clean polygon mesh, which would let me make my UVs in Maya properly.
One of the potential solutions would be to use the NURBS to Mesh tool, then export an .fbx from Alias to Maya, but this method didn’t seem to work properly. When the model comes from NURBS, Maya keeps shells when unfolding complex meshes or, even worse, it completely breaks the mesh when trying to fix non-manifold errors.
Since I didn’t want to use texture projection, clean UVs were essential. After running into many problems (shells ending up in different UV spaces, edges that couldn’t be sewn, non-manifold geometry, and meshes that exploded when I tried to fix them), I decided to go for another option.
The best solution I found was to make a bridge using Autodesk VRED to properly unfold my UVs. VRED has a really nice UV editor that allows artists to manually cut UVs and unfold a NURBS-generated geometry to prepare the mesh in order to be processed in Maya.
After exporting the watch case as an .fbx file, I went back to Maya where I was able to properly refine the model’s UVs. If I had been using only one UV space, I would have been disappointed by the quality of texture details, so I ended up making UDIMs (multiple UVs) to get a higher pixel density.
Substance Painter now allows users to paint across UDIMs, so that was one more reason to go beyond one UV space.
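As a quick aside, the UDIM convention simply encodes each UV tile’s (u, v) offset in a four-digit number starting at 1001, ten tiles per row. This small helper is purely illustrative and not part of any of the tools mentioned above:

```python
def udim_to_tile(udim):
    """Convert a UDIM number (1001, 1002, ...) to its (u, v) tile offsets."""
    index = udim - 1001
    return index % 10, index // 10

def tile_to_udim(u, v):
    """Convert (u, v) tile offsets back to a UDIM number (u must be 0-9)."""
    return 1001 + u + 10 * v

# 1001 is tile (0, 0); 1002 is (1, 0); 1011 starts the second row at (0, 1)
print(udim_to_tile(1001))  # (0, 0)
print(udim_to_tile(1011))  # (0, 1)
print(tile_to_udim(3, 2))  # 1024
```

Each tile carries its own full-resolution texture, which is where the extra pixel density comes from.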
Watch body texturing
With my 3D model now ready to be textured, I exported each element from Maya into multiple .fbx files. Then I imported them in different Substance Painter files, because my graphics card (GTX 980) was not powerful enough to support the entire model in a single file, especially if I wanted to work in 4K resolution.
Thankfully, during the process, the NVIDIA team let me use a Quadro RTX 6000 card. Thanks to this powerful graphics card, I exported and reimported the entire watch into a single Substance Painter file. It made it way easier to achieve the full texturing.
I worked with a PBR – Specular Glossiness workflow because I was most comfortable this way. I began with the body of the watch. The challenge was to add details only using the normal and displacement maps, and the shader settings helped me by letting me visualize in real time how the height map influenced the 3D model.
Every detail was made using a non-destructive workflow: first, I created a fill layer related to the principal material, then I added a mask on which I painted a pattern, and finally I added effects like blurs and fills. This method allowed me to go back to the initial material and modify textures and colors without having to erase everything and start over.
It was very easy to place text and logos using masks and the alpha channel. I could also get a great real-time preview of how they would look in my final renders.
Leather band texturing
Both watch bands were a bit more complicated to texture. For the leather version I wanted a detailed and realistic leather adapted to close-up views. I used materials from Substance Source: a brown Leather Turned material for the interior and a red Bull Leather Medium Grain material for the exterior.
I also added text and bumps as finishing touches.
That’s one of the areas where working with UV tiles paid off: UDIMs gave me a high pixel density and helped me get details that I couldn’t have had with only one UV set.
What’s more, in Substance Painter, you can paint across UDIMs: this function helped me a lot to create realistic stitching without any previous modeling and with a perfect continuity.
Fabric band texturing
The loop band texturing was one of the most difficult things I had to do in this project. The gigantic Substance Source library helped me a lot in finding and making the exact fabric texture I was looking for, but it took me hours to achieve something I was happy with. From a distance, the textures were fine, but for close-up views I always needed more and more detail.
Finally, after playing with UVs and trying many different textures, I found what I was looking for.
I chose to use the Wool Woven Fabric texture from Substance Source, but I also added fur using XGen in Maya to give it that extra realistic touch.
My watch was very simple and pure, so in contrast, I decided to push the environment further: I wanted to surround the object with raw nature references. I went back to Substance every time I had a new idea or something new to texture.
Sometimes, I needed to manually modify textures in Maya to exactly match what I had in mind (for instance, to add specific refraction, emission, displacement animation, and more).
In this project, I exported the textures from Substance Painter into a 4K resolution in 16-bit .pngs, because 8 bits weren’t enough to get a good level of quality, especially for height maps used as displacement.
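The arithmetic behind that choice is simple: an 8-bit channel has only 256 levels, while 16 bits gives 65,536, so the quantization step of an 8-bit height map is coarse enough to cause visible stepping in the displacement. The 5 mm displacement range below is an illustrative assumption, not a value from the project:

```python
# Quantization step for a displacement map stored at a given bit depth.
# The 5 mm displacement range is an illustrative assumption, not project data.
def quantization_step(displacement_range_mm, bits):
    levels = 2 ** bits
    return displacement_range_mm / (levels - 1)

print(quantization_step(5.0, 8))   # ~0.0196 mm per step -> visible banding
print(quantization_step(5.0, 16))  # ~0.000076 mm per step -> smooth
```

Color and roughness maps tolerate 8 bits much better because their values are not turned into geometry.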
When you export, Substance Painter blends height information into the normal map. To avoid this, I had to group my normal information in order to hide it, and then export normal and height maps separately.
Back in Maya, I imported maps from Substance Painter. Even if there’s a really nice plugin to do it, I preferred not to use it because I wanted to set up everything manually. That way, I could control and refine each texture if I wanted to.
Since I’d been working with a UDIM workflow, I exported the diffuse, specular, glossiness, and normal maps in several parts; that’s why, once back in Maya, I had to set the UV Tiling Mode of my file nodes to UDIM. For the height maps, I created VRayDisplacement nodes and played with subdivisions and quality until I had a good result.
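Pointing a Maya file node at a UDIM set can also be scripted. The sketch below only runs inside Maya, and the node and file names are placeholders rather than the actual project files:

```python
# Minimal sketch: a file texture node reading a UDIM texture set in Maya.
# Node name and texture path are hypothetical placeholders.
import maya.cmds as cmds

file_node = cmds.shadingNode('file', asTexture=True, name='diffuse_file')

# Point the node at the first tile; Maya substitutes 1002, 1003, ... automatically
cmds.setAttr(file_node + '.fileTextureName', 'watch_diffuse.1001.png', type='string')

# uvTilingMode 3 = UDIM (Mari) convention
cmds.setAttr(file_node + '.uvTilingMode', 3)
```

With tiling mode enabled, one file node covers every tile exported from Substance Painter instead of one node per map part.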
I often use a VRaySun, but I can use other types of lighting as well: it depends on the feeling I’m trying to convey in the final render. If I want natural lighting, I only use 3 or 4 lights, but if I want complex studio lighting, I use the Maya Light Linking editor.
I also love to use a VRayLightDome to add details and natural light impacts on the object, as well as a VRaySky.
To simulate natural light reflections, I played with ramps and noises directly plugged into the lights’ textures.
My settings for still images and for animation varied depending on the scene. I rendered still images at a resolution of 4096×2160 and animation frames at 1920×1080.
With NVIDIA’s support, I was able to use the V-Ray GPU RTX engine to render most of my frames. The card’s 24 GB of memory spared me long computations, which saved me a lot of time.
I mostly used the Progressive sampler which let me set a maximum render time for each of my frames.
Even if GPU rendering has its limits, such as displacement support and the handling of heavy scenes, I have to say that the V-Ray GPU rendering engine was really stable and I can’t wait to use it more. It’s a totally different process than with a CPU, but it saved me so much time: days of frame rendering for the animation, easier lighting in IPR, and really fast material previews.
However, even if a good RTX GPU is becoming an essential component for every 3D artist, I think it’s still very important to have a good CPU for high-resolution rendering and heavy scene calculation.
As usual, I used render passes to make the post-production step easier and smoother.
I usually use After Effects for post-production and Premiere Pro for editing, but this project had too many heavy frames. To avoid big file transfers, I decided to stay in After Effects for the whole compositing process.
The first thing I did was import the animation frames into different pre-compositions. Then I combined the V-Ray Render Elements with masks, blending modes, and effects, depending on what I needed for each shot of the project.
Once I was happy with the result, I dragged the pre-compositions into the main composition window to start editing. The animation never stopped evolving throughout the project, and it developed along with its Behance page — more on that later.
I worked on still frame compositing in Photoshop using the same process.
Post-production is a great way to enhance your final result, but it won’t solve problems like poor modeling quality or bad lighting. That’s why I had to be careful all through the process.
Finally, as I love producing music, I designed the video soundtrack using my Native Instruments Komplete audio interface with the PreSonus Studio One 5 sequencer. I also used audio plugins like Absynth and Kontakt, along with virtual instruments. Everything was composed and recorded using a MIDI keyboard.
Behance has always been a great source of inspiration for me because of its many talented artists publishing their works on the platform, but it was also at the heart of the project, from the first ideas to the final result.
The storytelling was created through text, of course, but also through the images. On Behance, I could easily replace images in a very clean and simple interface. For me, it is not only a nice website to show my work, but also a very effective working tool. Behance helped me map out the story of the project: writing down everything I had in mind, putting renders together to see what the final result would look like, and discovering what I could or couldn’t do.
Jean-François will be live with Vincent Gault on March 2, 2021. Check out the livestream here:
Special thanks to Pierre, Marine and Damien from Adobe. Thanks to Antony and Guillaume from NVIDIA for the support, and thanks to Phil from Autodesk.
See more of Jean-François’s art on Instagram and on jf-bozec.com.