Real-Time Ray Tracing is the Future: NVIDIA’s Project Sol Textured with Substance

    With over one million views on YouTube, NVIDIA’s Project Sol has received high praise from the 3D community and highlights what is now achievable with real-time ray tracing techniques and the power of NVIDIA RTX. Here, we talk with the talented artists from NVIDIA who worked on Project Sol, as they explain what this demo shows and how they used Substance in the process, including a complete video breakdown on texturing one of the assets in Substance Painter.

    Introductions

    Gavriil: Hi, my name is Gavriil Klimov and I’m a hard surface designer and art director. I’ve been working in the industry for about 10 years, in roles ranging from movies and cinematics to product design.

    Jacob: Hi everyone, my name is Jacob Norris. I’m an environment artist who has been working in games and 3D graphics for almost 10 years. Some people may also know me from my work at PurePolygons, where I create tutorials and 3D assets that I sell and share with the game community.

    Gavriil: I came to NVIDIA as I’d been freelancing for them for part of 2014, and most of 2015. I ultimately joined full-time in 2015. Shortly thereafter I was tasked to assemble an art team that would be responsible for creating original NVIDIA content. Project Sol is one of our early products. Expect more in the future 😉

    Jacob: In terms of my background, I’ve worked for many AAA developers on some really fun and successful game titles, gaining as much experience and knowledge as I could along the way from some incredibly talented co-workers. I also managed to take on additional work in terms of freelancing, personal projects, and mentorship opportunities whenever possible. NVIDIA originally started as a freelance opportunity for me in 2016, and eventually became a full-time job at the end of 2016. I’ve been here full-time ever since, and I’m very happy with the strong team we’ve formed, and the future projects we’ll be working on together.

    Our Roles on Project Sol

    Gavriil: I’m an art director, and this was my role on Project Sol as well. When the project was in its early stages, I spent some time doing the initial concept designs to kickstart the overall look and feel of the cinematic. I did some 3D sketches for the interior room as well as some sketches for the servo arms. At that point, the rest of the team started to work on the design as well. Eventually, I ended up designing the whole interior room where the assembly takes place.

    I always want to make sure that we’re all working as a team. I strive for overall design cohesiveness and clarity. This applies to all the stages, design-wise, but also everywhere else in the pipeline such as for textures, VFX and lights. Everything flows and connects together; if something goes through a certain change in the pipeline, sometimes you need to go back to fix its original core to make sure the updates reflect the new intent. I closely followed the development of the entire cinematic in all of its aspects and the team did a fantastic job, particularly considering the tight deadline we had. Everyone brought their ‘A’ game.

    Jacob: I had to wear many different hats while working on Sol, because we were still just a small team of artists at NVIDIA working on this project. I worked with Gavriil, converting and optimizing his high-poly modeling work, as well as the high-poly environment work from Gregor Kopka and Ilya Shelementsev, into the in-game assets used in the final cinematic. Early on in the project, I began setting up the pipeline for creating all of the smart materials and shaders used on the assets in Substance Painter.

    The lighting, post-process setup, and asset placement within the level were also my responsibility on the project. We have a very open policy in terms of feedback, so it’s great when someone has comments on my environment work or shares thoughts with other team members; I feel it’s always the best way to get better as an artist. Sometimes Gavriil or I would also give feedback to Fred Hooper to help define the look of the VFX within the environment, which turned out awesome. And if you haven’t seen Fred’s work before, I’m sure you actually have and just didn’t know it, haha. David Lesperance joined the project midway through. He was a huge asset in bringing the scene to a final state and texturing the assets with me in Substance Painter, along with many other contributions within UE4: optimizations, modeling, and polish on the final artwork in the video. I contributed a small amount of camera work to the project, but most of that was completed by the very talented animator Brian Robison, whose contributions were invaluable to Project Sol. Lastly, I completed a per-shot lighting pass to make the visuals shine as much as possible and really showcase RTX.

    Acclaim From the Community

    Gavriil: I thought, and hoped, the cinematic would be warmly received due to the huge leap forward in graphics that’s presented. Basically, all the ray-traced content out there is rendered offline right now. Sol runs in real-time while being fully ray-traced. Lights. Reflections. AO. GI. Everything is ray-traced.

    Turing represents the biggest shift in graphics technology we’ve seen in a long time, possibly as big as the arrival of the GeForce 3 architecture back in the day. I am extremely excited for fellow developers who will finally get to use this technology in their products. The future is really bright.

    Jacob: Real-time ray tracing is the future. I’ve been dreaming of this ever since I started working in 3D. The advantages it offers for realism, without the hassle of all the “fakes and cheats” we currently use in raster rendering, will be a lifesaver. I can also imagine some in-game uses for it – such as mirror “accessories” that you can throw on the ground to see your enemy around corners, or the change in shadow sharpness as an enemy drops in from above: a soft shadow becomes sharper as he gets nearer to the ground, indicating his distance from you. I think there will be a ton of gameplay uses for it if people are creative about it.
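    The shadow-sharpness cue Jacob describes falls out of simple similar-triangle geometry: for an area light of size L, the penumbra (soft edge) of a shadow widens roughly in proportion to how far the receiving surface is behind the occluder. A minimal sketch of that relationship (the function name and the distances are illustrative, not taken from the demo):

```python
def penumbra_width(light_size: float, light_to_occluder: float,
                   occluder_to_receiver: float) -> float:
    """Approximate penumbra width via similar triangles: an area light
    of size `light_size` casts a soft shadow edge that widens linearly
    with the occluder-to-receiver distance."""
    return light_size * occluder_to_receiver / light_to_occluder

# An enemy dropping in from above: while still high off the floor the
# shadow edge is wide (soft); near the floor it tightens toward a hard edge.
high = penumbra_width(0.5, 2.0, 3.0)  # occluder still 3 m above the floor
low = penumbra_width(0.5, 4.9, 0.1)   # occluder 0.1 m above the floor
assert high > low  # the shadow sharpens as the gap closes
```

Ray tracing gets this distance-dependent softening for free by sampling the area light, whereas rasterized shadow maps have to approximate it.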

    So, because of all this, I feel like it’s just so exciting for people to witness anything using the technology right now, and to learn more about it. And of course we all feel the demo came out really nicely; I think people appreciate the time we put into it, as well the way we showed off the technology as best we could. Hopefully, we’ll get the chance to make more things like this in the future, to continue sharing the major advantages and realism that using RTX can help achieve!

    What Does the Demo Show?

    Gavriil: The purpose of the demo was first and foremost to show real-time ray tracing. As mentioned, everything is ray-traced. Lights, AO and so on. So our focus was to showcase something beautiful while also showcasing the limits of the current rasterization techniques. We have a lot of shots in the cinematic which would be flat-out impossible without ray tracing, and others that look a lot better, and much more accurate, because of it.

    A good example is moving objects off-camera reflecting onto in-camera objects. This happens everywhere throughout the cinematic. For instance, in almost every shot you can see the off-camera yellow robot arms reflected on the main character, constantly changing position as they move. Cubemaps don’t allow you to do that.

    Another good example is when the ejection pod opens: we see it from a reverse angle looking at the character’s helmet, and we can see the door open and move as a reflection on the helmet, something which isn’t possible without ray tracing. Even reflections of objects that are on screen are much more accurate with RTX. Another great example is the laser shot that welds the character’s leg: it’s a moving reflection on an irregular curved surface, only possible with RTX. Working with ray tracing is also friendlier for artists. You don’t have to spend time setting up fake light sources, sphere reflection captures, cubemaps, and so on. The lighting just works as it’s supposed to, bouncing around like in real life.
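    The cubemap limitation described above comes down to what each technique does with the mirror direction R = D - 2(D·N)N: a cubemap only uses R to look up a pre-baked image captured at one fixed point in space and time, while a ray tracer traces R against the live scene, so moving off-camera geometry shows up correctly. A minimal sketch of the shared first step, the reflection vector itself (illustrative only, not the demo’s code):

```python
def reflect(d, n):
    """Mirror direction r = d - 2*(d.n)*n for an incoming direction d
    and a unit surface normal n (both 3-vectors as tuples)."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

# A view ray hitting an upward-facing floor (normal +Y) bounces straight up.
r = reflect((0.0, -1.0, 0.0), (0.0, 1.0, 0.0))
assert r == (0.0, 1.0, 0.0)
# A cubemap would use r only as a lookup into a static pre-rendered image;
# a ray tracer intersects r against the current scene geometry, which is
# why moving off-camera objects (like the robot arms) appear correctly.
```

The design difference is entirely in what happens after this computation: lookup into stale data versus intersection with live geometry.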

    In conclusion, Project Sol showcases real-time ray tracing, the holy grail of graphics. This is now possible thanks to the NVIDIA Turing architecture and the NVIDIA RTX platform. The entire cinematic runs in real time while being fully ray-traced on a single NVIDIA RTX 2080 Ti.


    Our Use of Substance

    Jacob: Substance was used across the entire project. The environment is a mix of tiling textures and custom-painted textures, made with both Substance Designer and Substance Painter. Substance Designer helped create a lot of the base materials: metals, rubbers, grates, drains, paints, etc. Those initial base materials were brought into the engine as simple tiling materials on assets further from the camera, while many of the hero assets and objects seen up close had custom one-to-one UV mapping and texturing. This demo definitely pushes the boundaries of what’s possible in real time, with a ton of 4K textures as well as some assets that could almost be considered “high-poly” in terms of tri-count.

    Substance definitely helped us achieve such a high-quality look very quickly, and gave us the ability to adjust to a new art direction when necessary. One example: initially, many of our materials and textures carried a lot more wear and grunge across the assets. Because this was an RTX showcase, we ultimately decided that many of the materials had to be cleaner. The cleaner surfaces gave the assets a higher-resolution look in close-ups and showed off the ray-traced reflections and shadows much better.

    The 3D Team

    Gavriil: We have a full-time team of 6 people, and we also used some external freelancers for additional help. We completed the whole cinematic from start to finish in three-and-a-half months. It was a hard task with such a small team, but luckily we have a lot of firepower!

    Jacob: As Gavriil mentioned, we had a full team working hard to make this cinematic a reality, although in terms of Substance users, only David Lesperance and myself worked on the textures throughout the demo. It was a huge team effort and I’m really happy to see the final result. The smart materials and non-destructive workflow within Substance helped us make major changes to the project when they came up, and to texture the entire environment and characters within about one to two weeks.

    Substance Painter Video Breakdown

    Times Square

    Gavriil: RTX is a huge leap forward for graphics. It’s the future, and one of the biggest advancements that has occurred in a while. We’re really proud of this accomplishment and we wanted to showcase it in as many places as possible. We decided to be bold. The Times Square footage was part of our RTX launch campaign. It was truly amazing to see our work in Times Square.

    Substance on our Future Projects

    Jacob: We’d definitely like to expand our Substance Designer material library and base material sets. Increasing the number of surface types we have, options for control, and quick adjustments inside of Substance Painter will help us to keep a consistently high-quality look across our projects, while allowing us the speed and flexibility to knock these out faster and faster! I can’t wait to see what other new tools and tricks will be released by the Allegorithmic team as we continue forward with this pipeline.

    All images courtesy of NVIDIA

    Want to see more? Check out the artists’ Artstation pages!

    Gavriil Klimov, Jacob Norris, David Lesperance, Gregor Kopka, Ilya Shelementsev, Fred Hooper.
