Epic Games’ Robo Recall is arguably one of the most fun arcade shooter games for Oculus Rift out there. Besides that, it’s also one of the best looking VR games to date. We interview Edward Quintero, who was responsible for texturing the characters and weapons for the game.
Hi Edward, thanks for taking the time to do this interview! Can you tell us more about your background?
I’ve been working in the visual effects, animation and video game industries for well over 17 years now. I’ve done a little of everything, including surfacing/texture work, matte painting, concept art and, recently, creative direction.
I’ve been lucky to have worked at great companies like Tippett Studios, Industrial Light and Magic (ILM), Dreamworks Animation and Epic Games.
I’ve also had experience as an entrepreneur. Back in 2003, I co-founded a design studio called Massive Black with some friends and most recently, started Mold3D.
How does your VFX background influence your work in the game industry?
I had experience working as an art manager for the games industry in the past, but it wasn’t until recently that I had the opportunity to work in a creative role in games. About a year ago I got a call from an old friend and mentor, Kim Libreri, who is currently CTO at Epic Games. He asked me if I would be interested in coming on board to help out on some of their projects.
The quality of video game art has been kicking some serious butt in the past decade or so, especially with the advent of real-time lighting and PBR workflows. I felt it was a perfect time to try video games, as I could now use an artistic approach similar to the one I’m used to in VFX and animation.
In regards to painting textures, I’d say the biggest difference between games and VFX is the number of maps you need to create in order to “drive” a material shader, telling it what look or effect you want. In film work, you really have to think ahead and be able to visualize the look of the final product in advance, since you only see the final result at render time.
This is a disadvantage in many ways. With programs like Substance Painter, you get a “what you see is what you get” approach which makes creating so much more enjoyable, not to mention more efficient.
What I do credit my film experience for is my approach to painting. I think a program like Substance Painter makes things almost too easy, which can sometimes result in a procedural, uniform look. So I’d like to think that my experience in VFX helps me approach my work more carefully from an aesthetic standpoint, adding the subtlety and realism that I’m used to creating for film assets.
What was your role in the VR game Robo Recall?
I was brought onto the team to help develop the look for the robots. I textured all the in-game characters, weapons, gloves and some props.
How did you start using Substance Painter? How did it change your workflow?
My first time using Substance Painter was actually on Robo Recall, and now I’m hooked! I fell in love with how the app uses generators and smart materials to quickly create wear and tear on the models, and the rest of the tools are amazing as well.
I’d say the biggest change to my workflow was how closely the look of the assets in Painter mirrored what I was seeing in VR using Unreal Engine. The advantage of painting inside a realistic lighting setup is a huge plus!
What this meant was less back and forth trying to nail your look. In VFX you have to rely on your test renders to make sure you are going in the right direction. Substance Painter was able to help me see my results in real time.
Could you describe your workflow with Substance Painter on this project in detail?
Sure. I used Substance for painting, but more importantly, I was able to develop an internal material pipeline to make my work more efficient.
The first step was to develop the look of my first character, which resulted in a library of smart materials that were reused on every asset moving forward.
From then on, it was only a matter of custom edits to make sure the materials were assigned to the correct parts, and to control where grime and scratches were placed.
So developing these smart materials as a first step saved me a ton of time when moving on to new characters, especially since most of the assets shared similar materials.
After painting, I’d save out the textures and import them into Unreal Engine, where they would be assessed in VR using an Oculus headset. This was an amazing experience for me. Something about walking around your asset, in real time and at real scale, is indescribably fun.
Are there any tips or tricks you can share with the community?
I would say the biggest tip is to use paint layers inside your masks to customize the smart masks Substance Painter generates. I see a lot of texture work online where you can easily tell a smart mask was used. The result is a procedural look that is easy to spot.
While smart masks are amazing, they aren’t 100% accurate. So always go back and edit your scratches and grime, or add your own custom work. I use photographs and my own masks in combination with the Painter generated ones, to give me a realistic and custom look.
What is different about creating materials and texturing for VR games?
Aside from what I’ve already mentioned above, I’d say game shaders are not as complicated as what you’d see in a VFX or film pipeline, especially when you start going deep into things like hair or skin. You just don’t have the same level of control yet, as real-time engines have to do a lot of heavy lifting. This means lighter shaders and less resolution in your maps. 2K maps are common in games, whereas in film you can be working with 8K+ resolutions for extreme detailing in close-up work.
What this means to the game artist is less texture map creation and output. In film, you rely more on creating custom alphas and maps that drive and control many material attributes. I spent more time painting black and white maps in film than anything else. The advantage is greater control and authorship over the final look.
In games, you have fewer maps to worry about but you do get the added plus of working in real time and with a “what you see is what you get” approach to painting. This makes it feel more artistic and enjoyable and you worry less about the technical aspects of creating images.
There is a pro and con to each discipline but every year games are getting closer to what we are used to seeing in a VFX pipeline. Without a doubt, real-time technology and creating for real-time engines is the future.
You are co-founder of Mold3D. Tell us more about it!
I started Mold3D about 4 years ago with a friend and colleague, Robert Vignone. Our intent was to create a brand and website focused on art and education. At its inception, we were focused on 3D modeling and 3D printing, but we have recently started to cover emerging technologies and design.
One of the results of creating Mold3D was our online school, Mold3D Academy. We offer online classes taught by professional artists and are happy to announce two upcoming Substance classes in our summer term lineup.
We are developing a Substance Painter class taught by Christophe Desse of Naughty Dog fame, as well as an in-depth Substance Designer class taught by Pete Sekula. Also in our lineup will be classes that focus on real-time character and environment creation. So VR and video games are definitely a focus for us this year.
Will you be using Substance for future projects?
Yes! At the moment I am working on a secret VR project for an upcoming VR platform. I try to occasionally take on side projects in order to stay relevant in the industry and I’d like to think the experience reflects in the type of classes we develop for Mold3D as well as my personal work.