Creating Realistic Archviz Experiences Using Unity, by Oneiros

In today’s article, we sit down with Mirko Vescio from Oneiros, whom we interviewed in a blog post last year. This time, he describes how he and the team created an interactive virtual reality experience of the newly opened Unity office in London.

Hi Mirko, great talking to you again. For those who don’t know you yet, could you briefly introduce yourself?

Hi everyone! I am the CEO of Oneiros, a startup based in Milan that I co-founded with Ruggero Corridori and Antonella Contin in mid-2016. Our aim is to offer enterprise real-time and virtual reality solutions, built with Unity, mainly for the AEC industry.

Tell us more about the Unity London office project. How did it all start?

This project, made in collaboration with Unity, aims to show the potential of their new High Definition Render Pipeline (HDRP) in the AEC industry through a virtual walkthrough of the newly opened London office, designed and built by M Moser Associates. (Check out the real-life building here)

It is possible to move freely between the various rooms using the mouse and keyboard or a gamepad. Users can interact with several aspects of the scene, such as switching between day and night modes, changing floor materials, turning TVs and lamps on and off, and accessing BIM information.

It is also worth remembering that one advantage of using Unity in architectural visualization is that you can get different types of output from the same scene. For example, it is possible to produce a video in cinematic mode as well as in virtual reality mode:

Static rendered images are another output and, of course, the interactive PC build (usable with keyboard and mouse) is available to download!

Can you tell us more about Unity HDRP for the AEC industry?

With HDRP, there are several new features in Unity that I would call essential for making a qualitative leap in rendering quality for the AEC industry.

For example, planar reflections are fundamental to creating the right feeling for reflective materials; the new lights return a more realistic result, and it is possible to have fine control over light temperature.

In HDRP, the "old" Standard shader becomes the Lit shader, and HDRP also marked the introduction of Shader Graph.

Take a look at this example of the Lit shader, with textures produced using a Substance Designer graph.

HDRP Lit Shader material settings
Substance Designer graph used to generate the textures

Another interesting feature is the decal shader, which makes it easy to project details onto any geometry. For example, one can add bricks to a broken wall without modeling them as geometry.
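Decals are usually placed in the editor, but as a rough illustration, here is a hedged sketch of spawning one at runtime with HDRP's DecalProjector component. The material and size values are assumptions for illustration only.

```csharp
using UnityEngine;
using UnityEngine.Rendering.HighDefinition;

// Hedged sketch: projecting a decal (e.g. exposed bricks on a damaged wall)
// onto existing geometry using HDRP's DecalProjector. The material must use
// HDRP's Decal shader; all names and sizes here are illustrative assumptions.
public class BrickDecalSpawner : MonoBehaviour
{
    public Material brickDecalMaterial; // a material using the HDRP/Decal shader

    public void SpawnDecal(Vector3 position, Quaternion rotation)
    {
        var go = new GameObject("BrickDecal");
        go.transform.SetPositionAndRotation(position, rotation);

        var projector = go.AddComponent<DecalProjector>();
        projector.material = brickDecalMaterial;
        projector.size = new Vector3(1f, 1f, 0.2f); // width, height, projection depth
    }
}
```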

What were the different tools you used on the project?

In this project, the workflow required three main software tools. The first one is 3ds Max for the creation of all the 3D models including environment, furniture, and props.

The second tool is Substance, which we used for the creation of the materials. We used Substance Painter, Substance Designer, and Substance Source.

And, of course, we used Unity to integrate the 3D models and textures, set up the lighting, create the shaders, and handle camera post-processing, scripting, and video production.

Could you give us a few more details on your use of Substance?

First of all, as I mentioned in our prior interview, it is necessary to keep in mind that in a project like this, a real-time PC build for AEC, the only post-production you can apply is camera effects. This means that almost everything must exist in the 3D scene, and the software needs to render as many frames per second as possible (at least 30).

The PC build of this project, for example, renders 45 frames per second at 4K on a mid-level gaming graphics card.

Of course, there are many factors to evaluate when trying to improve the performance of a realistic real-time application, but one of the most important is finding the right balance between realism and the poly count of the 3D models.
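Those frame-rate targets translate directly into per-frame time budgets: 30 fps leaves about 33 ms per frame, while 45 fps leaves about 22 ms. A minimal, illustrative Unity script (not from the project) for watching that budget might look like this:

```csharp
using UnityEngine;

// Illustrative frame-budget monitor for the performance targets mentioned
// above: 30 fps = ~33.3 ms per frame, 45 fps = ~22.2 ms per frame.
// This is a generic sketch, not code from the Oneiros project.
public class FrameBudget : MonoBehaviour
{
    public float targetFps = 30f; // minimum acceptable frame rate
    float smoothed;               // smoothed frame time in seconds

    void Update()
    {
        // Exponentially smooth the unscaled frame time to avoid noisy spikes.
        smoothed = Mathf.Lerp(smoothed, Time.unscaledDeltaTime, 0.05f);

        float budget = 1f / targetFps; // e.g. 0.0333 s for 30 fps
        if (smoothed > budget)
            Debug.LogWarning(
                $"Over budget: {smoothed * 1000f:F1} ms (limit {budget * 1000f:F1} ms)");
    }
}
```

Every material, light, and mesh in the scene has to fit inside that budget, which is why the realism-versus-poly-count trade-off matters so much.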

With this aim, Substance is essential, because it gives us great control over each texture, allowing us to transfer the details of a high poly model onto a low poly model and, as we will see in the next question, to add even more realism and detail using masks and to produce tileable textures.

Moreover, it allows the creation of PBR materials, composed of different maps such as base color (albedo), normal, metallic, and ambient occlusion. Once the maps are exported from Substance, everything is imported into Unity to create the shader that will be applied to the 3D model.
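As a hedged sketch of that last step, the exported maps end up as textures on an HDRP Lit material. HDRP packs metallic (R), ambient occlusion (G), and smoothness (A) into a single "mask map"; the property names below are HDRP Lit's, while the helper itself is an illustrative assumption, not the project's code.

```csharp
using UnityEngine;

// Hedged sketch: building an HDRP Lit material from maps exported from
// Substance. The texture property names (_BaseColorMap, _NormalMap,
// _MaskMap) belong to HDRP's Lit shader; the helper is illustrative only.
public static class LitMaterialBuilder
{
    public static Material Build(Texture2D baseColor, Texture2D normal, Texture2D mask)
    {
        var mat = new Material(Shader.Find("HDRP/Lit"));

        mat.SetTexture("_BaseColorMap", baseColor); // albedo
        mat.SetTexture("_NormalMap", normal);       // tangent-space normals
        mat.EnableKeyword("_NORMALMAP");
        mat.SetTexture("_MaskMap", mask);           // metallic (R), AO (G), smoothness (A)
        mat.EnableKeyword("_MASKMAP");

        return mat;
    }
}
```

In practice this assignment is done in the Inspector rather than in code, but the mapping between Substance's exported channels and the Lit shader's slots is the same.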

How did Substance integrate with the other tools of your pipeline?

One of the advantages of Substance is that it fits perfectly into our workflow. As mentioned earlier, Substance Painter is perfect for transferring the details from a 3D high poly model made in 3ds Max to the low poly model that we import into Unity.

Here is an example:

The creation of a single asset generally follows a standard process. Starting from the high poly 3D models provided by the architects, we evaluate each model's topology and, from the high poly version, build the low poly asset.

Once the low poly model is made, we bring both 3D models (high and low poly) into Substance, where we work on the textures and create the PBR materials.

Substance armchair final material, Substance Painter viewport
Substance armchair 4 maps
Shader Unity armchair

Breakdown of the parquet floor, created with 3ds Max, Substance Designer, and Unity.

Floor high poly
Floor details
Floor multi-material
Floor low poly
Floor bake option

Once you’ve finished this step in 3ds Max, it is necessary to use Substance Designer.

Floor in Substance Designer
Floor HDRP channel

Examples of materials from Substance Source in the scene.

Substance Source is a great tool to use for the creation of realistic materials without the necessity to produce them from scratch.

For example, the carpet of the work area was made starting from a Substance Source material.

From Substance Source to the shader applied in the scene

Once we know what kind of material we are looking for, it is possible to browse the available materials by category, pick the most suitable one for our situation, and download it in .sbsar format.

Substance Source categories
Substance Source selected material and related download

But of course, in a custom project, one must always adapt and modify the original Substance Source material. To do that, you only need to import the .sbsar file into Substance Designer and work on it there.

Substance Source material opened in Substance Designer to be modified
Graph material, high resolution

Once the various maps were exported from Substance Designer, it was possible to create the related shader in Unity.

Shader creation in Unity

Are there any tips or tricks you would like to share with the community?

It is fundamental for artists to understand how the materials that surround them are actually made, in order to replicate them correctly in Substance.

It may seem basic and obvious, but it is the only way to make a real leap in quality: start from a realistic base material and then, with experience and thanks to Substance, add details and produce high-quality content.

Any final words you would like to add?

We have seen how helpful Substance is in this process, and how Unity can be used to create photorealistic content for the AEC industry.

Unfortunately, explaining every single process would make the article infinitely long and complicated, but I hope this introduction has intrigued you and brought you closer to the real-time world with Unity!

It is possible to download the PC build here if you would like to try the interactive scene and test what one can achieve using Unity and Substance!

All images courtesy of Unity and Oneiros.

Learn more about Unity HDRP.
