Drive Through this Virtual Chinese City Made With Substance
51VR, the fastest-growing technology innovation company in China in recent years, is committed to VR. They are currently working on a project that reproduces a 440-meter-long street in the city of Chengdu, China, as part of AI simulation training for autonomous driving. Substance tools handled most of the material work on this project; we invited one of their key developers to explain how they use VR technology to experience the world more efficiently, safely, and as realistically as possible.
Read this article in Chinese here.
Hello, I am Guo Yuanhao. I am currently working for 51VR in Chengdu as a project developer. I am very honored and want to thank the community for this invitation. I entered the game industry in 2009, and had the privilege of participating in many game projects while working in Shanghai, as well as working on movie projects such as Transformers: Dark of the Moon. After that, I returned to Chengdu to start my own business and began using game engines to provide VR solutions for real estate marketing. I officially joined 51VR in 2017.
The reproduction of "Chengdu Street" is an integral part of the project. We used photogrammetry to recreate most of the materials in this street, as well as some small and medium-sized props. The entire project serves to train autonomous-driving AI in VR, so the degree of accuracy must be high enough to match real-life conditions; this lets us reduce the level of uncertainty in the VR simulation.
At present, the entire project is still in progress. It will ultimately be expanded to a wider range of areas, and to physically larger areas, in order to meet the simulation needs of unmanned vehicles in the VR environment.
The team took more than 3,000 photos as references for modeling and material creation. The photogrammetry process directly generated diffuse color maps with lighting baked in. These textures did not meet PBR requirements, however, so we used Substance tools to ensure the necessary quality. This produced a set of PBR-compliant maps that performed well once integrated into the UE4 engine.
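The article doesn't detail how the team validated their converted maps, but a common sanity check when turning scanned diffuse maps into PBR base color is verifying that values fall within physically plausible albedo ranges (roughly sRGB 30–240 is a widely cited guideline for non-metals). A minimal sketch with NumPy; the thresholds and function name here are illustrative assumptions, not 51VR's pipeline:

```python
import numpy as np

# Widely cited PBR guideline: non-metal base color should stay roughly
# within sRGB 30-240. Pixels outside that range usually mean baked-in
# shadows or highlights left over from the photo scan.
SRGB_MIN, SRGB_MAX = 30, 240

def albedo_out_of_range(basecolor: np.ndarray) -> float:
    """Return the fraction of pixels outside the PBR-safe albedo range.

    basecolor: HxWx3 uint8 array (sRGB).
    """
    luminance = (0.2126 * basecolor[..., 0]
                 + 0.7152 * basecolor[..., 1]
                 + 0.0722 * basecolor[..., 2])
    bad = (luminance < SRGB_MIN) | (luminance > SRGB_MAX)
    return float(bad.mean())

# A mid-gray texture passes; a near-black one (baked shadow) is flagged.
gray = np.full((4, 4, 3), 128, dtype=np.uint8)
shadowed = np.full((4, 4, 3), 5, dtype=np.uint8)
print(albedo_out_of_range(gray))      # 0.0
print(albedo_out_of_range(shadowed))  # 1.0
```

A check like this can be run in batch over a scan set to flag which maps still need de-lighting before they go into the engine.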
The Discovery of Substance Tools
I really started paying attention to the Substance suite with the release of the first edition of UE4. I initially used Substance for special texture requirements in a number of real estate projects.
Texturing Workflow with Substance Designer and Substance Painter
In our project production, we used both Substance Painter and Substance Designer to process textures. Substance Painter was first used to create textures from scratch, as some real objects proved impossible to capture with photogrammetry. In other cases, we used Substance Painter to correct photogrammetry scans, for example where parts of a texture had not been captured properly.
In the actual photogrammetry production process, some captures still could not meet our accuracy needs; textures based on the photo scans could feel a bit blurry, for example. To improve the accuracy of a scan, we used Substance Designer's blend nodes to simulate real materials and restore detail to our scans. We needed to improve the clarity of our base color map, for example, or to reconstruct the roughness map, as that's the kind of detail we couldn't get through scanning. Of course, we could have simulated materials from zero directly in Substance Designer but, given our development cycle and time constraints, we found it quicker to use Substance Designer to modify and repair the scanned materials.
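Substance Designer's blend nodes do this interactively, but conceptually the repair amounts to compositing a procedural detail layer back over the blurry scan. A rough sketch of an overlay-style blend in NumPy; the inputs, weights, and function name are illustrative assumptions, not the team's actual node graph:

```python
import numpy as np

def overlay_blend(base: np.ndarray, detail: np.ndarray,
                  opacity: float = 0.5) -> np.ndarray:
    """Overlay-blend a procedural detail layer onto a scanned base map.

    base, detail: float arrays in [0, 1] of the same shape.
    Uses the classic overlay formula found in blend nodes:
      2*a*b               where base < 0.5
      1 - 2*(1-a)*(1-b)   elsewhere
    """
    blended = np.where(base < 0.5,
                       2.0 * base * detail,
                       1.0 - 2.0 * (1.0 - base) * (1.0 - detail))
    return base * (1.0 - opacity) + blended * opacity

# Blending with a neutral mid-gray detail layer (0.5) leaves the base
# unchanged, which is why overlay suits adding high-frequency detail:
# only the deviations from mid-gray show up in the result.
base = np.array([[0.2, 0.8]])
neutral = np.full_like(base, 0.5)
print(overlay_blend(base, neutral, opacity=1.0))  # [[0.2 0.8]]
```

In practice the detail layer would come from a procedural generator (noise, grunge, fibers), with opacity dialed in against the photo reference.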
Substance Designer rebuilds textures, improves precision, and produces roughness maps.
Using Substance Painter and Substance Designer in the project greatly increased efficiency while ensuring very high quality. I personally like to use Substance Designer to reconstruct textures, or to create materials from scratch. This makes me very happy.
The Main Problems
The biggest obstacle we faced was simply that this was our first time using photogrammetry to scan objects. In our initial tests, the results of the scans were not satisfactory. As we gained experience, we found ways to optimize the process and to improve the success rate of captures, as well as overall quality and speed.
I personally prefer to use the Vector Warp node for material blending and production. I feel that this node is very powerful, and that it saves a lot of time when restoring material textures. At the same time, a texture and color close to the reference material can also be obtained.
The Tile Generator node is also awesome. During actual material production, it is the essential base node of each of my materials. It lets me define a material's style very richly while allowing a great deal of variation.
I hope that, in the years to come, Substance can introduce solutions that will facilitate scanning via photography.
Currently, our team is still conducting VR projects related to the real estate sector, and we plan to extend our use of Substance while providing internal training. By doing so, we aim to address recurring problems with material quality and texture, and to further improve production efficiency.
In my spare time, I like to play bass and perform with the other members of my band. I also love sci-fi movies.