Project Octopus: Building Interactive VR Experiences for Archviz with BVN Real
I’m Barry Dineen and I lead the VR team at BVN in Sydney, Australia. I’ve been working in the architectural visualization industry for 13 years, but the last 3 have been primarily focused on VR.
And my name is Mateusz Sum. I’ve been working as an architectural graduate for about the past 5 years. However, around 2 years ago I transitioned to the archviz branch of the industry, focusing on both still images and real-time graphics in VR.
Barry: BVN is an international architecture practice working across a wide range of sectors and scales. The VR team, soon to be known as BVN Real, is made up of artists, architects, developers and experience designers who are passionate about creating immersive experiences. We’ve evolved from the architectural visualization division at BVN into an immersive realities production studio that aims to blur the line between the real and the virtual.
Initially, the VR work that we were producing was mainly archviz marketing experiences for a variety of BVN’s unbuilt projects, primarily within the real estate and education sectors. Over the course of the last 3 years, we’ve expanded into design prototyping, training apps and, more recently, living simulations of real spaces (aka digital twins).
While we were still getting to grips with real-time workflows, our VR experiences took a substantial amount of time to produce. Studying and utilizing game industry techniques and processes has significantly increased our speed, allowing us to get involved much earlier in the design process. Architects and designers are slowly realizing that VR is not just a viewing device for a real-time model - it’s a completely new medium, and using it effectively requires a level of visual quality and fidelity that you simply do not get from real-time rendering plugins for design software. In addition, an enjoyable VR experience requires significant user experience elements at every stage of the user’s journey.
But VR really excels when users have things to do, and in the real world people don’t just look at buildings; they do things in buildings. We’ve been promoting the idea of using VR experiences to go beyond visualization, allowing us to user-test designs early in the design process with the people who will actually use the built space. Project Octopus VR was our first such design prototyping experience.
Barry: Project Octopus is a flexible workplace system created by BVN to enable a self-organizing studio environment that gives everyone full, agile desk mobility. Designing such a unique system required an advanced prototyping approach that would allow us to test the operational realities of the project in a cost-effective, time-conscious, and non-disruptive manner. We developed a simulation experience that allowed the designers to quickly iterate through potential solutions and finalize the design within the limited time frame.
This simulation quickly evolved into a VR training environment that gives end users an understanding of the design and functionality of the system. Once people know how the system works they can move, assemble, organize and complete a workstation configuration that will work well for their team. The multiplayer functionality allows teams to finalize a configuration together, minimizing the time, effort and disruption involved in putting together configurations that don’t work in the real world. Having more than one person in VR has been an essential feature of nearly all of our VR experiences since our early prototypes. We do a lot of user testing internally and the multiplayer functionality is consistently noted as the key preferred feature after the visual quality.
While not part of the original brief, we realized that we could further utilize the VR training environment to help us track the workstation positions in real life. We have since been able to create a virtual replica of our workplace and we are working with our software development team to utilize their desk location sensor information to bring this to life.
Barry: So the BVN design team went all out on this one, as the new workplace was part fitout and part product design. They used all the usual design packages, including Sketchup, Revit, Rhino, and Grasshopper, but a substantial part of the new design relied on a robotics research project that ultimately determined the final form of the Octopus system. The VR team then brought all the model assets together in 3ds Max to prepare them for import into UE4. The initial plan was to keep the environment quite minimal visually so that users would concentrate on testing the functional design of the new system and not get distracted by detailed surroundings.
While this approach worked fine during the early stages, when we enlarged the experience to test the system in multiple locations simultaneously the relationship between the system and the real space became a key aspect of the design. In order to legitimately test the design intent, we needed users to feel like they were actually in the real space. This is really where Substance Designer went from being a useful tool that we were getting better at controlling, to an essential piece of the process that we couldn’t live without.
It was a real eye-opener for us to realize the level of fidelity and detail that we actually needed in our textures and materials to create a true sense of presence in VR. This was even more important in Project Octopus, as the VR experience is used within the real space that it represents, giving users a 1-to-1 comparison between the two. Very few of our existing texture libraries even came close to the quality or resolution that we needed, so we had to create them using Substance Designer and Substance Source. Working in the space that we were recreating, however, gave us the easiest access to material references that we are ever likely to get. Once created, we could simply scale the textures up as necessary after testing in VR.
As we began to develop the digital twin idea, to track the real-life desk positions, we needed to add a lot more geometry to the experience. We started experiencing performance issues quite quickly as we added more desks, and we were nowhere near the 180 that we would eventually need. Luckily our new recruit Dom had quite a bit of experience using Substance Painter, and he was able to convert our high-poly models into very low-poly models with little or no reduction in visual quality, simply by replacing the modeled details with textures. If anything, the visual quality increased, as he was able to create the textures from much higher-res versions of the models than we were using originally. I’m still amazed by the believability of the screws on the desk lamp from just a few centimeters away.
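As a back-of-the-envelope illustration of why baking detail into textures mattered at this scale, here is a quick calculation. Only the 180-desk figure comes from the project; the per-desk triangle counts are assumptions invented purely for the sake of the example.

```python
# Hypothetical triangle budgets: a fully modeled desk (screws, bevels,
# cable trays) vs. a low-poly version with that detail baked to normal
# and albedo maps. Only DESKS = 180 comes from the article.
HIGH_POLY = 50_000   # assumed tris per desk with modeled detail
LOW_POLY = 2_000     # assumed tris per desk after baking detail to textures
DESKS = 180

high_total = HIGH_POLY * DESKS
low_total = LOW_POLY * DESKS

print(f"high-poly total: {high_total:,} tris")  # 9,000,000
print(f"low-poly total:  {low_total:,} tris")   # 360,000
print(f"reduction:       {high_total // low_total}x")
```

Even with made-up numbers, the order-of-magnitude gap shows why a scene that renders comfortably with a handful of desks can fall over long before reaching 180 of them, and why texture-baked low-poly assets are the standard game-industry answer.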
Integrating Substance into our Workflow
Mateusz: We’ve been digging into game development production techniques ever since we started taking real-time VR seriously. Getting started with game engines like Unreal Engine or Unity is pretty straightforward, but the more we use them, the more we realize how much we don’t know. It’s a really steep learning curve from traditional archviz, with many new workflows and methodologies to wrap our heads around. PBR materials were just one of the new things that we needed to understand, and it wasn’t long before Substance appeared on our radar.
We’ve been using Substance Designer and Substance Source now for about a year and a half and it’s really surprising that it isn’t more integrated into traditional archviz workflows. I think there was a perception within the industry that it’s much harder to get started with the tools than it actually is. We were able to get up and running pretty easily and get to our desired results without having to utilize the full capabilities of the Substance suite. I think people can be intimidated by the sheer complexity of the amazing materials that are often shared online, but most archviz materials just don’t need this. Luckily, more and more archviz studios are understanding the advantages and getting involved.
Using Substance Source
Mateusz: I would say the main benefits of Substance Source are the ease of use, simple access to the library, and cross-platform integration, whether it’s a classic 3ds Max + V-Ray combo, Unreal Engine, or others. Substance Source resonates strongly with the way the archviz community has been functioning for years, relying heavily on libraries of high-quality models, shaders and other assets ready to be used right out of the box. Usually, we work with tight deadlines and rely on simple “drag and drop” methods, so Substance Source fits into that pipeline pretty seamlessly with an already substantial library of materials. Substance Source materials are also quick and easy inputs for Substance Designer, where they can be blended and altered to create new, unique materials. Whether it’s the overall size, different tiling or additional details - all the necessary tools are right there!
Mateusz: In Project Octopus we used different techniques to generate shaders, and all of them came from Substance, whether it was a fully defined shader or just simple grunge maps for additional details. For less demanding cases, like timber, we used assets straight from Substance Source with some minor tweaks - to the color balance, for instance. On the other hand, to get some grungy and dirty concrete materials we simply blended different Substance Source materials and applied additional details on top of this using Substance Designer. As you can see below, getting good, satisfying results doesn't necessarily mean huge node maps and over-complicated shader structures.
The most challenging and fun part was getting the look of the air ducts, and the insulation around them, right; this was quite an unusual material. That was the moment when we realized how much detail and fidelity we could achieve with just a bunch of good-looking textures and an extremely low polycount for VR.
For the ceiling material, we decided to add an extra layer of definition. First, we recreated those poured concrete panels with some messy joint lines and then added a network of cables on top of that. No modeling, all in Substance.
Benefits of using Substance materials on archviz projects
Mateusz: Firstly, what makes Substance great is absolute freedom and control over the look of a shader, with instant feedback. At some point, when you are familiar enough with the software, the process of creating new materials simply becomes enjoyable and fun. It’s surprisingly easy to get carried away and dive too deep into the process.
Secondly, iterations and seamless tiling. The ability to generate countless variations from one core material is priceless for visualization artists. That’s something we’ve always had to fight in archviz workflows: breaking the repetitiveness of materials in the scene while maintaining consistency over a large surface area.
Thirdly, quality and high-resolution output. It is up to the artist how far they want to push the fidelity of a procedurally generated material; even to the point where it mimics scanned materials. Once it’s ready, you can export a 4k, 8k or 16k texture if you need one. During the development of Project Octopus, we quickly realized that we would need more details than 2k or 4k textures could provide. As the resolution of VR headsets increases the resolution of our textures will need to increase with them, and Substance essentially makes us future-proof against technology changes.
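For a sense of what those resolutions cost in GPU memory, here is a rough calculation for square, uncompressed RGBA8 textures. The 4/3 factor approximates a full mip chain; real engines use block compression and streaming, so treat these figures as illustrative upper bounds rather than engine-specific numbers.

```python
# Approximate VRAM footprint of a square RGBA8 texture (4 bytes/pixel).
# A full mip chain adds roughly one third on top of the base level.
def texture_mib(size_px: int, bytes_per_px: int = 4, mips: bool = True) -> float:
    base = size_px * size_px * bytes_per_px
    total = base * 4 / 3 if mips else base
    return total / (1024 ** 2)  # bytes -> MiB

for size in (2048, 4096, 8192, 16384):
    print(f"{size // 1024:2d}k: {texture_mib(size):7.1f} MiB")
```

Doubling the resolution quadruples the memory cost, which is why jumping from 4k to 16k textures is a decision about budgets, not just quality, and why higher-resolution headsets push that trade-off harder.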
The Archviz Market is Evolving Fast
Barry: The future is most definitely real-time. You simply get way more for the same amount of effort. Whether it's the archviz market explicitly, or architects and designers doing their own visualization work to communicate ideas, real-time and particularly VR is just a much better way to understand and appreciate unbuilt space - why look at it when you can go there for yourself?
Even though we believe that VR is the best way to experience unbuilt places, it is still difficult for our clients to use, and it isn’t easily distributable. By combining UE4 and Substance we’re able to attain a visual richness that allows us to capture high-res still images and movies directly from the engine, fulfilling all of our clients' requirements from one real-time model.
Right now we still need to spend a significant amount of time creating custom shaders and textures using Substance Designer, but it’s worth the effort given the range of output that we can get. However, we’ve recently started to see some early materials from Substance Alchemist sneaking out, and this is potentially a real game changer - not just for the archviz industry but for any architect or designer communicating their own designs. The ability to quickly create high-quality flexible materials from just photos and samples seems too good to be true - let’s hope the magic is real!
Anything we forgot to ask or that you would like to add?
Barry: We really want to see the archviz industry embrace the full potential of real-time and VR. Archviz artists have an incredible and unique ability to recreate the real world in order to present a future world that doesn’t yet exist. VR allows people to step into those worlds and inhabit them, but enabling this to happen successfully requires many new skills beyond just visual representation. It’s a steep learning curve, but there is currently a real opportunity for the archviz industry to move into the VR market beyond just visualization. As the notion of virtual worlds moves from sci-fi to reality there is potential to collaborate with architects and designers to design, create and deliver virtual places that will never exist in the real world.
Oh and thanks for making Substance - I’m not sure this project would have even been possible without it. :)