Beyond Scanning: Recreating a Scandinavian Forest with Photogrammetry and Substance
In order to get great results with photogrammetry, you need to have the right skills and the right tools. Luckily, Robert Berg has both, and he was kind enough to share with us the tools and techniques he used to build his Scandinavian forest. Let’s see what we can learn!
Hey, my name is Robert Berg! I've been working as an Environment Artist since 2008 and I recently joined DICE as a Senior Level Artist. Previously I was Lead Environment Artist at Fatshark Games and before that, I worked as an Environment Artist at Machine Games on Wolfenstein: The New Order.
The most recent projects I worked on were the Warhammer: Vermintide games, where I was responsible for modeling and texturing large parts of the game, as well as creating all environmental shaders. For the recently announced Vermintide 2, I moved to creating all textures in Substance Designer, either from scratch or as a way of increasing the resolution of already existing textures by running them through some Substance Designer noises and blends. This is really where my Substance journey started, and I’ve been hooked ever since.
The Roots of the Scandinavian Forest Project
The Scandinavian forest project started back in 2014, after I played The Vanishing of Ethan Carter, a game that made heavy use of photogrammetry, and then discovered a mobile app by Autodesk called 123D Catch that allowed me to scan rocks and other small objects using my old iPhone.
I thought this was very cool and figured I’d do something with it. But the limits of the app made it hard to do anything serious, so I lost interest for a time.
A few years passed before photogrammetry really started to take off with games like Battlefield 1 and Star Wars Battlefront. That, along with some articles about using your mobile phone for photogrammetry here on the Allegorithmic blog (ed: Go Scan the World! Photogrammetry with a Smartphone and Your Smartphone Is a Material Scanner), piqued my interest once again. The new and improved camera on my iPhone 7+, along with finding RealityCapture on Steam for a decent subscription price, made all the pieces fall into place, and I decided to do something real with it all.
I live very close to a typical mossy overgrown Scandinavian forest, so I decided to scan and rebuild that using only my iPhone as equipment!
The goal was to teach myself photogrammetry, test what works and what doesn't, explore techniques and workflows for creating assets using photogrammetry, and of course, see what kind of quality I could get with very limited equipment.
As a bonus, I also got to learn the basics of Unreal Engine again as I had not touched that engine for over five years.
"Substance is really the cornerstone of my entire workflow. I do everything from error-cleanup to delighting as well as adding additional fine detail"
From RealityCapture to Substance
Everything starts by exporting a high poly mesh with textures from RealityCapture. This can be the full mesh at tens of millions of tris, or a medium-poly version of around 1-2 million tris.
From the medium mesh I created a low poly by decimating it down to game resolution and doing some quick cleanup of ugly edges.
The initial bake of the Diffuse/BaseColor, Normal and AO is done in xNormal. Those maps, along with the low poly mesh, are then brought into Substance Painter and Substance Designer for the cleanup work (more on that later). Inside Substance Painter, I use the medium-poly mesh to bake out additional maps, such as World Space Normal and Position maps, that can be used to add extra detail inside Substance Painter or as part of the delighting process.
I always use the captured albedo for my assets, as that is, in my opinion, the thing that makes photogrammetry special: all those little details that are very hard to capture in a handcrafted material or model.
Substance at the Heart of the Process
Substance is really the cornerstone of my entire workflow. I do everything from error-cleanup to delighting as well as adding additional fine detail. A nice feature that was added just as I started this project was the ability to match albedo textures using a simple node to make sure they work well together within the same color range.
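The albedo-matching node mentioned above works on color statistics. As a rough illustration of the idea (not Substance's actual implementation), matching the per-channel mean and standard deviation of one albedo to another gets two scans into the same color range; the function name and approach here are my own sketch:

```python
import numpy as np

def match_albedo(source, reference):
    """Shift and scale the source albedo so its per-channel mean and
    standard deviation match the reference albedo. A simplified
    stand-in for an albedo-matching node; RGB arrays in [0, 1]."""
    src = source.reshape(-1, 3).astype(np.float64)
    ref = reference.reshape(-1, 3).astype(np.float64)
    src_mean, src_std = src.mean(axis=0), src.std(axis=0) + 1e-8
    ref_mean, ref_std = ref.mean(axis=0), ref.std(axis=0)
    matched = (src - src_mean) / src_std * ref_std + ref_mean
    return np.clip(matched, 0.0, 1.0).reshape(source.shape)
```

After matching, two scans captured under different lighting sit in the same overall color range and can be blended on one surface without an obvious tonal jump.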
I do some hacky things, as you will see later, that probably make the more hardcore photogrammetry experts cringe, but for my little hobby project it worked great.
I always try to focus on the end result and overall picture. If using some cheats and tricks along the way saves me time and won't be noticed in the end anyway, then that's fine!
A Closer Look at the "Moss brush"
The moss is one of those things that I just could not get right by scanning. I tried and it looked horrible, so (once again) I relied on Substance Designer. I needed something generic that I could blend on any surface and by just mixing a few noises together, I got what I needed. It literally took 5 minutes to complete it – that’s why I love Substance!
Next, it was just a matter of sampling the color from some of the moss that I already had in the scans to make sure they worked well together. Since it’s all procedural, making any changes at a later stage would be a breeze.
Setting up the moss shader in Unreal Engine, I made use of real-time tessellation and a Fresnel effect to give it that fuzzy moss look. I have to say I am very happy with the result. If you are curious about the shader setup you can find the graphs on my ArtStation.
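The Fresnel part of that fuzzy look boils down to brightening surfaces seen at grazing angles. A minimal sketch of the math (a Schlick-style falloff; the exponent and intensity values here are illustrative, not the values from the actual Unreal graph):

```python
import numpy as np

def fresnel_fuzz(n_dot_v, exponent=3.0, intensity=0.6):
    """Schlick-style Fresnel term that brightens grazing angles,
    giving moss a soft backlit rim. n_dot_v is dot(normal, view),
    clamped to [0, 1]; faces seen head-on get no extra brightness."""
    return intensity * (1.0 - np.clip(n_dot_v, 0.0, 1.0)) ** exponent
```

Adding this term to the moss color makes silhouettes and edge-on surfaces glow slightly, which reads as fine fuzz even on a flat-shaded surface.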
The Pros and Cons of Photogrammetry
To me, the obvious upside of photogrammetry is speed: you can get some incredible results in just a few hours using photogrammetry and capture natural details that are hard and time-consuming to recreate if you model and texture an asset from scratch.
The downside is that you are limited by what you can find and scan. And the bigger the object, the more of a pain it is to get clean results.
Even if you don’t plan on using scanned assets, I have found that just the process of scanning and examining the resulting meshes and textures has taught me a lot that has improved my regular modeling and texturing as well.
A Glimpse of a Possible Future
I believe that when it comes to making photoreal natural environments in games, photogrammetry is the future. The amount of time it takes to recreate the fine detail from scratch is just not efficient when you can get the same or better result in a few hours with photogrammetry. This is me speaking from a production standpoint; the actual creation process might not be as satisfying as creating something from scratch, but it all comes down to what the end goal is.
Allegorithmic has done a lot to simplify the creation process already, and I hope that you will continue to provide better tools for us artists to create even cooler stuff!
Tutorial Time: Cleaning Up Photogrammetry Data
Since only using a phone for capture is quite limiting, I often get a lot of errors or missing information on parts of my scans.
The easiest and quickest way I found to solve this was to bring the mesh and the baked Normal, AO and Diffuse maps into Substance Painter and carefully use the Clone Stamp tool to stamp out all the errors. Just like you would in Photoshop, but the beauty of Substance Painter is that it allows you to do it all in a single stroke!
Assign all the maps to their respective channels inside a Fill Layer. Next, add a regular paint layer on top of that with each channel set to Passthrough.
This allows you to paint all channels in a single stroke so the cloned information always matches up in the Diffuse, Normal and AO.
With some smart and careful cloning, no one will ever know!
I’ve done some pretty gnarly cleanups using this method and I find it works really well.
Here is the above branch in Unreal Engine, ready for use!
You can use pretty much the same approach when creating tiling textures from scanned patches of ground. But instead of baking everything down to a unique mesh, try to trace the largest quad possible in the scanned patch and bake that down to a flat texture.
Bring that into Substance Painter and use a tiling plane model. This allows you to clone out the seams and errors with minimal effort using the same approach described above. By tiling the plane around itself in every direction, cloning away the seam becomes very easy!
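A quick way to check (or expose) those seams outside Painter is the classic half-offset trick: wrap the texture by half its size so the tiling seams land in the middle of the image, where they are easy to see and paint out. A small sketch of that idea:

```python
import numpy as np

def offset_half(texture):
    """Wrap a tiling texture by half its width and height so the
    seams move to the center of the image (same idea as Photoshop's
    Offset filter). Applying it twice restores the original."""
    h, w = texture.shape[:2]
    return np.roll(texture, shift=(h // 2, w // 2), axis=(0, 1))
```

If the offset image shows no visible cross in the middle, the texture tiles cleanly.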
A cleaner and more correct way to do this would probably be to bring the Base Color and Height maps into Substance Painter and remove the seams from those, then use the height to displace a plane and bake the normal from that. Because I am lazy, though, I baked the normal beforehand and clone stamped that as well. If I were to do it again, I would probably go with the cleaner approach and fix the height before baking the normal.
After removing all scan errors in Substance Painter, export the textures to Substance Designer and use the excellent AO Cancellation node to delight the asset. If the scan was shot in good, neutral lighting, this is often enough to get a clean albedo. If, however, there is a bit more direction to the lighting, you can use a slightly hacky way of removing some of that light: plug a World Space Normal map into several Color to Mask nodes to build a directional mask, then use that mask to remove some of the directional shading.
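To make the two-step idea above concrete, here is a simplified sketch of what the delighting math amounts to (my own illustration, not the internals of the AO Cancellation node): divide the captured color by the baked AO to cancel ambient shadowing, then optionally build a mask from how closely each world-space normal faces an assumed light direction and pull those lit areas back toward neutral:

```python
import numpy as np

def delight(base_color, ao, ws_normal=None, light_dir=None, strength=0.3):
    """Rough delighting sketch. base_color: (H, W, 3) in [0, 1];
    ao: (H, W) occlusion in (0, 1]; ws_normal: (H, W, 3) world-space
    normals; light_dir: assumed light direction; strength: how much
    directional shading to remove."""
    # Step 1: AO cancellation -- brighten occluded areas.
    albedo = base_color / np.clip(ao, 0.05, 1.0)[..., None]
    if ws_normal is not None and light_dir is not None:
        l = np.asarray(light_dir, dtype=np.float64)
        l /= np.linalg.norm(l)
        # Step 2: mask is 1 where the surface faced the light, 0 away from it.
        mask = np.clip(np.tensordot(ws_normal, l, axes=([-1], [0])), 0.0, 1.0)
        # Darken lit areas to even out the directional shading.
        albedo = albedo * (1.0 - strength * mask[..., None])
    return np.clip(albedo, 0.0, 1.0)
```

The real node graph is more involved, but the principle is the same: baked lighting is a function of AO and surface orientation, so those two maps are what you use to undo it.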
Here is the final asset in Unreal Engine with tessellation and fuzzy moss shading: