New Hybrid Grounds Now Landing on Source
Winter is coming, material lovers (except if you live in LA). Before you put on another layer as the days grow cooler, we wanted to capture and offer you one last drink in the sunshine on the beaches of southwest France. Our Allegorithmic scan team traded mice and keyboards for DSLR cameras, tight swimsuits and flip-flops, and together we headed south from Paris to Bordeaux to scan, tan and hybridize summer 2017 for Source.
What’s new on Substance Source today? 70 hybrid scans of various grounds and walls. From sandy beaches to rocky ridges covered in seaweed, from old stone cottage walls to moldy wood docks, we designed our own process of capture, processing and exposed-parameter tweaking to deliver high-quality materials in 4K.
The critical aspect of creating a digital material from a scan is the quality of the images you capture of the real material. The final material or texture quality will always depend on the images used as input.
We believe that mastering the capture process will not only raise the quality of the assets we create for Substance Source, but also help us better answer your needs in establishing end-to-end material creation pipelines.
Having fine-tuned our scan post-processing and scan hybridization processes a couple of months ago, we decided to get our hands dirty and close the pipeline loop with a photogrammetry process.
Here is what we achieved:
- We captured a total of 7671 images in 36MP resolution of more than 50 different environments from organic and natural scenes like beaches and pine forest floors as well as more urban landscapes like walls, roads and doors.
- Each texture is an aggregate of 60 to 200 images. We fed all the rushes into a pre-processing pipeline to create orthophotos carrying color and terrain-elevation information.
- Textures at various scales, ranging from 50 cm x 50 cm to 3 m x 3 m. And this is just the beginning.
- We created custom templates in Substance Designer to crop, tile and clean the image inputs in 8K resolution (the only limitation here is your hardware).
- Our pipeline automatically performs image delighting operations as well as height corrections from pre-processed maps directly in Substance Designer.
What you will be able to download on Substance Source are hybrid materials, all in 4K resolution, each with custom parameters that allow you to tweak the texture to achieve a wide range of looks.
“Artists should not compromise between quality and artistic freedom.”
This is our absolute motto when it comes to scanned materials.
Even with a high resolution, a regular scan remains a static image of a scene or material in a given condition. This means that most of the time, searching for scan textures is a crapshoot: either they fit the project perfectly, or – as in most cases – they are close to what’s needed but useless without costly and frustrating adjustments. Hybrid scans aim to break this status quo by combining procedural techniques with scanning.
“Artists can now get the best of both worlds: accuracy and realism of scanned matter and the creative freedom to tweak some aspects of the texture.”
Within Substance Designer, we exposed custom parameters in each material so it can be adapted to its context. Using masks and filters, users can create variations like raising the water level on a sandy beach material, adding selective dust deposits on a road or even splattering elements within the scan.
Can’t find the parameter you need? The scan is just a starting point. Substance Designer 2017 is the tool to help you make specific procedural filters and create your own customized materials.
To learn more about scan processing, read our article on MAKING HYBRID-SCAN MATERIALS USING SUBSTANCE DESIGNER.
Creating a new part of the pipeline is also the opportunity to introduce the people behind the new technologies. Today, we hand the mic to Romain Rouffet, our newest Creative Technologist and photogrammetry expert. Romain has kindly agreed to get out from behind his DSLR for a few minutes, close Substance Designer and show us the process.
Hi Romain! Can you tell us more about yourself and your background?
Hello! I joined the Allegorithmic team in Paris in June of this year. I’m curious about many things: I studied electronics because I initially wanted to work in the field of robotics and automation. Then I discovered 3D creation with professional CAD software, and photogrammetry through reverse engineering.
Diploma in my pocket, I started working with a small team to design and build a camera rig with a dedicated production pipeline for body and head scanning. I freelanced a lot in that area before finally ending up at a company that digitizes archeological sites all over the world. Thanks to this experience, I discovered the domain of UAVs and large-scale mapping (we’re talking whole islands, for example). Now I’m part of Allegorithmic Labs, where we’re taking on (among other topics) the future of scanning and thinking about how Allegorithmic will be a part of it.
How did you discover and start to work in photogrammetry?
I started photogrammetry as a hobby when the 123D Catch software appeared on the market. I used it in engineering school to reverse-engineer a car seat, and I’ve kept practicing every day since. What started as a hobby grew into a real passion for computational photography.
Can you tell us more about the shooting? Where did you go? What did you wear? Did you get sunburn?
I went to Île de Ré, a small island near La Rochelle on the west coast of France. Unfortunately, no sunburn: to get diffuse lighting on an outdoor subject, you need to wait for a cloudy day. Note that this still doesn’t remove all the lighting, which is why our team at Allegorithmic developed a de-lighting filter to remove the ambient light (keep reading :)).
What was the setup for the shooting? Which camera and hardware did you need/use?
To shoot I used a 36MP DSLR, an expandable 5 m pole, a remote trigger, several lenses and a color checker. Nothing more, nothing less. For the lenses, I mostly stayed between 20 and 50 mm.
Can you give us some advice on shooting materials?
To shoot materials, I used a remote to trigger the DSLR placed at the end of the pole. The pole is extended to two meters and held horizontally in front of me; then I crab-walk across the surface holding it. It’s the same thing you do in aerial mapping, just at a different scale. You may wonder why the pole is held 2 m ahead: the idea is to get enough overlap between pictures, and above all to keep your footprints out of the frame if the ground is loose.
Can you describe the process after shooting? How did you assemble the images to create the final inputs?
After shooting, you need to sort your data into folders. Taking a photo of the color checker before each new scan also helps a lot.
Next, you can use any of your favorite software for developing the RAW files. Capture One is great because it’s really quick at exporting all the data.
We then process the data with Capturing Reality to export orthophotos. An orthophoto represents the combined texture and geometry in 2D, without perspective. We use two orthophotos: a color one and a grayscale depthmap.
These two images will be used directly in Substance Designer and will end up in a material.
Our dedicated filter converts the height into an AO map, a bent normal map, a worldspace normal, and an albedo.
This removes the painful part of baking geometry onto a plane and reduces the duration of the total process.
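To give an idea of the kind of conversion such a filter performs, here is a minimal NumPy sketch of deriving a tangent-space normal map from a grayscale height map. This is an illustration only, not Allegorithmic’s actual filter:

```python
import numpy as np

def height_to_normal(height, strength=1.0):
    """Derive a tangent-space normal map from a grayscale height map.

    height: 2D float array in [0, 1]; strength scales the apparent relief.
    Returns an (H, W, 3) array of normals in [0, 1], ready to save as RGB.
    """
    # Finite-difference gradients of the height field (rows, then columns).
    dy, dx = np.gradient(height.astype(np.float64))
    # A surface z = h(x, y) has an (unnormalized) normal (-dh/dx, -dh/dy, 1).
    normal = np.dstack((-dx * strength, -dy * strength, np.ones_like(height)))
    # Normalize each normal to unit length.
    normal /= np.linalg.norm(normal, axis=2, keepdims=True)
    # Remap components from [-1, 1] to [0, 1] for storage as an RGB image.
    return normal * 0.5 + 0.5

# A flat height map yields the canonical "flat" normal color (0.5, 0.5, 1.0).
flat = height_to_normal(np.zeros((4, 4)))
```

Deriving the AO and bent-normal maps works from the same height data, but requires sampling occlusion over many directions rather than a single gradient.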
If you shoot in .jpg, you only need your photogrammetry software and Substance Designer.
What were the challenges?
Challenges in photogrammetry are mostly about finding a good spot, plus dealing with weather and curious onlookers.
In processing, the challenge was to get an automatic workflow after image acquisition. Baking wasn’t an option and delighting was necessary.
Furthermore, we needed to do the entire post-photogrammetry workflow inside Substance Designer. This was challenging for me because I hadn’t used it before, and it was a bit intimidating to look at those big graphs and the tons of available filters. In the end, with the Source team and the software’s developers at my side, I was able to quickly prototype presets for delighting, roughness creation and map tiling that I used to do in other 2D software.
In addition, the ability to render directly with Iray in Substance Designer helps me see whether things are working correctly in terms of detail and quality. Here are some examples:
Can you explain in more details the challenges and objectives of the filters and template you created in Substance Designer?
I want to point out that while I initiated the development of these filters, the Substance Source team helped me create them. The challenge was to build all the maps using only the diffuse and height orthophotos.
The first thing you need to do is correct the slope of your scan. Indeed, not every ground is truly flat, and this causes lots of problems for tiling. If you bake from a low-poly model you don’t have this problem, because the bake follows the general shape. Since we work only in 2D, however, we needed to make a filter to compensate for holes, slopes and non-planar surfaces.
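To illustrate the slope-compensation idea, here is a hedged sketch (not the production filter) that fits a least-squares plane to the height map and subtracts it, keeping only the local relief:

```python
import numpy as np

def remove_slope(height):
    """Fit a plane z = a*x + b*y + c to the height map by least squares
    and subtract it, so the overall ground slope no longer breaks tiling."""
    h, w = height.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Design matrix: one row [x, y, 1] per pixel.
    A = np.column_stack((xs.ravel(), ys.ravel(), np.ones(h * w)))
    coeffs, *_ = np.linalg.lstsq(A, height.ravel(), rcond=None)
    plane = (A @ coeffs).reshape(h, w)
    return height - plane

# A purely tilted ground flattens to (near) zero everywhere.
tilted = np.fromfunction(lambda y, x: 0.01 * x + 0.02 * y, (64, 64))
flattened = remove_slope(tilted)
```

A real filter also has to handle holes and curved (non-planar) surfaces, which calls for something more local than a single global plane, but the principle is the same.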
Next, you need to remove the lighting information. Even when shooting on a cloudy day some ambient light remains, so we remove it using the normal map and AO.
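As a very rough first-order illustration of the idea (the actual filter is more involved and also uses the normal map), under a uniform cloudy sky each point receives ambient light roughly proportional to its ambient occlusion, so dividing the captured color by an AO estimate recovers an approximate albedo:

```python
import numpy as np

def delight_with_ao(color, ao, floor=0.05):
    """Crude ambient-light removal: under a uniform (cloudy) sky, the
    captured color is roughly albedo * AO, so dividing by AO recovers
    an approximate albedo. `floor` avoids amplifying noise in crevices."""
    ao = np.clip(ao, floor, 1.0)[..., np.newaxis]  # broadcast over RGB
    return np.clip(color / ao, 0.0, 1.0)

# A pixel half-darkened by occlusion recovers its full albedo.
color = np.full((2, 2, 3), 0.3)
ao = np.full((2, 2), 0.5)
albedo = delight_with_ao(color, ao)
```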
Everything can then be tiled using the smart autotile filter, which guesses the best edges based on the height, the normal and the color.
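The autotile filter itself is smarter (it searches for cuts that follow real features), but the basic trick behind making a map wrap can be sketched as a half-period roll plus a cross-fade, so the output borders come from the seam-free interior of the rolled copy:

```python
import numpy as np

def make_tileable(img):
    """Naive seamless tiling: cross-fade the image with a copy rolled by
    half its size. The weight is 1 at the center and 0 on the borders, so
    the output borders come entirely from the rolled copy, whose content
    there is continuous across the wrap (it comes from the image center)."""
    h, w = img.shape[:2]
    rolled = np.roll(np.roll(img, h // 2, axis=0), w // 2, axis=1)
    # Separable "tent" weight: 1 at the center, 0 on every border.
    wy = 1.0 - np.abs(np.linspace(-1.0, 1.0, h))
    wx = 1.0 - np.abs(np.linspace(-1.0, 1.0, w))
    weight = np.minimum.outer(wy, wx)
    if img.ndim == 3:
        weight = weight[..., np.newaxis]  # broadcast over color channels
    return img * weight + rolled * (1.0 - weight)

img = np.arange(64, dtype=float).reshape(8, 8)
tiled = make_tileable(img)
```

This naive blend introduces ghosting in the middle of the texture, which is exactly why an edge-aware filter like the smart autotile gives better results on real scans.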
What are the next steps? How do you see scan capture evolving in the future? How do you see it impacting the production of assets in Substance Source?
The next step is to gather many more materials. This first batch was a test for developing filters in Substance Designer to improve the efficiency of scan processing.
The capture will evolve into a blend of photogrammetry and a multi-angle lighting technique.
We will be able to get good height data from photogrammetry, and good color, normal map and roughness from the multi-angle lighting workflow (see Anthony’s previous blog post about the material scanner).
Then, we will lift the camera a bit higher (on a drone, for example) to scan bigger things.
But this will be the next step.
Well, now you know more about how we created the new hybrid materials available today on Substance Source. This is only the beginning! We really can’t wait to test the process further on everything we can get our hands on: bigger, more complex, higher resolution...there are plenty of exciting challenges! Stay tuned, and we’ll make sure to share new processes alongside the next material drops on Substance Source.
Now just sit back, relax, download and test the free assets from the new hybrid ground materials on Substance Source, tweak the scanned materials with Substance Designer or Substance Player, and combine more effects with Substance Painter if you dare – the digital sky is the limit!