Gensler's Creative Media Manager using Substance for Architecture Visualization
Who are you?
How old are you?
Where are you based?
What do you do?
Firmwide Creative Media Manager for Gensler: overseeing the pipeline for rendering technologies, video game engines, virtual reality, augmented reality, and motion graphics.
3D representation of the Gensler Los Angeles office
Where do you come from?
I was born and raised in Houston, Texas. However, both of my parents come from New York. I’m currently still living in Houston, but I’ve also lived in Las Vegas and San Francisco for a time. I moved to both cities while working at Gensler, because my assistance was needed on a major project in each of those offices.
What is your actual job?
Currently I am the Firmwide Creative Media Manager for Gensler. This means I am overseeing our pipeline for rendering technologies, video game engines, virtual reality, augmented reality, and motion graphics. But I started at Gensler, a little over eight years ago, as just a visualization artist. My job was to create realistic-looking illustrations for a wide range of project types. I’ve worked on everything from massive business campuses to casinos, retail spaces, and restaurants. I still love to do 3D illustrations, and I’m also really into photography. I have done some professional sports photography for the local Major League Soccer team, the Houston Dynamo, and I do a lot of cosplay photography for friends.
What is your background?
My background is actually in media arts and animation. I studied a little bit of everything while I was in school, but ended up focusing on materials, lighting, and rendering by the end of my degree. I have never studied architecture, but I’ve always enjoyed it, and with my specialization at school, architecture complements what I like to focus on. I also have a heavy background in photography, which I find is extremely important when doing architecture visualization. The two are very similar in terms of how images are created.
What are your specialties?
I typically don’t do a lot of modeling, as either the model is provided to me or the production schedule is so short that it’s just easier to purchase assets to speed up the workflow. Sculpting with light, and then topping it off with good-looking materials, is probably my favorite part of the process. And I love doing post-processing on my renderings and photographs. It’s the icing on the cake for me!
What are your sources of inspiration?
I tend to look all over for inspiration. I have a massive library of concept art books, photography books, magazines, movies, and video games. Some of my favorite artists are Dylan Cole, Marc Brunet, Andrew Hickinbottom, J Scott Campbell, Skottie Young, Jeff Patton, Alex Roman, Feng Zhu, Ian McCaig, and too many others to list. I always have a camera on hand, and I’m always taking pictures of things I find. I really enjoy finding random designs, materials, and textures on my travels for work, or when I’m home walking around the city. It’s amazing what you can find when you stop and take a look around. I find that you’ll never know when inspiration will hit, so I try to be ready for it as much as I can. I also recommend that if it is hard for you to find inspiration, to change things up. Look for new artists, music, or travel to a new location that you’ve never been to. And traveling somewhere new could be somewhere local, the key is to just switch things up a bit!
Tell us a bit more about your traditional pipeline at Gensler: what are the different tools you use (including texturing)?
The toolset at Gensler can be pretty diverse depending on the project. We do have a set of tools that we consider “standard” for all of our users in order to support our “One-Firm Firm” mentality. This essentially means that no matter which of our 47 offices around the world I end up going to, I have the exact same setup for producing my work. However, we allow additional tools to be added to the workflow.

Different tools are also used depending on what phase the project is currently at in its cycle. A typical project cycle starts in Schematic Design (SD), then moves into Design Development (DD), followed by Construction Documentation (CD), and ends with Construction Administration (CA). The majority of visualization work happens in the first two phases. The SD phase can be considered the conceptual phase, and the DD phase is when the design starts to become more concrete in terms of how it will actually be built.

When it comes to textures, our offices have large physical material libraries that everyone has access to. When I find myself doing a rendering on a project, materials are usually pre-selected for me by a designer. Some materials are a little more generic in their samples, such as concretes and metals, but a lot of materials are custom to the project. So a typical scanning or photographing workflow to get them into a 3D application would be the next step in the process.
Where does the idea of the “Los Angeles Office” project come from?
The Los Angeles Office project is a representation of the actual Gensler Los Angeles office. I created this project as a test bed for many of the new technologies that have become available over the past few years. About 99% of our projects are confidential, so it can be difficult to share our work. This is usually a clause we have with the client, but in the case of the Los Angeles Office, we are the client! So as I experiment with game engines or virtual reality, I know I can easily document and distribute the project to users for training, and use it with other partners.
How did you discover the Allegorithmic tools? Which ones have you used on this project?
I first discovered Allegorithmic’s tools a few years ago, when I was looking into making a video game as a hobby. But as video game engines started to move more into the architecture industry, I decided to take a much closer look at how to implement them into my own personal pipeline. I started with Substance B2M, as this allowed me to convert my existing material resources into multiple outputs. But I eventually jumped into Substance Designer when Iray and MDL support was added in version 5.3.
What was your production pipeline on this project and how did Substance integrate into it?
With the LA Office, I started the project long after the office opened for business. This means we really did not have the material samples lying around anymore, so I had to photograph the materials in their exact position in the space. This is actually more problematic than it sounds: varying lighting conditions play a big part in creating issues when photographing materials in place, and this project was no exception. Still, I was able to take photographs for reference, since Substance Designer gave me the ability to re-create all the materials from scratch.
WIPs from the Gensler Los Angeles Office project
How did your use of Substance change your approach to texturing?
It was a bit strange at first using Substance Designer. I definitely had to wrap my head around a lot of the nodes inside the application, and how things end up blending together. Also, the idea of not starting from a pre-existing bitmap took some getting used to. However, through all of my learning, I found it best to break the material down into basic shapes and then start to apply texture to it. Just like creating any kind of art, starting with broad strokes and then narrowing down to the fine detail is a good methodology to apply to a workflow.
What would be your favorite technique with Substance?
I would have to say my favorite technique at the moment is the Directional Warp node. I’m completely amazed at the flexibility of that node and what can easily be produced with it. And I’m still finding new uses for it. Just the other day I was reading the forums, looking for a way to randomly offset a texture across multiple tiles; in this case it was for wood flooring. Someone from Allegorithmic recommended plugging the Tile Generator that drives the material into the Directional Warp node, so that the Luminance Variation would drive which tiles get offset! I had been doing this a completely different way, and was thinking to myself there had to be a better way. Sure enough, the Directional Warp node ended up saving the day.
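The core idea behind that trick is simple: each tile gets a random luminance value, and the warp shifts pixels in proportion to that luminance, so every plank slides by a different amount. As a rough illustration only (this is plain Python, not Substance code, and the function and parameter names here are invented for the sketch), the per-tile logic could be written like this:

```python
import random

def plank_offsets(rows, max_offset, seed=0):
    """Return one horizontal offset per row of planks.

    Each row gets a random "luminance" in [0, 1], and the offset is
    that luminance scaled by the warp intensity (max_offset) -- the
    same relationship the Directional Warp node applies when a Tile
    Generator's Luminance Variation drives it.
    """
    rng = random.Random(seed)  # seeded for repeatable variation
    return [rng.random() * max_offset for _ in range(rows)]

# Four rows of wood flooring, each shifted by up to half a tile width.
offsets = plank_offsets(rows=4, max_offset=0.5, seed=42)
print([round(o, 3) for o in offsets])
```

The key design point is that the randomness is generated per tile rather than per pixel, which is exactly why feeding the Tile Generator itself into the warp works: every pixel within one tile shares the same luminance, so the whole plank moves as a unit.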
Are there some features that would be great to have in Substance Painter or Designer?
This relates to the previous question. I would love to see Substance Designer work with the data sets that are generated from scanning the physical properties of materials. Being able to convert this data into a format that game engines can use, or even just the ability to apply them to the look of a material that has been purely generated would be fantastic for Architecture and Design!
What is in your opinion the future of texturing in architecture?
Physically based rendering isn’t really anything new in the architecture industry. Ray-tracing engines have had this concept down for a very long time, so I’m thrilled to see game engines picking it up. It is helping bridge the gap between the two technologies, and at some point there will be a collision between the two. I think the next big thing for texturing, though, is the ability not just to scan a material, but to scan all of its physical properties. And we’re already starting to see this happen now! The ability to truly represent how a material reacts in the real world is going to be key for high-end visualization in any industry, not just architecture.
Will you use Substance for your next projects?
I have enjoyed working with Substance thus far, and as it applies to future projects I would definitely consider using it again!