The Art of Alex Jamerson

Unreal Substrate, Forged Composite, and a Free Substance Material!

(Rendered in Unreal, using Substrate for material blending.)

Unreal Substrate

I hadn't yet had a chance to use Unreal's new Substrate system for any 3D content, so I decided to put a quick project together in the engine with the new material framework, inspired by a real-life sample of the same material that I have in my office.

It's important to note that Unreal's new Substrate system isn't radically different from other renderers. Under the hood it's the same type of layered shading system you'll find in other applications like Blender or Cinema 4D, so if you're familiar with those, you'll already be familiar with Substrate.

What matters about Substrate is what it brings to Unreal:

  1. A well-integrated, built-in material blending system.

  2. A much-needed update to Unreal's aging shading model.

What is Forged Composite?

Forged Composite is a type of carbon fiber that differs from traditional carbon fiber. Instead of weaving long, continuous strands of carbon together, Forged Composite uses small pieces of carbon composite material laid in random directions and encased in resin.

What's important for digitization is that forged composite has a unique look, which creates challenges when recreating the material for 3D rendering.

The Reference

I happen to have a sample of the material, which was gifted to me. This is immensely useful for recreating the material's marble effect.

I didn't want to match the reference precisely; instead, I wanted to exaggerate its best aspects - specifically the splattered marbling effect of the resin-encased pieces of carbon. I chose to ignore the reflective flakes in the resin.

On my sample, a flaked paint was sprayed on top of a satin-clearcoated forged composite. I decided this would be fun to recreate using Unreal's Substrate.

Aside from the marbled appearance of the material, I also wanted to recreate the gradient paint effect on my real-life reference. This appealed to me because it would give me a chance to use Substrate's material blending system.

I ended up blending three materials together with Unreal's Substrate: one for the forged composite at the bottom, another for the clearcoat effect, and finally one for the paint as the very top layer.

Recreation in Adobe Substance Designer

Substance Designer has become one of my primary tools for texturing assets, especially for large-scale production challenges and unique one-off assets. Adobe Substance Designer and Painter are extremely powerful and cover virtually any texturing need.

I'm not going to break down every aspect of how I created the spattered marble effect in the textures in this article. However, I am sharing the Substance file I created, completely for free, at the link below. Feel free to download it, check out how I did things, and use it in your own projects.

Anisotropy and the Marble Texture

The key to recreating this material digitally is anisotropy. Anisotropy controls the direction of the specular reflection on a material: without it, a specular highlight spreads evenly in every direction, while an anisotropic highlight is stretched along the surface's grain.

Real-life examples of anisotropic specularity can be seen in things like horse hair, the readable side of a CD, and of course carbon fiber!

The industry has become a bit split on how to describe anisotropy in a material. Kind of like Betamax versus VHS, it's a battle between Angle (grayscale) and Directional (vector) maps, and it seems like the Angle approach is winning out.

Free Substance Material!!

I get a kick out of giving away my work lately, so here is the Substance material I created for the demonstration.

https://www.dropbox.com/scl/fi/rcfmpq15m04j5k0w9cky9/forgedComposite.sbs?rlkey=5fmdgmei2745y7mpa3f9418fr&st=ccwtv4eo&dl=0

In it, I also include a version of Dongkoon Yoon's Direction-Angle converter, which I modified to be more useful for my use-case. Unreal uses a simple float for anisotropy angle, but I wanted to also have the option of using a vector map (or anisotropy direction) in cases where I need that instead. So I included the converter.
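For reference, the core of a direction/angle conversion is just trigonometry. Here's a minimal Python sketch of the general idea - my own illustration, not Dongkoon Yoon's actual graph, and conventions for the angle range and channel encoding vary between tools:

```python
import math

def angle_to_direction(angle01):
    """Grayscale anisotropy angle (0-1 mapped to 0-360 degrees) -> tangent-space direction."""
    theta = angle01 * 2.0 * math.pi
    return (math.cos(theta), math.sin(theta))

def direction_to_angle(dx, dy):
    """Tangent-space direction (e.g. from a vector/flow map) -> grayscale angle in 0-1."""
    theta = math.atan2(dy, dx) % (2.0 * math.pi)
    return theta / (2.0 * math.pi)

print(angle_to_direction(0.25))      # roughly (0.0, 1.0), i.e. 90 degrees
print(direction_to_angle(0.0, 1.0))  # 0.25
```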

Unreal Material

Thanks to Unreal's new shading model, adding anisotropy to a material is trivial. All that's needed is an understanding of what the effect is and how to apply it properly. I also blended two more materials on top of the base forged composite material, to achieve the clearcoat and the gradient paint, respectively. I go over how that's done below.

Material Blending

Substrate's material blending is handled by a variety of 'Operator' nodes that are all useful in different contexts.

From bottom to top, my layering for this material is as follows:

  1. Forged Composite

  2. Clearcoat (Substrate Add)

  3. Paint (Vertical Layer & Coverage Weight)

I use the 'Substrate Coverage Weight' node to blend the paint material over the top of all the other materials using a gradient.
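As a rough mental model (not how Substrate actually evaluates materials under the hood), you can think of that stack as a base layer, an additive coat, and a top layer faded in by a 0-1 coverage value. Here's a minimal Python sketch with made-up color values:

```python
# Conceptual sketch of the layering, not Unreal's real Substrate math.

def lerp(a, b, t):
    return a + (b - a) * t

def shade_layered(base_color, clearcoat_spec, paint_color, coverage):
    """coverage: 0 = bare forged composite + clearcoat, 1 = fully painted."""
    # Bottom of the stack: forged composite with the clearcoat's specular
    # contribution added on top (loosely what Substrate Add does here).
    under = tuple(c + s for c, s in zip(base_color, clearcoat_spec))
    # Top of the stack: paint, weighted by the gradient-driven coverage
    # (loosely what the Coverage Weight blend does here).
    return tuple(lerp(u, p, coverage) for u, p in zip(under, paint_color))

# Example: a pixel that is 60% covered by paint.
print(shade_layered((0.05, 0.05, 0.06), (0.04, 0.04, 0.04), (0.6, 0.1, 0.1), 0.6))
```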

The Base Material - Forged Composite

The implementation of the Forged Composite material from Substance into Unreal is pretty straightforward. The thing I found interesting was how anisotropy angle and anisotropy level are represented in Unreal's Substrate - I had never seen it done this way before.

Usually when I see renderers implement a greyscale Anisotropy Angle method, they'll have a separate texture/channel that provides the Anisotropy Level (or 'intensity'). But with Unreal Substrate, they chose to use an HDR map for the anisotropy with a range between -1 and +1, where a single value encodes both things at once: the sign tells you which direction the anisotropy runs (with -1 and +1 being opposite directions), and the magnitude tells you the intensity of the anisotropy in that direction. Very cool!
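To show what a single signed anisotropy value does to the specular lobe, here's a small Python sketch using a common real-time convention (per-axis alpha stretched by 1 ± anisotropy). This is my own approximation for illustration, not necessarily Substrate's exact math:

```python
def anisotropic_roughness(roughness, anisotropy):
    """
    Turn an isotropic roughness plus a signed anisotropy into per-axis
    (tangent/bitangent) roughness. anisotropy in [-1, +1]: the sign picks which
    axis the highlight stretches along, the magnitude picks how strong the
    stretch is; 0 means fully isotropic.
    """
    a = roughness * roughness
    alpha_x = max(a * (1.0 + anisotropy), 1e-3)
    alpha_y = max(a * (1.0 - anisotropy), 1e-3)
    return alpha_x, alpha_y

print(anisotropic_roughness(0.5, 0.0))   # isotropic: both axes equal
print(anisotropic_roughness(0.5, 0.8))   # stretched along one axis
print(anisotropic_roughness(0.5, -0.8))  # same stretch, opposite axis
```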

Clearcoat

The new default shader in Unreal has built-in clearcoat parameters. However, I found its effect to be unrealistic for my use case. So for my shader, I chose to layer in a separate clearcoat using the Substrate Add node.

I believe the reason the default clearcoat looks wrong is that it uses the base material's normal map for the clearcoat layer. That isn't how clearcoat behaves on automotive products; in those cases, the clearcoat needs its own smooth normal.

Unreal's documentation repeatedly warns about using Substrate Add, generally advising developers not to use it. The important thing is just to make sure you don't add in values/parameters from the layers beneath it.

I did find an issue with layering on top of clearcoat materials, though. When I layered the paint material over the clearcoat portion of the material, it caused artifacts. This appears to be due to a limitation in Substrate: layering more materials above a clearcoat can cause issues. To work around this, I reused the mask I made for the paint material in the clearcoat's coverage parameter (only when the paint effect is turned on). This reduces, but doesn't completely eliminate, the artifacting for now. Later, I'd like to investigate Substrate's Building Block nodes to re-architect this shader and see how else I could achieve this material.

Painted Effect

I parameterized the paint color and a toggle, along with other parameters, to make it customizable per-instance.

The paint is pretty straightforward. Although I reverted the base forged composite material back to the 'UE4 Default Shading' node, I kept the paint layer as a Substrate Slab. I experimented with different root nodes to see if they produced different results, but everything I tried seemed to produce the same result, which is good.

Otherwise, the paint is basic. I should go back, re-adjust the reflectance values, and otherwise polish it, but that wasn't the focus of this quick project, so I just built it quickly. There are a couple of parameters for the paint, such as one to control the color and one to completely remove the paint and leave only the forged composite + the clearcoat, if the user desires.

I procedurally created a gradient mask to blend the materials instead of making a texture specific to my model's UVs. This saves a texture sample while also making the material more universally applicable, rather than tied to one model's set of UVs.
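As a sketch of the idea, the mask is essentially a smoothstep over a position-derived coordinate instead of a texture lookup. The fade range and parameter names below are hypothetical; in the actual material this is built from material expressions and exposed parameters:

```python
def smoothstep(edge0, edge1, x):
    t = min(max((x - edge0) / (edge1 - edge0), 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)

def paint_coverage(normalized_height, fade_start=0.2, fade_end=0.8):
    """Procedural gradient mask from a 0-1 height coordinate (e.g. derived from
    object-space position and bounds) rather than a baked, UV-specific texture."""
    return smoothstep(fade_start, fade_end, normalized_height)

for h in (0.0, 0.3, 0.5, 0.7, 1.0):
    print(h, round(paint_coverage(h), 3))
```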

Conclusion

Unreal's Substrate brings a modern shading model and, more importantly, a powerful shading framework that gives users nearly as much flexibility in creating materials as they would have in an offline renderer like Blender, Cinema 4D, V-Ray, etc. This is very powerful!!

Keep in mind that, unlike traditional offline renderers, Unreal is real-time, which means it comes with performance and hardware constraints. Which platforms are you developing your shaders for? That's something you have to keep in mind as you build Substrate shaders, because those platforms may not support the complex shaders you want to create. So you'll either have to work within those constraints as a limitation, or build multiple simpler shaders as fallbacks for users running your game/application on lower-end hardware.

If you want an even deeper understanding of the concepts I described in this article, or are looking to hire an expert material artist for your project, I'm open to LinkedIn DMs and emails for inquiries.

Wednesday 05.21.25
Posted by Alex Jamerson
 

Composition for the Visual Narrative

In storytelling, a picture is more than an image containing a collection of objects. Cinematic shots are purposefully composed to serve the story visually, like a stage in a play….

Read more

tags: Unity, composition, lighting, environments, cameras
categories: Behind the Scenes, Design, Realtime, Unity
Thursday 02.03.22
Posted by Alex Jamerson
 

How to Create a Bad Reception Material in Unity

The TV & its Story

See how I created this animated shader - and get a free GLB download of the CRT in the article!

Read more

categories: Behind the Scenes, Realtime, Technical Art, Unity
Tuesday 02.01.22
Posted by Alex Jamerson
 

How I Rigged Aircraft Landing Gear in 3D Software

Technical Rigging for Animation

PBY-5A Landing Gear

Deforming tires, multiple piston rigs, rotating fairings, and deforming brake lines, all working off of a single control parameter.

Based on a real-life aircraft, I’m re-creating a PBY-5A, an amphibious WW2 submarine hunter. I’m working closely with reliable references such as historical books and even a generous local museum.

Read more

tags: technical animation, Tech art, technical art, modo, aviation, animation, Rigging
categories: Behind the Scenes, Rigging, Technical Art
Sunday 09.12.21
Posted by Alex Jamerson
 

Viral Storytelling - Nvidia #SolRemix Contest Entry

I’ve been very impressed with the amount of work Nvidia has put into the RTX design marketing push. And I was extremely excited when they opened up a contest for the public to participate in marketing this great hardware, while also having the chance to win 3 incredible prizes.

The top prize is what I was most interested in: the new Razer 15” RTX Studio laptop. Even before these laptops were announced, I was hoping I might be able to live a more mobile lifestyle if I found remote work or stuck to freelance. This laptop would go a long way toward making it feel like I've got a powerful workstation, despite not sitting at home with a big box warming up my room that I can't pick up and take with me to an Airbnb in Marrakech.

The Contest Itself

Contest announcement page link:
https://www.nvidia.com/en-us/design-visualization/project-sol-contest/

I remember watching the original videos Nvidia released with the Project Sol characters, the male and female characters both dressed in ‘Iron Man’-style powered suits. At the same time, they announced this contest, where they would give away both Sol characters so the community could use them to make impressive images to post online with the #SolRemix hashtag.

Once I saw the contest and the prizes, I went back and re-watched all the short films that Nvidia released with the characters a few times. I wanted to base my own work on the tone they had established.

I also read through the official contest rules as thoroughly as possible. I decided that I wanted to do more than just pose the characters for still images - I wanted to make full-on short films, if possible. And I wanted to do it all with ray tracing, since I had purchased an RTX 2080 for my work machine and had already been working with Unreal Engine's real-time lighting.

When I read the rules, they stated that images or video could be used to submit an entry. The contest announcement page only mentions still images, so I was happy to see this information in the rules. Once I saw they were accepting video, that's when I decided to put full effort into my entries.

My First Entry: A Test Animation

Before I made a much bigger animated short, I knew that I needed to test some of the production methods I wanted to employ.

  • The biggest one was importing the Sol characters into my main 3D application (Modo).

  • Then using the supplied rigging and skinning from the Unreal import.

  • And finally, retargeting motion capture data for my own use.

I’ve done all 3 of those things before, but it’s been quite a while since I had tackled those tasks and I was also using different applications at the time.

The Unreal project provided .uasset files for the characters, and they said Maya files were also supplied. I never looked for the Maya files since I'm using Modo, but I'm sure those were very useful for Maya users.

Instead, I immediately opened the Unreal Project that they provided and exported the characters from Unreal into an FBX format. I imported this into Modo and tested the character rigging and skinning. For the most part, the rig was working well and didn’t have any major issues - except for some serious skinning issues on the hands.

But Modo, like many other applications, has a really useful way of dealing with this issue. What was happening was that the vertices on the hands were receiving far more than 100% total influence from various bones on the rig. So when I rotated the arm, the hands became badly distorted, because the bones were influencing the geometry by OVER 100% and the deformations were compounding on each other.

So I used a simple Normalization folder in Modo's deformer tab and placed the skinned geometry items inside that folder. Problem solved.
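Conceptually, the normalization folder is just rescaling each vertex's bone influences so they sum back to 100%. A minimal Python sketch of that idea (illustrative only, not Modo's API):

```python
def normalize_skin_weights(weights):
    """
    weights: dict of bone name -> influence for a single vertex.
    If the influences sum to more (or less) than 1.0, rescale them so they sum
    to exactly 1.0 - which is effectively what dropping the skinned meshes into
    a normalization folder does for you automatically.
    """
    total = sum(weights.values())
    if total <= 0.0:
        return dict(weights)
    return {bone: w / total for bone, w in weights.items()}

# A hand vertex that was receiving about 160% total influence:
bad = {"hand_L": 0.9, "index_01_L": 0.4, "thumb_01_L": 0.3}
print(normalize_skin_weights(bad))  # sums back to 1.0
```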

Retargeting Motion Capture in Modo

The next step was to do something I have never done in Modo before: Animation Retargeting.

I’ve done this in XSI: Softimage as well as Motion Builder, but never in Modo. So I looked up documentation to see if it was even possible in Modo, and to my delight, they have a whole suite of tools for this purpose, and it’s very well documented.

https://learn.foundry.com/modo/content/help/pages/animation/tools/retargeting.html

MIXAMO: A Resource for Mocap Data

Because I wanted to make a full-on short film in a very short amount of time (about 3 weeks), I knew I wasn't going to be able to do all the animations by hand myself. I love animating, but it's time consuming. So I decided I'd use mocap data as much as possible, then do my own animations where needed and layer additional animation on top of the mocap.

So Adobe’s Mixamo was an incredibly useful resource for me when undertaking this project. In the final short film, it helped me establish a pre-viz animation to get an idea if the story I wrote made since and if the shots I setup worked visually. But it also helped me have final animations in my short film. Because of Mixamo, I was able to focus on whether or not the story worked and the shots worked, rather than if my animations needed more work.

And retargeting the animations I downloaded from Mixamo worked very smoothly. I encountered no real issues using Modo’s toolset, other than a couple of minor bugs that I reported (but they in no way impeded my work).

What Was The Test Animation?

The animation itself wasn’t much, but I needed to make something quickly to test my production workflow. And I also wanted to feature the real-time raytracing in the Unreal game engine and with the RTX cards. So I designed a tunnel for the character to sort of ‘burst run’ through. Lights on the ceiling and a light at the end of the tunnel. I attached a camera to the character’s rig that was pointed very close to the characters face (for real-time reflections), and I hit play.

It’s a silly pointless animation, but I figured if I was making a test, I may as well make something I can submit. It can be seen here:
https://www.instagram.com/p/B1SlZMBnCDL/

The Final Short Film

So with production methods tested and knowing that I could do the work to the quality I wanted, I worked on a story. 

I thought it might be best to create more “GIF-style” short films rather than any other kind of shorts - something comedic and light-hearted that people would find funny and maybe retweet, so it would hopefully get a lot of attention because of its tone and style.

I had one very short story that I thought would be funny: the male character in white armor is at a coffee bar, a coffee slides into frame with his name written on it, then he tries to drink it and the coffee spills everywhere.

I knew I could develop the VFX needed for the story. I have a strong background in creating complicated shaders, including those that need to be animated.

Houdini liquid simulation:
https://www.instagram.com/p/B1rVLBYHm7f/

The Final Final Short Film

I got as far as creating that liquid spill simulation when I came up with a different idea: a vending machine of video cards, where the male character is trying to retrieve a stuck item, then the female character comes into frame and solves the problem he can't overcome.

I decided to temporarily (hopefully) abandon the coffee spill idea for this new story. I thought this story was a much better choice because:

  1. It includes both Project Sol Characters.

  2. It continues the tone that Nvidia established with their short films: The male character thinks he is awesome, but he’s really just goofy and inept, especially relative to the female character who makes him look silly.

  3. It had better branding potential with the Nvidia vending machine.

  4. It has more funny potential!!

Once I wrote the story and set up a pre-viz animation, I knew the animation would come out to around a minute long. I wanted it to be under the time-limit restrictions for Instagram and Twitter video uploads. I didn't want to upload to YouTube and then put a link in a tweet, because I knew that simple barrier often stops people from engaging with the material at all. I wanted the animation to be playable on the platform people are viewing it from, to eliminate any loss of retention and engagement.

Thinking Marketing

But I also knew, based on the contest rules, that you could submit as many entries as you wanted, although you could only win with one entry. So I decided I would take my one-minute short film and design the story so that I could cut pieces out of it as simple 'GIF-style' short clips.

“So even though I only made one short film that was about a minute long, I actually ended up with 5 entries total”

For example, the very beginning: he selects ‘A1’, the machine displays ‘Vending…’, the video card drops, gets stuck, and we see his ‘WTF!?’ reaction.

That by itself is one clip that I entered into the contest, and I did this 4 more times. So even though I only made one short film that was about a minute long, I actually ended up with 5 entries total that could each get various amounts of attention depending on what took off best. Five pieces of content from just one piece of content. MARKETING!!! :-D

The Vending Machine Design

I got to work designing the vending machine. This was simple; I looked at a bunch of vending machine references, including modern complicated ones that actually do vend electronics. But I decided instead to go with a really old design that everyone is familiar with, then just loaded it with video cards.

I wanted to make something that said, “NVIDIA!!! RTX!!!”

Here's the turntable for the vending machine model I made for the short!! pic.twitter.com/JvSbQVplr4

— Alex J (@DevDink) September 27, 2019


After all, the point of this contest, for Nvidia, was to market the RTX campaign and make people aware of the products they and their affiliates are selling. So I wanted to help with that in my entry.

I modeled the vending machine as quickly as possible in Modo, UV mapped it, and gave it a few different material IDs to increase texel resolution - especially in areas I knew the camera would have close-up shots of, such as the display screen. I did all the texturing in Substance Painter, aside from the “Insert Bill” floating decals on the input panel, which were done simply in Photoshop.

“The reason why I chose to have the stuck video card swing infinitely in an impossible way is because I wanted it to be like “Finger-Wagging.” The vending machine is saying ‘No, you can’t have it.’ ”

For good measure, to make absolutely sure no one is confused about what this big box is, I added a lit sign on top that says it's a vending machine for Nvidia graphics cards. I'm not trying to talk down to the audience, but I honestly have no idea how briefly people will view these clips; some of them may only be a few seconds long or have no setup to help viewers understand everything in the scene. That's why it was important to put the vending machine sign atop the machine. I just wanted to eliminate any confusion, while also keeping the design believable.

Also, because of how the story was visualized, the vending machine became a character and a villain to our main character. The reason I chose to have the stuck video card swing infinitely in an impossible way is that I wanted it to be like “Finger-Wagging.” The vending machine is saying ‘No, you can’t have it.’ And no matter how forcefully the character goes at the vending machine, it doesn’t budge at all, and the video card just keeps swinging. The vending machine is bullying and teasing our character!

The coils used for pushing out items are all static, except for the two in the upper left corner that push out the Titan RTX the character orders. Those are animated inside the Unreal Engine Sequencer to rotate individually, and I also animated the video cards in that row to move forward.

And obviously, the item that needs to drop is animated inside Unreal Engine. The animations for the item were all done in the Sequencer: the push forward, the drop, getting stuck, the continuous swinging, and then the final drop into the slot.

I also made the item retrieval flap at the bottom of the machine capable of being opened so that I could open it when the character sticks his hand in the machine. It does open unrealistically - it opens in a way that makes it easier to stick a hand into the machine - but realism isn't important here, and I needed to suspend it for the sake of the story.

In conclusion, I wanted to make sure that if people did retweet my entries, that everywhere these videos got retweeted, people knew it was about Nvidia’s RTX cards. And I think the Vending Machine design accomplished that, so I’m happy with how it ended up. 

Here’s the finished short film in its entirety. Note that this is one of 5 clips I published as entries into the contest:

Here is the short film I made in its entirety as my main entry for the #SolRemix contest! This was so much fun to make! Thanks for the opportunity, @NVIDIACreators!!! #RTXon pic.twitter.com/NHaiGko2Vm

— Alex J (@DevDink) September 27, 2019


Reception

The reception of my entries was very satisfactory! On my Instagram and Twitter pages, they got a minor response. I'm more active on Instagram in my daily life, so I got more attention there, but it still wasn't much. I pretty much never use Twitter, so at first my entries got little attention there.

However, Nvidia retweeted my entry.

Who knows that feel? 😢

Created by @DevDink in @UnrealEngine for our #SolRemix challenge. #UE4 pic.twitter.com/xSWULJh8Ka

— NVIDIA Studio (@NVIDIACreators) October 3, 2019

Once Nvidia retweeted one of the clips, that’s when things exploded. My entry got nearly 500 likes, over 70 retweets, and over 30 responses. That’s way more than I was expecting!!

This is especially good because it's way more than most of the other entries received. And to be honest, I was quite worried when I saw some of the other entries, as there were a decent number of impressive ones. I'm also not the only one who submitted animations, and mine weren't as complex and polished as some of the others. So I'm happy people really seem to be noticing and liking my entry!

So Why So Much Impact?

I believe my entry did so well for a few reasons. One is simply that Nvidia retweeted it.

But another reason is that it's a pretty universally relatable situation. If I go up to someone and tell them that something I bought got stuck in a vending machine, they're going to say it's happened to them too.

There’s a point to the story: It’s not just the two characters doing something cool like driving a car really fast or fighting with lasers and explosions. There’s a clear and relatable reason to the story I put forward, and people can relate to that struggle. No one likes when things we want get stuck in vending machines and it’s happened to all of us. It’s like the machine is taunting us!

And vending machines, especially basic ones, are found throughout the world. The story is also clearly directed and designed in the visuals: the vending machine is simple and easy to understand, the branding throughout the machine is clear and easy to read even with video compression, and the character's actions and desires are easily understood because of the way I shot and directed it.

Conclusion

In the end, I’m very proud of my entry. I wrote and pre-viz’d a funny story, modeled and textured assets, developed an animation production workflow that included mocap retargeting and hand-animated assets, and my entry got a lot of attention.

Could the animations be better? Sure. Could the environment it all takes place in be more detailed and fun? Definitely. But the point of the animation was to tell a funny story, so I prioritized that above all. Once I completed the entire animation project in the most minimal way, any spare time left over went toward going back to polish animations or do more lighting and environment work. You can't get bogged down spending days animating a character's fingers if, in the end, most people won't notice at all - that can prevent you from finishing the entire project within the deadline you have. Get the most important stuff done first, then go back and polish the 'nice to have' stuff.

“Get the most important stuff done first, then go back and polish ‘nice to have’ stuff.”

I don’t believe Nvidia has contacted any winners yet, they certainly haven’t made any announcements. But I’m keeping my fingers crossed because I would love that laptop and I definitely could use it. Either way, I’m proud of my animation. 

Friday 10.25.19
Posted by Alex Jamerson
 