Unreal Engine's Not-So-Secret Features
MipMap #018: Assembling Techniques for Exemplary Environment Art
With the launch of Unreal Engine 5.4 last month, my attention has been heavily focused on the not-so-obvious techniques available to artists.
There is no doubt that the engine is suited for creating video games, but recently Epic has been pouring a lot of resources into making their flagship product more viable for every industry they can feasibly enter. These efforts have produced such innovations as Nanite and Lumen, the two feature sets that dominated the conversation when Unreal Engine 5 launched.
However, Unreal Engine has always been able to produce stunning renders through a handful of features that aren’t as flashy as Nanite or Lumen. In fact, these features are still ridiculously powerful (when implemented correctly) and serve as the foundation on which both Nanite and Lumen can excel even more.
It’s when everything is put together, when all of Unreal Engine’s bells and whistles are whirling at their fullest potential, that we see the amazing environments that artists and developers are able to assemble.
Thus, this edition of MipMap is dedicated to providing an overview of the more prominent techniques available in Unreal Engine, starting with the big, flashy elephants in the room.
The Big Boys: Nanite & Lumen
Nanite and Lumen are perhaps the biggest innovations we’ve seen in real-time engines in quite some time. The sheer potential that these two systems unlock is extremely powerful.
The problem comes with how novel the technology is. While Epic Games has been hard at work tweaking and adjusting the ways in which both systems work, they aren’t bulletproof. Far from it, in fact. But let’s talk a little more about each in order to understand the full picture.
Nanite is a virtualized geometry system that dynamically changes the resolution of a mesh depending on proximity and visibility. Move closer to an object with Nanite enabled and the mesh is rendered with more polygons; move farther away, or let it fall out of sight, and that number scales down to save processing power.
There is a bit more to it than that, but for our purposes that’s all we need to know. What we also need to know is that this process works fundamentally differently from how games used to handle high-resolution meshes.
Previously, games would load in objects that had several meshes attached to them, called LODs (Levels of Detail). Stand farther away from an object and a lower-resolution LOD was loaded in its place. This required developers and artists to create several LODs by hand and tell the engine how to handle switching between them. The process was labour-intensive, but it gave developers a large amount of control when it came to optimizing the performance of their games.
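To make that older workflow a bit more concrete, here’s a minimal sketch of distance-based LOD selection in Python. The mesh names, triangle counts, and distance thresholds are all made up for illustration, and real engines typically switch based on screen size rather than raw distance, but the idea is the same.

```python
# Minimal sketch of hand-authored LOD switching (the pre-Nanite approach).
# The LODs and distance thresholds below are invented for illustration.
from dataclasses import dataclass

@dataclass
class LOD:
    mesh_name: str       # which mesh variant to render
    triangle_count: int  # rough cost of that variant
    max_distance: float  # farthest camera distance (meters) this LOD is used at

# Artists author each LOD by hand, from most to least detailed.
CAR_LODS = [
    LOD("car_LOD0", 120_000, max_distance=10.0),
    LOD("car_LOD1", 30_000, max_distance=30.0),
    LOD("car_LOD2", 6_000, max_distance=80.0),
    LOD("car_LOD3", 800, max_distance=float("inf")),
]

def pick_lod(lods, distance_to_camera):
    """Return the first LOD whose max_distance covers the camera distance."""
    for lod in lods:
        if distance_to_camera <= lod.max_distance:
            return lod
    return lods[-1]  # fall back to the cheapest mesh

print(pick_lod(CAR_LODS, 5.0).mesh_name)   # car_LOD0 (close up, full detail)
print(pick_lod(CAR_LODS, 50.0).mesh_name)  # car_LOD2 (far away, cheap mesh)
```

Nanite effectively automates this whole ladder, which is exactly why it takes away some of that hand-tuned control.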
Nanite doesn’t allow that same amount of control yet. Each iteration of Unreal Engine notes that Nanite performance has increased, and that is always something to cheer for. However, we can’t simply give up all other techniques and rely solely on Nanite to handle all of our problems. Not yet, at least.
Meanwhile, Lumen is sort of in the opposite place.
As a global illumination system, Lumen is meant to capture the performance and capabilities of real-time lighting while providing a result that looks as close to ray-traced as possible. In other words, we want the light to look as realistic as possible without needing to wait minutes or even hours for frames to fully render.
As opposed to Nanite, Lumen checked off all of the boxes from day one. It has never been easier to create good-looking lighting in a real-time engine.
The issue arises when you try to take that step beyond “good” and into “stunning.” While Lumen massively reduces the amount of parameter adjusting you need to do, it doesn’t just build the lighting you’re looking for. You still need to do that yourself.
So where are we at the end of all of that? With two insanely powerful systems that still require an artist to implement them correctly.
If we’re not at the point of one-clicking our assets into a beautiful scene, maybe we can take a step back and look at how virtual scenes were created before Nanite and Lumen stole the limelight.
Techniques of Yore
Yore might be an exaggeration, but it does feel like these techniques are often overlooked in favour of the impressive new technology added with every version of Unreal Engine. Let’s start off in familiar territory.
Decals
Literally just stickers.
OK, there’s a bit more to them than just being stickers, but a well-placed decal is easily mistaken for real geometry and texture work. A decal is an image that is placed in 3D space and projected onto a mesh. These images aren’t just simple PNGs slapped onto walls and floors; they can be fully fledged materials with different maps, including roughness, normals, and opacity, in order to truly sell the illusion of detail.
If you aren’t looking for them, decals can be hard to spot, which usually means they’ve been applied correctly. Any time you need to create detail without adding additional geometry or textures, decals should be your go-to option.
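If it helps to picture what “projected onto a mesh” means, here’s a rough, hypothetical sketch of the underlying idea: treat the decal as an oriented box in space and map any surface point that falls inside it to a UV coordinate in the decal’s material. The function and numbers are mine, not Unreal’s.

```python
# Hypothetical sketch of decal projection: conceptually, a decal is an oriented
# box in world space, and surface points inside it get mapped to UVs in the
# decal's material. Names and numbers here are mine, not Unreal's.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def decal_uv(point, origin, right, up, half_width, half_height):
    """Return (u, v) in [0, 1] if `point` falls under the decal, else None.

    `right` and `up` are unit vectors spanning the decal plane,
    half_width / half_height are in world units (e.g. meters).
    """
    local = tuple(p - o for p, o in zip(point, origin))
    x = dot(local, right)  # offset along the decal's width
    y = dot(local, up)     # offset along the decal's height
    if abs(x) > half_width or abs(y) > half_height:
        return None        # outside the decal's footprint
    # Remap from [-half, +half] to [0, 1] texture space.
    return (x / (2 * half_width) + 0.5, y / (2 * half_height) + 0.5)

# A 2 m x 1 m decal on the ground plane, centered at the origin:
print(decal_uv((0.5, 0.25, 0.0), (0, 0, 0), (1, 0, 0), (0, 1, 0), 1.0, 0.5))
# (0.75, 0.75)
```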
Vertex Painting
Every mesh is made up of polygons with at least three edges, each defined by its own pair of vertices. When we map our meshes out into UVs, we’re still working with those vertices, just in 2D space. As such, we can paint a mask onto the mesh (or its UVs) in order to blend between two or more materials.
Why do we want to do this? Most objects, or even components of objects, aren’t made of just one material. There’s a plethora of differences to incorporate, such as blemishes, burns, scrapes, wear, tears, and more. Being able to seamlessly blend between all of these is key to making objects look not only realistic, but believable as well.
If your scene is looking flat, make sure to break up the monotony of your textures to breathe some life into it.
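To show what the blend itself boils down to, here’s a tiny sketch of the math the material performs with the painted mask. In Unreal you’d typically wire this up with a Lerp node driven by the Vertex Color node; the material values below are made up.

```python
# The math behind a two-material vertex-paint blend: a painted value of 0
# shows material A, 1 shows material B, and anything in between mixes the two.
# Values below are made up; in Unreal this is a Lerp node fed by Vertex Color.

def lerp(a, b, t):
    """Linear interpolation between a and b by factor t in [0, 1]."""
    return a + (b - a) * t

def blend_materials(mat_a, mat_b, mask):
    """Blend every channel of two materials using the painted mask value."""
    return {key: lerp(mat_a[key], mat_b[key], mask) for key in mat_a}

clean_plaster = {"roughness": 0.5, "metallic": 0.0}
grime         = {"roughness": 1.0, "metallic": 0.0}

# A vertex painted at 0.25 picks up a hint of grime:
print(blend_materials(clean_plaster, grime, 0.25))
# {'roughness': 0.625, 'metallic': 0.0}
```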
Virtual Textures
Unreal Engine provides a specific system for taking the above-mentioned vertex painting to the next level. You might be able to put time and effort into blending textures for an object, but what about landscapes? Theoretically, we could apply the same approach to blending between landscape materials, but just blending between materials often isn’t enough to make a landscape believable.
Enter Virtual Textures. This tool allows you not only to blend between materials, but also to assign different foliage and debris to populate the same areas. Add in Nanite for high-quality deformation, and you have yourself a very believable landscape.
But Wait, There’s More
More information, not necessarily techniques.
To wrap this edition up, I’d like to discuss three bits of theory for elevating your Unreal Engine scenes.
Trimsheets
Trimsheets are textures made up of multiple sections, each of which can tile so that it extends well past the texture’s pixel resolution. Wherever an object needs something additional to break up its simplicity, a trimsheet can provide that necessary detail.
Trimsheets can be thought of as younger siblings to tileable textures. Instead of needing to tile in both dimensions, we only need to tile in one. As such, we have a whole extra dimension of space to add in additional textures, even if they aren’t related to the initial object.
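As a rough sketch of that one-directional tiling: each strip of the trimsheet owns a fixed band of V, while U is free to repeat as many times as the surface needs. The strip layout below is entirely invented.

```python
# Sketch of mapping geometry onto a trimsheet strip: V is pinned to the strip's
# band of the texture, while U tiles freely along the length of the surface.
# The strip layout here is invented for illustration.

TRIM_STRIPS = {
    # name: (v_start, v_end) as fractions of the full texture height
    "baseboard":  (0.00, 0.25),
    "wall_panel": (0.25, 0.75),
    "crown_trim": (0.75, 1.00),
}

def trim_uv(strip_name, along, across):
    """Map a point on the surface to trimsheet UVs.

    `along`  - distance along the trim in tiling units (can exceed 1.0)
    `across` - 0..1 position across the strip's width
    """
    v_start, v_end = TRIM_STRIPS[strip_name]
    u = along % 1.0                           # tile endlessly in U
    v = v_start + across * (v_end - v_start)  # stay inside the strip in V
    return (u, v)

# Two and a half tiling units along a baseboard, halfway across the strip:
print(trim_uv("baseboard", along=2.5, across=0.5))
# (0.5, 0.125)
```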
There aren’t any hard rules for creating trimsheets, and it can be tricky to plan them out. It’s worth going through the blocking and initial modeling phases of creating an asset or kit before deciding what segments to include in a trimsheet.
Texel Density
This one used to bite me in the behind anytime I reached the texturing stage.
Once we’ve finished UVing a model, usually we just slap a texture on it and then plop it into the scene. When you stop and think about that process though, you begin to realize that you might want to put a bit more thought into it.
Each model you place in a scene occupies a set amount of space. A car occupies more space than a banana, so we don’t necessarily need to give the banana the same amount of texture space as we give the car. This is texel density, and it’s calculated by dividing the resolution of the texture (say, a 4K texture) by the real-world size of the surface its UVs are stretched over, commonly expressed in pixels per meter (or centimeter).
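Here’s a quick worked example of that division, using the pixels-per-meter convention (actual target values vary from project to project):

```python
# Texel density = texture resolution / world-space size of the surface it covers.
# Numbers below are examples only; target densities differ between projects.

def texel_density(texture_pixels, surface_meters):
    """Pixels of texture per meter of surface along one axis."""
    return texture_pixels / surface_meters

# A 4K (4096 px) texture stretched across a 4 m wall:
print(texel_density(4096, 4.0))   # 1024.0 px/m

# The same 4K texture on a 0.2 m banana is wildly overkill:
print(texel_density(4096, 0.2))   # 20480.0 px/m

# To keep both at ~1024 px/m, the banana only needs about a 205 px texture:
print(1024 * 0.2)                 # 204.8
```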
This isn’t something you have to do by hand, thankfully. There are plugins and add-ons that will calculate this value for you, and some will even resize your UVs to match a target value you set for the scene. I’m not aware of Unreal Engine having a built-in feature for this, but you’ll likely be making your assets in something like Blender or Maya anyway, and both have add-ons that will do it for you.
Modularity
Modularity extends past the basic idea of making things fit together. It means explicitly planning how you create your assets so that even pieces that weren’t designed to go together still can.
In practice, this means making objects like wall and floor segments the same size, even if you have multiple variations. Doors are exactly the same height across styles (even broken ones), and windows fill the same gaps.
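As a small, hypothetical sketch of how that plays out: pick a module size up front and snap every placement to it. The 400 cm module and the positions below are made up; in practice the editor’s grid snapping does this for you interactively.

```python
# Sketch of modular placement: if every wall/floor piece is authored to the
# same module size, laying out a level is just snapping to a grid.
# The 400 cm module and the rough positions are hypothetical.

MODULE_CM = 400  # every modular piece is authored to a multiple of this size

def snap_to_grid(position_cm, module=MODULE_CM):
    """Snap a world position (in cm, like Unreal units) to the modular grid."""
    return tuple(round(axis / module) * module for axis in position_cm)

# Roughly placed wall segments all land cleanly on the grid:
rough_placements = [(395, 10, 0), (810, -5, 0), (1188, 3, 0)]
for pos in rough_placements:
    print(snap_to_grid(pos))
# (400, 0, 0)
# (800, 0, 0)
# (1200, 0, 0)
```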
This grid-based approach makes laying out environments incredibly easy; however, it can make the scene look bland. This is where decal placement, vertex painting, and virtual textures come in to make your scene feel more realistic. Because in reality, most objects do start out exactly the same, but each experiences unique conditions that make it look the way it does.
Outro
Hopefully some of these techniques and bits of theory can help you improve your 3D scenes, whether within Unreal Engine or not. Especially the theory; most of this isn’t exclusive to Unreal Engine. Well, maybe Nanite and Lumen are, but I’m sure other applications have (or are working towards) similar feature sets.
As it turns out, Project TITAN is moving very quickly, and I don’t quite think I’m going to be able to get my stylized skills up to snuff in time to participate. However, the folks in the Discord have been kind enough to let me observe their ways. I think towards the end of the project I’ll do a write-up on the production methods being used.
Until next week though, stay innovative!
- Adam