Creating Procedural Masks with Render Targets in UE4

My goal was to generate a mask from the features of a level, which could then be used to control the parameters of a landscape material.

The mask, which looks like this:


is used to create an effect like this:


This is a very basic example of how I have used the technique, but the same method could be used to:

⦁ Blend Static Meshes into terrain
⦁ Automate Snow/Rain/Dirt/Puddle buildup
⦁ Melt Snow in sunlight
⦁ Burn/Melt the ground from magic effects and characters
⦁ Pile up Sand Dunes from a prevailing wind direction
⦁ Spread Grass/Moss/Foliage

You are only limited by your inventiveness!

This guide will focus on creating masks from static meshes to control how materials and displacement maps are applied to a Landscape. So without further ado, you will need:

⦁ Scene Render Capture 2D
⦁ Texture Render Target
⦁ Material
⦁ Landscape
⦁ Some Static Meshes
Step 1. Setting up the Material

Set the material to “Use Material Attributes”, then create a “Mat Layer Blend Standard” node. This node blends two materials, using a heightmap as a mask for the second layer. Create two “Make Material Attributes” nodes and set these up as you would for a standard material. Create a “Landscape Coords” node, set the mapping scale to 1, and plug its output pin into the “UV” pin of each texture node. Notice that the “Make Material Attributes” node is almost identical to the regular Material Output node, except that it has an output pin which can be plugged into the “Layer Blend” node. Plug the output of the “Layer Blend” into the “Material Output” node. The preview should now display the material plugged into the “Base” pin of the “Layer Blend”.
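To make the blend behaviour concrete, here is a toy Python sketch of a height-biased lerp, the idea behind the “Mat Layer Blend Standard” node. The function name and `contrast` factor are invented for illustration; the real node's internal formula differs.

```python
# Hypothetical sketch of a height-based layer blend: the height map biases
# the mask so the top layer appears first in the "high" areas of the surface.

def height_lerp(base, top, height, alpha, contrast=4.0):
    """Blend two layer values, using a height sample to sharpen the mask."""
    # Push the mask toward 0 or 1 depending on the height sample,
    # then clamp it back into the valid [0, 1] range.
    mask = max(0.0, min(1.0, (alpha - (1.0 - height)) * contrast + alpha))
    return base + (top - base) * mask

# A full mask always shows the top layer; an empty mask always shows the base.
print(height_lerp(0.2, 0.8, 0.5, 1.0))
print(height_lerp(0.2, 0.8, 0.5, 0.0))
```

With an intermediate alpha, higher `height` samples flip to the top layer sooner, which is what gives the blend its crisp, height-following edge.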

Step 2. Prepare the level
Create or import a landscape heightmap; this does not need to have any height information. Apply the previously created material to the landscape. Place static meshes (these can be anything) around the level; you can scatter them, or place them as if you were building an actual level. Add a skylight so that you can see the level with basic lighting.

Step 3. The Scene Render Capture and Target
Create a Texture Render Target.
You will need to make some modifications to the default render capture, so begin by making a child blueprint or a copy of it. Edit the properties of the render capture so that it does not capture a render every frame. Click “Show Flags” and uncheck every box apart from “Static Meshes”, “Landscape” and “Skylight”, and set the field of view to 1. Rotate the scene render capture so that it is facing down. Add a static mesh plane, set its scale to 10,000, move it to -10,000 on the Z-axis, and apply an unlit black material to it; this creates the “black” part of the mask. Add an editable “Landscape” variable and two floats, then, in the construction script, set the X and Y coordinates to the center of the landscape. Minimize the blueprint and go back to the main window. Set the target of the render capture to the “Texture Target” you created previously. Increase the “height” variable of the render capture until the landscape fills the texture target with no white borders, then uncheck the “Landscape” show flag.
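The placement maths from the construction script can be sketched outside the engine. This is a hedged illustration, not the actual Blueprint: all names and the example landscape size are invented, and it simply centres the capture over the landscape and estimates the height at which a 1-degree field of view covers the whole thing.

```python
import math

# Hypothetical sketch of the construction-script placement: centre the scene
# capture over the landscape, and compute how high it must sit for a narrow
# field of view to frame the entire landscape.

def capture_transform(landscape_origin, landscape_size, fov_degrees=1.0):
    ox, oy = landscape_origin
    sx, sy = landscape_size
    center = (ox + sx / 2.0, oy + sy / 2.0)
    # half-extent / tan(half-FOV): the height needed to fit the landscape
    half_extent = max(sx, sy) / 2.0
    height = half_extent / math.tan(math.radians(fov_degrees / 2.0))
    return center, height

center, height = capture_transform((0.0, 0.0), (50400.0, 50400.0))
print(center, height)
```

Because the FOV is so narrow, the required height is enormous, which is also why such a capture behaves almost like an orthographic top-down projection.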

Step 4. Using the mask in the material
Drag the render target into the material and plug its output pin into the “Alpha” channel of the Mat Layer Blend. Create a “Landscape Coords” node and set the mapping scale to match the resolution of the landscape. If you go back to the editor window, you will notice that the second material layer has been applied somewhat randomly to the landscape below. There is a simple fix for this, however! Select the scene capture actor and change its rotation in 90-degree increments until you find a rotation where the pattern lines up with the static meshes underneath. If this is not perfect, you may want to adjust the height of the scene render capture slightly, or change the mapping scale slightly in the material node.

Step 5. Dilating the mask
So far so good, except the mask is not actually very useful, because the area it paints is mostly hidden beneath the static meshes themselves. To get around this, we need some way of expanding the mask so that it bleeds out from the static meshes, ideally with a gradual falloff. Instead of blurring the mask in the material, which would require more instructions and potentially some custom code, let's head back into Blueprints.
Toggle off “capture on movement” in the scene capture actor. Back in the construction script, create a Branch node with an input boolean variable called “Recapture Mask”, set to false by default. From the True pin, set “Recapture Mask” back to false. Then, from the render capture object, drag out a pin and set the capture mode to “Additive”. After that, make a For Loop, set the first index to 0, and plug a variable called “Exposure” into the last index. Inside the loop, trigger a render capture. From the “Completed” pin of the loop, set the capture mode back to “Replace”. This lets us trigger when the render capture is fired, and set the number of frames that should be added together to produce the final mask. Setting the mode back to “Replace” on completion ensures that the next time we tick “Recapture Mask”, the results of the new capture replace the old rather than adding to them.

Inside the For Loop, after each capture, add a random vector within a range (“Blur Amount”) to the location of the scene capture component. This jitters the camera, and with a high number of exposures emulates a blur effect.
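The jittered accumulation can be simulated in plain Python to see why it dilates the mask. This is a toy sketch, not engine code: a 1-D list stands in for the render target, every name is invented, and each “exposure” is the same mask sampled at a random offset within the blur amount, then averaged.

```python
import random

# Toy simulation of the additive jittered captures: averaging many randomly
# shifted copies of a hard-edged mask softens and spreads its edge.

def jitter_blur(mask, exposures, blur_amount, seed=0):
    rng = random.Random(seed)  # fixed seed for a repeatable result
    size = len(mask)
    accum = [0.0] * size
    for _ in range(exposures):
        offset = rng.randint(-blur_amount, blur_amount)  # camera jitter
        for i in range(size):
            j = min(max(i + offset, 0), size - 1)  # clamp at the edges
            accum[i] += mask[j]
    # Normalise the additive result back into the 0..1 range.
    return [a / exposures for a in accum]

hard_edge = [0.0] * 8 + [1.0] * 8
soft_edge = jitter_blur(hard_edge, exposures=64, blur_amount=3)
print(soft_edge)
```

Far from the edge the values stay 0 or 1, while near it they take intermediate values, which is exactly the gradual falloff we want bleeding out from the static meshes.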


This method is cheap and effective. The quality of the mask can be changed by scaling the size of the texture render target, and by increasing the number of captures that are blended together using “Exposure”. The amount of dilation/blur is controlled by the “Blur Amount” parameter. Any masks created this way can be saved, exported and edited in an external program.
This method would require some modification for widespread use in interior scenes. It would also require some modification in order to capture specific types of meshes. But those steps are a little outside the scope of this simple tutorial, so I will leave them for another time.

I hope that you have found this tutorial helpful. If you can think of any exciting ways of using this technique, please let me know, and if I have not been clear enough on any details, don’t hesitate to leave a comment.


Flamethrower Effect

Doing some work recently on a flamethrower effect for a client. This is the progress made over the course of two days. I completely changed my approach in the latter half of the second day, and I think the results improved considerably.

Day 1 

Transparent spheres with refraction, and a flipbook sprite animation baked from a fluid simulation in Maya, to create the ‘flames’.


  • Overall the effect looks quite naive and unrealistic
  • Can see clear repetition of the fire sprite
  • Something odd about the ‘spit’ droplets
  • The fire sprites rotate to face the camera, which looks odd in VR
  • There is no smoke coming from the fire

Day 2 Morning

Looked at reference material and tried to copy the motion of the fire more accurately, switched from translucent blending to additive. Locked the ‘spit’ spheres to uniform scaling so that droplets were all perfectly spherical.


  • There is still repetition of the fire sprite
  • Sprites still rotate to face camera
  • The color of the fire is too uniform and does not match the reference
  • No smoke

Day 2 Afternoon

Started using the Blackbody node in the Material Editor, driving the temperature with the red channel of a Particle Colour node. This was done to more accurately emulate the falloff in heat over time.


  • Repetition is less noticeable with the colour gradient in effect
  • The rotation is still an issue
  • The fire looks like it is dissipating and smoking more believably

Day 2 Evening

Studied reference material more closely, and observed that flamethrowers seem to be made up of lots of smaller explosions, with flames growing outwards spherically. Replaced the flipbook sprite with tessellation-distorted spheres that grow over time. The tessellation uses procedural noise which pans in the +Z direction (upwards). There is a constant acceleration in the -Z axis to mimic the effect of gravity but, over the life of each particle, a force in the +Z direction grows until it overcomes the gravity, mimicking the convection of the fire.
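The gravity-versus-convection idea can be sketched numerically. This is a hedged illustration with invented constants, not the values used in the actual particle system: a constant downward acceleration is integrated against an upward force that grows linearly over the particle's life.

```python
# Sketch of the vertical forces on a flame particle: gravity is constant,
# while an upward "convection" force grows over the particle's life until
# it wins, so particles droop at first and then rise.

GRAVITY = -980.0          # constant -Z acceleration (illustrative)
CONVECTION_RATE = 2500.0  # upward force grows linearly with particle age

def vertical_velocity(life, dt=0.01):
    """Integrate vertical velocity over a particle's life."""
    vz, t = 0.0, 0.0
    while t < life:
        accel = GRAVITY + CONVECTION_RATE * t  # convection overtakes gravity
        vz += accel * dt
        t += dt
    return vz

# Early in life the particle is still falling; later convection dominates.
print(vertical_velocity(0.2), vertical_velocity(1.5))
```

With these constants the crossover happens partway through the particle's life, which is what produces the characteristic arc-then-billow motion of the flames.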


  • There is still repetition, although it is less noticeable
  • As the fire uses geometry, the screen-facing rotation is no longer an issue
  • The fire looks much closer to the source images.
  • The fire looks good from multiple angles
  • The fire is expensive to render due to lots of overlapping transparency, panning procedural noise and tessellation.

In Conclusion

There is still a bit of work to be done in regards to optimizing the effect for virtual reality. But overall I am pleased with the visual result that I have achieved so far, and I think my approach would be suitable for a variety of effects, particularly those that are intended to be seen in VR headsets where a more ‘volumetric’ look might be more desirable than a flat animated sprite.

Perhaps combining this approach with smaller sprites could produce a greater effect!

The effect isn’t perfect, and I would love to hear from people with criticism. However, it goes to show how much of an improvement can be made in very little time if you are willing to experiment and pay close attention to reference material!

Reactive Winter/Spring Environment

This is some work I did over the course of 3 days with the exciting JunoVR, who are working on spaces and tools for a mindful life. The brief was to create a beautiful natural environment that responds to the player's breathing. The 3-day time-frame was tight, so I relied on assets that were provided for me, assets downloaded from the Megascans website, and Megascans Mixer, which I used to create some quick, realistic ground and snow materials.

The bulk of my time was spent working on shader effects for reactive wind, particle effects, blending between materials and creating believable breath and snow motion through the use of vector fields.

The scene was intended to give an idea of what sort of projects JunoVR will create in the future, and sets the groundwork for future collaboration, which I am very happy to say will be occurring soon!


There’s a lot that could be improved on, and I can’t wait to explore the possibilities as we move forwards!


Congress – Northern Sonoran Desert – Arizona

I threw together a scene quickly using the Arizona Desert assets from the Unreal Marketplace. This was a small paid job, done to help an individual developer meet an encroaching deadline. Due to the severe time constraints, no custom assets were used, with the exception of the dirt ground material, which I developed using Megascans Mixer. It was a good intro to Megascans, and it was fun working with a complete pack of assets.


Although I did not have time to model anything myself, I explored the area using Google Maps and created a reference sheet. It would be fun to have a stab at this environment with custom foliage and a higher level of realism!



Some of my favourite articles

So this is a bit of a cop-out from writing myself, although I would like to write up and properly structure some of my thoughts on play, storytelling and world-building at some point.

Here are some of my favourite articles that relate to various different aspects of video games and the thought processes involved in designing and critiquing them.





Planet Earth II

In anticipation of the release of the new series of Planet Earth (which was absolutely incredible), I watched the underwater episode from the first series. One of the things that particularly stuck with me was the underwater vents, so, feeling inspired, today I had a quick go at creating them as a particle effect.


So this is what I have achieved so far. It’s basically a bunch of spheres spawning very quickly, scaling with their life, with a drag force applied. On top of that, there is a panning heightmap fed into the World Position Offset node to create the illusion of some secondary motion. Finally, there is a sparkly 512 specular map and a panning 512 bump map, generated from procedural noise, to give some added fidelity.
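The scale-with-life and drag behaviour can be sketched as a tiny simulation. This is an illustrative guess at the particle logic, not the actual Cascade setup; the drag coefficient and step count are invented.

```python
# Sketch of the vent particle motion: each sphere's scale grows linearly
# over its life, while proportional drag bleeds off its initial velocity.

DRAG = 2.0  # illustrative drag coefficient

def simulate(initial_speed, lifetime, steps=100):
    dt = lifetime / steps
    speed, scale = initial_speed, 0.0
    for i in range(steps):
        speed -= DRAG * speed * dt   # drag opposes the current velocity
        scale = (i + 1) / steps      # scale grows linearly with life
    return speed, scale

final_speed, final_scale = simulate(initial_speed=100.0, lifetime=3.0)
print(final_speed, final_scale)
```

The result is the characteristic vent look: particles leave the spawn point fast, slow down as drag takes over, and swell as they rise.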

The world position offset caused seams to become visible in the mesh, so I am currently combating this by rotating the mesh to perpetually face the camera, which does alleviate that issue, but also seems to have an unusual effect on the lighting.

I quickly sculpted some coral-like geo and threw them together to create the test scene above.

There is heavy fringing, some dof, and a bit of grain to suggest an underwater environment.

To improve the scene I would spend more time modelling different coral and rock variations with better textures, add some more ‘wispy’ elements to the smoke, add some heat distortion, and perhaps even a bit of ocean wildlife.






Windy Grass

I’m just playing around at the moment with wind and foliage in Unreal. The wind in Unreal is pretty simple, but I want to come up with a way of doing trunks, then branches which inherit movement from the trunks, then leaves, and so on, each with their own layer of wind on top.

So far I have only modelled some very simple foliage and applied a little SimpleGrassWind to it, driven by scalar values held in a material parameter collection. I wonder if there is a way to control the direction of the wind…
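One possible answer, sketched here as plain maths rather than a material graph: store a wind direction and strength in the parameter collection, and phase the sway by the dot product of the wind direction with each vertex's world position, so the wave travels along the wind. Everything below is an assumed illustration, not how SimpleGrassWind actually works.

```python
import math

# Hypothetical directional wind: the sway phase travels across the world
# along the wind direction, and vertices are offset along that direction.

WIND_PARAMS = {"direction": (1.0, 0.0), "strength": 5.0, "speed": 2.0}

def wind_offset(world_pos, time, params=WIND_PARAMS):
    dx, dy = params["direction"]
    # Dot product of position with wind direction gives a travelling phase.
    phase = world_pos[0] * dx + world_pos[1] * dy
    sway = math.sin(time * params["speed"] - phase * 0.01)
    # Offset the vertex along the wind direction, scaled by strength.
    return (dx * sway * params["strength"], dy * sway * params["strength"])

offset = wind_offset((100.0, 0.0), time=0.0)
print(offset)
```

Because the phase depends on position along the wind vector, neighbouring blades sway with a slight lag, which reads as gusts rolling through the grass rather than the whole field moving in unison.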