Borderland between Rendering and Editor — Part 3: Selection Highlighting

We’ve reached the third post in this series about often overlooked rendering features typically needed when building editors for 3D content creation. In the previous two posts we covered grid rendering and picking; go check them out if you haven’t already.

In today’s post we will take a look at our selection highlighting effect:

The Machinery uses a yellow outline effect to indicate the editor selection state.

As you can see, we render a yellow outline around the selected objects in the scene. Using some kind of outline effect is a pretty standard approach to selection highlighting and is used by most of the game engines I know of. As you will see, it’s fairly straightforward to implement, but since Google returned surprisingly little information on the subject, I thought I would share the details of our implementation.

Before we begin let’s list some of our requirements:

  • Outlines should be easy to read, feel crisp, and have decent anti-aliasing.

  • Outlines should maintain a constant width of 1 pixel.

  • Outlines should be composited after post-processing.

  • If the selected object is covered by an object that’s not part of the selection, we still want to display the outline but make it appear a bit dimmed out.

  • Any shader should be able to easily implement support for displaying the outline effect.

From the editor’s perspective, we want to keep things as simple as possible: essentially, we have a set of entity IDs holding the selection state that we pass to all renderable components.

To make it simple to enable outlines for a renderable component (i.e., a component implementing the tm_ci_render_i interface) we rely on a shader system. In the inner loop of the component, we simply check if the ID of the owning entity is present in the selection state. If it is, we enable a shader system called selection_system. Any shader implementing support for the selection_system will activate an additional rendering pass if the system has been enabled for that particular draw call. This additional pass renders the object into a selection buffer with its own depth-stencil target. The selection buffer is a single-channel render target that the shader writes some kind of unique ID to. Currently, we use the lower eight bits of the entity ID, as we already have that available.
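To make that concrete, here is a minimal CPU-side sketch of the inner-loop check. All names here (selection_state_t, selection_contains, selection_buffer_id) are my own for illustration, not the actual The Machinery API:

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Hypothetical selection state: a flat array of selected entity IDs,
   as handed by the editor to all renderable components. */
typedef struct {
    const uint64_t *ids;
    size_t count;
} selection_state_t;

/* Check if the owning entity of a draw call is part of the selection. */
static bool selection_contains(const selection_state_t *sel, uint64_t entity)
{
    for (size_t i = 0; i < sel->count; ++i)
        if (sel->ids[i] == entity)
            return true;
    return false;
}

/* The selection buffer is a single-channel target; the value written to it
   is the lower eight bits of the entity ID. */
static uint8_t selection_buffer_id(uint64_t entity)
{
    return (uint8_t)(entity & 0xffu);
}
```

In the component's draw loop you would call selection_contains() per draw call and, when it returns true, enable the selection_system so the extra pass writes selection_buffer_id() into the selection buffer.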

Since we don’t have any dependencies on the rest of the scene, we can render to the selection buffer at any point during the frame as long as we are done before we get to the composition stage of the outline effect.

Composition

The actual outline effect runs as a fullscreen pass blended on top of the output color buffer for the viewport. The shader expects to have access to the selection buffer, the depth buffer associated with the selection buffer, as well as the regular depth buffer used when rendering the scene.

We begin by generating an anti-aliased alpha value for the outline:

// Generate outline by comparing IDs between current pixel and surrounding
// pixels. This will collect 4x4 IDs but we will only be using the upper
// 3x3 taps.
float4 id0 = sel_id.GatherRed(clamp_point, input.uv, int2(-2, -2));
float4 id1 = sel_id.GatherRed(clamp_point, input.uv, int2( 0, -2));
float4 id2 = sel_id.GatherRed(clamp_point, input.uv, int2(-2,  0));
float4 id3 = sel_id.GatherRed(clamp_point, input.uv, int2( 0,  0));

// Throw away unused taps and merge into two float4 and a center tap
id2.xw = id1.xy;
float id_center = id3.w;
id3.w = id0.y;

// Count ID misses and average together. a becomes our alpha of the outline.
static const float avg_scalar = 1.f / 8.f;
float a = dot(float4(id2 != id_center), avg_scalar);
a += dot(float4(id3 != id_center), avg_scalar);

So what we are interested in doing is comparing the ID of the current pixel with the IDs of the surrounding 8 pixels. The more they differ, the closer we are to an outline edge. Instead of fetching 9 individual samples from the selection buffer (sel_id), we use GatherRed().

If you haven’t used any of the Gather4 instructions before, the concept is pretty simple. Instead of issuing 4 point-sample instructions, we can rely on the texture filtering hardware to fetch 2x2 texels from a single color channel (in this case red) and return each tap in the X, Y, Z, and W components. One thing to watch out for is the order of the returned taps, which is quite easy to mess up; they are returned in counter-clockwise order as follows:

X: (-0.5, +0.5)
Y: (+0.5, +0.5)
Z: (+0.5, -0.5)
W: (-0.5, -0.5)
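Since this is easy to get wrong, here is a small CPU-side C sketch (my own code, not from The Machinery) that emulates GatherRed's tap order on a grid of IDs and checks that the four gathers plus the swizzle shown earlier really end up with the center pixel and its 8 neighbors:

```c
#include <stdbool.h>

typedef struct { float x, y, z, w; } float4;

enum { GRID_W = 8, GRID_H = 8 };
static float grid[GRID_H][GRID_W]; /* grid[y][x], y grows downward (+v) */

/* Emulated GatherRed. (px, py) is the top-left texel of the 2x2 footprint
   (the texel the W component maps to). Taps come back counter-clockwise:
   X = (-0.5, +0.5), Y = (+0.5, +0.5), Z = (+0.5, -0.5), W = (-0.5, -0.5). */
static float4 gather_red(int px, int py)
{
    float4 r;
    r.x = grid[py + 1][px];     /* lower-left  */
    r.y = grid[py + 1][px + 1]; /* lower-right */
    r.z = grid[py][px + 1];     /* upper-right */
    r.w = grid[py][px];         /* upper-left  */
    return r;
}

/* Run the shader's gathers + swizzle and verify the 8 kept taps are exactly
   the 8 neighbors of the center pixel. Returns true on success. */
static bool run_gather_check(void)
{
    for (int y = 0; y < GRID_H; ++y)
        for (int x = 0; x < GRID_W; ++x)
            grid[y][x] = (float)(y * GRID_W + x); /* unique ID per texel */

    const int cx = 3, cy = 3; /* center pixel */

    /* The four gathers, using the same offsets as the shader. */
    float4 id0 = gather_red(cx - 2, cy - 2);
    float4 id1 = gather_red(cx,     cy - 2);
    float4 id2 = gather_red(cx - 2, cy);
    float4 id3 = gather_red(cx,     cy);

    /* The swizzle: id2.xw = id1.xy, save the center, id3.w = id0.y. */
    id2.x = id1.x;
    id2.w = id1.y;
    float id_center = id3.w;
    id3.w = id0.y;

    if (id_center != grid[cy][cx])
        return false;

    /* Every one of the 8 neighbors must appear among the 8 kept taps. */
    const float taps[8] = { id2.x, id2.y, id2.z, id2.w,
                            id3.x, id3.y, id3.z, id3.w };
    for (int dy = -1; dy <= 1; ++dy)
        for (int dx = -1; dx <= 1; ++dx) {
            if (dx == 0 && dy == 0)
                continue;
            bool found = false;
            for (int i = 0; i < 8; ++i)
                if (taps[i] == grid[cy + dy][cx + dx])
                    found = true;
            if (!found)
                return false;
        }
    return true;
}
```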

While this is enough to generate a decent-looking outline, we also want to dim the outline if it’s behind an object that isn’t part of the selection state.

A simple, brute-force way to achieve that would be to sample both the scene depth and the depth of the selection buffer. If the depth value of the selection buffer is behind the scene depth buffer, just multiply the alpha value (a) by some factor to dim the outline.

Unfortunately, things aren’t quite that easy. First of all, we want the outline to be able to bleed over the selected object, and since the selected object is likely to appear in both the scene depth buffer and the selection depth buffer, the brute-force approach will start dimming the outline as soon as it intersects the object. But our bigger problem is TAA.

We use Temporal Anti-Aliasing as our default anti-aliasing method in The Machinery, which means that our scene depth buffer is not temporally stable: every frame we apply a different post-projection sub-pixel jitter to all our vertices when rendering the scene. So as soon as we need to do any kind of depth testing against the scene depth after we have resolved TAA, we’re doomed to have a flickering problem. Our outline effect is no exception.

One little hack that has proven fairly successful at hiding most of the flickering, while still allowing the outline to bleed over the edges of the selected objects, is to take more samples from the selection depth buffer around the current pixel and then use the depth value closest to the camera when doing the comparison against the scene depth:

// If alpha is zero, early out.
[branch]
if (a == 0) {
    output.color = float4(0,0,0,0);
    return output;
}

// To allow outline to bleed over objects and combat TAA jittering artifacts
// sample depth of selection buffer in a 4x4 neighborhood and pick closest
// depth.
float4 dtap0 = sel_depth.GatherRed(cp, input.uv, int2(-2, -2));
float4 dtap1 = sel_depth.GatherRed(cp, input.uv, int2( 0, -2));
float4 dtap2 = sel_depth.GatherRed(cp, input.uv, int2(-2,  0));
float4 dtap3 = sel_depth.GatherRed(cp, input.uv, int2( 0,  0));
float d0 = max(dtap0.x, max(dtap0.y, max(dtap0.z, dtap0.w)));
float d1 = max(dtap1.x, max(dtap1.y, max(dtap1.z, dtap1.w)));
float d2 = max(dtap2.x, max(dtap2.y, max(dtap2.z, dtap2.w)));
float d3 = max(dtap3.x, max(dtap3.y, max(dtap3.z, dtap3.w)));
float d = max(d0, max(d1, max(d2, d3)));

// Sample scene depth, scn_depth holds linear depth.
float scnd = scn_depth.Sample(cp, input.uv).r;

// Linearize d and compare it with scene depth to determine if outline is
// behind an object.
float2 near_far = load_camera_near_far();
bool visible = linearize_depth(d, near_far.x, near_far.y) <= scnd;

// If outline is hidden, reduce its alpha value to 30%.
a *= visible ? 1.f : 0.3f;

output.color = float4(outline_color, a);

The above code is fairly self-explanatory, though one thing is worth clarifying: since we use reverse depth in The Machinery, we use max() instead of min() to find the clip-space depth value closest to the camera. The depth values in scn_depth are stored as linear depth, hence we need to “linearize” d before we compare it with scnd.
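For reference, here is what that linearization might look like, assuming a standard reverse-Z perspective projection (a sketch under that assumption; the actual linearize_depth in The Machinery may differ):

```c
/* Convert a reverse-Z hardware depth d in [0, 1] (d = 1 at the near plane,
   d = 0 at the far plane) back to linear view-space depth z. For a standard
   reverse-Z perspective projection:
       d = near * (far - z) / (z * (far - near))
   Solving for z gives: */
static float linearize_depth(float d, float near, float far)
{
    return (near * far) / (near + d * (far - near));
}
```

Sanity check: d = 1 yields the near plane distance and d = 0 the far plane distance, with values in between decreasing monotonically toward the far plane.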

That’s all there is to it, really. A future improvement would be to support different outline colors depending on the editor’s selection state. That would require us to write the outline color (or some kind of identifier for the color) into the selection buffer and do a weighted sum over the neighborhood pixels in the fullscreen pass to determine the final outline color. It feels like low-hanging fruit, but I haven’t bothered implementing it as we so far haven’t had the need for it.
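For what it’s worth, the fullscreen-pass side of that idea could look roughly like this. This is purely a sketch (none of it exists in The Machinery), assuming the selection buffer stores a small palette index per selection state instead of the entity ID bits:

```c
#include <stdint.h>

typedef struct { float r, g, b; } color_t;

/* Average the palette colors of the neighbor taps whose ID differs from the
   center pixel, i.e. the same taps that contribute to the outline alpha. */
static color_t outline_color(const uint8_t taps[8], uint8_t center,
                             const color_t *palette)
{
    color_t sum = { 0.0f, 0.0f, 0.0f };
    int n = 0;
    for (int i = 0; i < 8; ++i) {
        if (taps[i] != center) {
            sum.r += palette[taps[i]].r;
            sum.g += palette[taps[i]].g;
            sum.b += palette[taps[i]].b;
            ++n;
        }
    }
    if (n > 0) {
        sum.r /= (float)n;
        sum.g /= (float)n;
        sum.b /= (float)n;
    }
    return sum;
}
```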

This wraps up the final part of this series. Hope you’ve enjoyed it.