DPI-aware IMGUI

Today’s topic might not be the most exotic, but nonetheless it is something that is very important to get right if you’re developing a desktop application in 2017. Failure to do so will make your application look blurry and amateurish, to an extent that it almost feels buggy. We’re going to talk about how to play nicely with the Windows UI scaling mechanism to give your users the most bang for the buck out of their new high resolution monitors. This is commonly referred to as making your application DPI-aware.

Goals

So what does it mean that your application is DPI-aware? Basically, it boils down to not relying on Windows' built-in mechanism to rescale the application for you when the user has set a UI scale factor other than 100%.

On my Razer Blade 14in (QHD) I run the Windows UI with 150% scaling

Instead, the application takes over the responsibility of keeping track of the scale factor and doing whatever it feels makes sense to give a pleasant user experience. Typically this means delivering a sharp user interface that leverages the full potential of the user's monitor(s).

If your UI is based entirely on vector graphics, this pretty much comes for free: the UI can be authored in a normalized coordinate system, and the only thing to keep track of is what resolution to use for your back buffer, based on the currently set UI scale factor in Windows.

While a lot of stuff in our editor user interface is authored with pure vector graphics we will still have a fair amount of bitmaps, mainly in the form of thumbnails and text. We will soon take a look at how we achieve crispness for those elements, but before we do that we’ll take a quick look at the Windows API for handling DPI-awareness.

Windows API

I always fear the days when I have to do something I’m not familiar with that touches any parts of the Windows API because it usually is such a pain to find the relevant and most up to date documentation online. Luckily though, this time I found an excellent tutorial walking through all the relevant APIs: Writing DPI-Aware Desktop and Win32 Applications.

It turns out implementing solid DPI handling in your own application is actually very simple and straightforward, especially if you don't have to worry about third-party components also playing nicely with the API. Sure, parts of the API are a bit weird (especially functions like SetThreadDpiAwarenessContext(), which globally changes the DPI-awareness settings, but only for the calling thread…), but overall it's rather simple.

The first thing you want to do is to globally change the DPI-awareness setting for your application. This is done by calling:

SetProcessDpiAwareness(PROCESS_PER_MONITOR_DPI_AWARE);

somewhere early in the initialization code of your application. This disables Windows' built-in coordinate scaling mechanism and hands over the control to you. Note: it is important that you do this before you start doing any window creation or similar.

The second thing you need to do is to figure out the UI scale factor for each window you intend to create. The core idea behind Windows UI scaling is that 100% scaling maps to 96 DPI. This is defined in WinUser.h:

#define USER_DEFAULT_SCREEN_DPI 96

For other UI scale factors (125%, 150%, 175%, 200%, and so on) the DPI is just linearly scaled:

| Scale | DPI |
| ----- | --- |
| 100%  | 96  |
| 125%  | 120 |
| 150%  | 144 |
| 175%  | 168 |
| 200%  | 192 |

As the scale factor can differ per monitor you cannot globally poll the scale factor and be done with it. Instead you need to keep track of the scale factor per window and gracefully handle rescaling if a window moves between monitors with different UI scale factors.

We'll start by enumerating all connected monitors using EnumDisplayMonitors(), then for each monitor we poll its DPI setting using:

uint32_t dpi_x, dpi_y;
GetDpiForMonitor(monitor, MDT_EFFECTIVE_DPI, &dpi_x, &dpi_y);

As you can see, the API can return a different DPI for the x- and y-axes, but in practice I can't see any use case where that is ever needed, so I suspect the API is just a bit over-engineered. I might very well be wrong about this, but until someone points it out we'll rely on the two values always being the same and just pick dpi_x as the truth. From that we can reconstruct the scale factor:

float dpi_scale_factor = (float)dpi_x / (float)USER_DEFAULT_SCREEN_DPI;

We now have everything we need to create a window on a certain monitor and associate it with a DPI scale factor.

To simplify development of the editor UI we’ve decided to work in pixels with the upper left corner being [0,0]. Also, as we don’t want to be bothered with rescaling the pixel coordinates to take into account the per monitor DPI setting, the editor UI works in a virtual coordinate system that maps 1:1 to pixels when the DPI scale factor is 1.0.

However, our cross platform API for dealing with desktop windows in The Machinery always reasons about pixels for clarity. So for convenience we’ve exposed a small helper method for transforming a rectangle between virtual coordinates and pixel coordinates:

// Adjust rect behavior:
enum {
    // Go from virtual coordinates to pixels
    TM_OS_WINDOW_ADJUST_RECT_TO_PIXELS,
    // Go from pixels to virtual coordinates
    TM_OS_WINDOW_ADJUST_RECT_TO_VIRTUAL,
};

// Helper function to transform window rectangle between virtual coordinates and pixel coordinates based on a dpi scale factor
void (*adjust_rect)(int32_t *x, int32_t *y, uint32_t *width, uint32_t *height, float dpi_scale_factor, uint32_t operation); 

We use that to rescale the virtualized size of our editor windows to actual pixels before creating the windows.

The last thing we need to do is to make sure we catch the Windows message WM_DPICHANGED in our message pump. WM_DPICHANGED is triggered when the center pixel of a window moves between monitors with different DPI scale factors, or when the user changes the UI scale factor for a monitor while the application is running. If a window straddles two monitors with different UI scale factors, the monitor containing the center pixel of the window dictates the scale factor.

When triggered, we get the DPI of the monitor that the window was moved onto in the high word of the WPARAM and can simply calculate a new DPI scale factor using:

float dpi_scale_factor = (float)HIWORD(wparam) / (float)USER_DEFAULT_SCREEN_DPI;

And then use that to resize and reposition the window accordingly.
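That calculation is small enough to sketch standalone. HIWORD and USER_DEFAULT_SCREEN_DPI normally come from the Windows headers; stand-ins are defined here so the snippet reads (and compiles) outside a Win32 project:

```c
#include <stdint.h>

#define USER_DEFAULT_SCREEN_DPI 96

// Stand-in for the HIWORD macro from the Windows SDK headers.
#define HIWORD_SKETCH(w) ((uint32_t)(((w) >> 16) & 0xffff))

// WM_DPICHANGED packs the new DPI into both words of the WPARAM;
// we pick the high word and derive the scale factor from it.
static float dpi_scale_from_wparam(uintptr_t wparam)
{
    return (float)HIWORD_SKETCH(wparam) / (float)USER_DEFAULT_SCREEN_DPI;
}
```

The LPARAM of WM_DPICHANGED additionally points to a suggested new window rectangle, which is convenient for the resize-and-reposition step.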

With that we have everything covered on the low-level side of things. Each window can be polled for its DPI scale factor, and windows are correctly resized when moved to a monitor with a different scale factor or when the user changes a monitor's scale factor while the application is running.

Now let’s move on and take a look at how our IMGUI system utilizes this to make sure the UI renders crisp.

Vector graphics

All pure vector primitives (i.e., rectangles, lines, splines, clipping rectangles, etc.) are simply rescaled from the editor's virtual coordinates to pixels using the DPI scale factor polled from the window. To keep our IMGUI system simple and clean we've opted to keep it unaware of the whole concept of DPI scaling. Instead we simply apply the DPI scale in the vertex shader used when rendering the UI.
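Conceptually, the shader work is nothing more than a per-vertex multiply. As a CPU-side illustration (a sketch for clarity; in practice this multiply happens on the GPU):

```c
#include <stddef.h>

typedef struct vec2 { float x, y; } vec2;

// Sketch: scale UI vertices from virtual coordinates to pixels.
// This is the same multiply the vertex shader applies per vertex
// when rendering the UI.
static void scale_ui_vertices(vec2 *vertices, size_t count,
    float dpi_scale_factor)
{
    for (size_t i = 0; i != count; ++i) {
        vertices[i].x *= dpi_scale_factor;
        vertices[i].y *= dpi_scale_factor;
    }
}
```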

Text rendering

For text rendering we use bitmap font atlases dynamically generated from TrueType fonts using Sean Barrett’s awesome [stb_truetype.h](https://github.com/nothings/stb/blob/master/stb_truetype.h). The font size is specified in points and remapped to pixels based on the DPI scale factor of the window and the fact that TrueType point sizes assume 72 DPI. The remapping is done as follows:

#include <math.h> // ceilf

static float font_points_to_pixels(float pt, float dpi)
{
    // TrueType point sizes assume 72 DPI, so rescale by dpi / 72.
    return ceilf(pt * dpi / 72.f);
}

While I’m not 100% sure that this is the right way to do the remapping, the results look correct when comparing to other Windows programs.

We use 2x2 oversampling when generating the font bitmaps and stash all generated bitmaps in a cache where the cache key encodes:

  • Which font we use.
  • The size of the font in points.
  • View DPI, i.e., the DPI of the monitor currently displaying the window rendering the text.
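A hypothetical sketch of such a cache key and a hash over it (the names and the FNV-1a hash are my own illustration, not the actual implementation):

```c
#include <stddef.h>
#include <stdint.h>

// Hypothetical font cache key: one atlas per (font, size, DPI) triple.
typedef struct font_cache_key {
    uint64_t font_id;  // identifies the TrueType font
    float size_pt;     // font size in points
    uint32_t view_dpi; // DPI of the monitor displaying the window
} font_cache_key;

// FNV-1a over the key bytes. Any decent hash works; this one is
// chosen just because it is short enough to show inline.
static uint64_t font_cache_key_hash(const font_cache_key *key)
{
    uint64_t h = 14695981039346656037ull;
    const unsigned char *p = (const unsigned char *)key;
    for (size_t i = 0; i != sizeof *key; ++i)
        h = (h ^ p[i]) * 1099511628211ull;
    return h;
}
```

Two windows on monitors with different scale factors thus get separate atlas entries for the same font and point size, which is exactly what keeps the glyphs sharp on both.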

Currently we don’t do any preheating of the font cache as stb_truetype appears to be pretty fast, but that would be simple to add as an optimization if we discover we need it at a later stage.

Just as with regular vector graphics, all DPI-awareness scaling happens in the vertex shader. The position of a text field is specified in virtual coordinates, while advancing from one glyph to the next is done in pixel coordinates.

Below you can find a quality comparison between rendering some text in a DPI-aware manner (left) vs. without being DPI-aware (right).

Font comparison

Thumbnails

As we plan to generate all thumbnails needed by the editor using the runtime, we can follow a similar scheme as for text rendering. We simply specify all thumbnail sizes in actual pixels rather than using the “virtualized size” before requesting the runtime to render them. That way they will stay crisp as long as the input sources have high enough resolution.

Viewports

The last thing I’d like to cover is the behavior of 3D scene viewports when displayed in a window with a DPI scale factor above 1.0. While you would obviously get the best quality by rendering them following the same scheme as for thumbnails (i.e., at native resolution), I would argue that the default behavior should probably be to use the “virtualized resolution” of the viewport.

My main argument here is performance on laptops. Lots of laptops ship with small but high resolution displays nowadays. A good example is my 14” Razer Blade with its 3840 x 2160 display. Even I, who love keeping my text barely readable (to fit lots of information on screen), can’t run that thing without setting Windows UI scaling to 150%. Keep in mind that most laptops also have weaker GPUs than typical discrete desktop GPUs, and that you want your application to be well behaved even when the laptop is running from battery. Rendering at native resolution will put a lot of strain on your laptop.

But obviously you want to make this configurable through a settings dialog somewhere so the user can change the default behavior if they prefer (maybe even on a per-viewport level).

Also note that the resolution we are talking about here should be considered the final “back buffer” resolution used when rendering 3D scene viewports. That resolution is typically decoupled from the resolution of the various render targets you would render your 3D geometry / post processing results into.

Wrap up

Handling multi-monitor setups with different DPI scaling per monitor doesn’t have to be tricky. However, being in control of all the components that go into your editor framework helps tremendously when implementing something like this. I’ve seen people struggle to get this right when depending on various frameworks for their editor UI, and I think that is also the main reason there are still a lot of applications out there that don’t behave well when their windows are moved between monitors with different DPI scaling.