Safe Zones

Q: What do an iPhone X and a 50-year-old CRT have in common?

A: Each has a portion of the display that you can’t effectively use (though for entirely different reasons).

History lesson
CRT screens are curved, their corners aren’t perfectly square, the beam overscans a bit, and many sets did not have variable focus to keep the beam sharp across the entire screen.  All of these factors combine to cause problems at the edge of the screen, but just how much varies from screen to screen.  TV standards resolve this by defining safe zones guaranteed to be usable on nearly any TV.

LCDs and other modern display technologies don’t have the same limitations as CRTs, but to maintain compatibility with content made for CRTs, many TVs will take the input, overscan it, and then crop it, causing the exact same problem (and preventing digital sources from being displayed 1:1 to boot).  Depending on your TV, you might be able to reduce or disable the effect when playing games (sometimes it depends on the input being used, with VGA or DVI displayed as-is and HDMI overscanned), but you can’t count on that working for your users.

Safe zones and games
Safe zones have been an issue that console developers have had to deal with since the beginning, but they’re relatively new to mobile developers as phones add notches and wrap-around screens.  As long as you paint within the lines, everyone will be able to see your UI.

Typically the safe zones are split up into two concepts:

  • Title safe area – This is where text is fully readable; anything important to the player or viewer should be displayed here.
  • Action safe area – This is at least as large as the title safe area, and may be larger.  The area between it and the title safe area is probably still visible, so you’d want things like background graphics to extend out here.

For TV the distinction matters, but in games it’s mostly unimportant.  Make sure all UI elements stay within the title safe zone and you’re probably OK letting the 3D elements fill the full screen.  The only time you might want to distinguish between the two (assuming the platform you’re on even makes the distinction) is if you have a fixed camera view: you don’t want the player (or threats to the player) to travel beyond the action safe zone, because they might become invisible in the overscan, but clamping them all the way to the title safe zone might feel a little conservative.

How to paint within the lines in UE4
If you’re using UMG for all of your UI, USafeZone is your friend.  It’s a container panel that should wrap your HUD or the important parts of your front-end UI, and it lets you pick whether it adjusts to the title or action safe zone.  You don’t need a bunch of them, just one at the top level (typically in the widget you add directly to the viewport overlay stack).

Under the hood, USafeZone uses FDisplayMetrics to query the safe zone for the current platform, which is also where to look if you ever need to access the safe zone information directly in C++.  On platforms that have no native safe zones (like PCs), you can use the console variables r.DebugSafeZone.TitleRatio and r.DebugActionZone.ActionRatio to simulate one.  Both range from 0..1 and default to 1, indicating that 100% of the display is safe.  A typical test value would be around 0.9, but an important caveat is that safe zones were originally designed for full-screen use, so these percentages are only accurate if you test in fullscreen.  For quick spot checks it doesn’t matter, as you’re just verifying that all the important stuff moved inwards, but when fine-tuning layouts you probably want to see something representative by going to windowed fullscreen or at least maximizing the window.
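If you do need the raw values in C++, a minimal sketch looks something like the following (hedged: names vary a bit by engine version, RebuildDisplayMetrics is called GetDisplayMetrics in older releases, and the padding members may be FVector2D or FVector4 depending on version):

#include "CoreMinimal.h"
#include "GenericPlatform/GenericApplication.h"

void LogSafeZonePadding()
{
    // Query the metrics for the primary display, including safe zone padding.
    FDisplayMetrics Metrics;
    FDisplayMetrics::RebuildDisplayMetrics(Metrics); // GetDisplayMetrics() on older engines

    // Padding (in pixels) between the edge of the display and each safe zone.
    UE_LOG(LogTemp, Log, TEXT("Title safe padding: %s"), *Metrics.TitleSafePaddingSize.ToString());
    UE_LOG(LogTemp, Log, TEXT("Action safe padding: %s"), *Metrics.ActionSafePaddingSize.ToString());
}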

You can also visualize the current safe zone (on any platform) using r.DebugSafeZone.Mode, which can draw a translucent border on the screen:

0: Do not display the safe zone overlay
1: Display the overlay for the title safe zone
2: Display the overlay for the action safe zone
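
For example, to simulate a 90% title safe zone and draw its overlay while testing on PC, you could enter something like:

r.DebugSafeZone.TitleRatio 0.9
r.DebugSafeZone.Mode 1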

Not using Unreal?
I can’t help you with consoles without breaking NDAs; check the vendor documentation for the certification requirements and for how to query the current safe zone.

For iOS, safe areas are a new concept starting in iOS 11: safeAreaInsets (a UIEdgeInsets) is an anchor term to search for.  I’m not sure if Android has an equivalent API yet, but it’s going to be important soon.

Dwarf Orders

Dwarf Orders is a local multiplayer party game, controlled with your phone and visualized on a shared screen where you send orders to a bunch of unruly dwarven miners, hoping they accomplish the shared goals.
In classic tradition we tried to do something ambitious tech-wise for the 2018 GGJ and finished the basics just in time, without much room for polish or fun.  We plan on reusing the party game infrastructure at a future jam; it should go much more quickly the second time around.
How to play:

Download the shared screen game from the GGJ site, unzip and run DwarfOrders.exe, then follow the instructions on-screen to connect one or more phones to the game.  Press “Force Start” when ready.

Tap on a dwarf, then tap again to order it to go somewhere.  The dwarves are a bit unruly, so they might not do what you expect after reaching their ordered destination.

Jammers:

  • Michael Noland
  • Nick Darnell
  • Danny Gallagher

Additional assets:

  • “Industrious Ferret” Kevin MacLeod (incompetech.com)
    Licensed under Creative Commons: By Attribution 3.0 License
    http://creativecommons.org/licenses/by/3.0/
  • Various assets from UE4 learning examples, licensed for use with UE4

Molecular Ignition

Click your way to stimulation.

This was my entry for LD39 and #clickclickclick, ish.  Download it from itch.io (the project files are there too).

Note: I realized my core game idea wasn’t really going to work pretty early on, and decided to focus on ‘feel’ and learning Cascade instead, so there’s a lot of juiciness but little substance (more of a toy than a game, though it is possible to fail).

Built using Unreal Engine 4.16.2 entirely in Blueprints.

Music Credits:

“Hypnothis” Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 3.0 License
http://creativecommons.org/licenses/by/3.0/

Zen Miner

Mine, Sell, Upgrade, Repeat

Zen Miner is meant to be a chill mining game in the style of Miner Dig Deep where a fox collects ore and sells it to purchase upgrades, allowing them to go deeper and find better ores. This fox is all about the vicious loot cycle.

Created for #cgajam using CGA palette 1 and a nearly-CGA 320×180 virtual resolution (dithering happens at a higher resolution, however; see this article for details on the post-processing).

However, the gameplay was not really finished in time for the submission deadline; the uploaded build has the very basics of the gameplay loop (you can mine, sell ore, and refill your lamp oil, but there are no upgrades or deployable items yet).  If you still want to play it, download it from itch.io.

Additional credits:

  • The background graphics and sprites are from Luis Zuno’s “Sunny Land” pack (CC-BY 3.0)
  • The music track (“Yokushun”) is by Rolemusic (CC-BY 3.0)
  • Created using Unreal Engine 4.16

Creating simple tools in UE4 – part 1

In this series of articles I’m going to show how to create some high-impact, low-effort tools to help develop your game in UE4.  Tools range from simple batch files and humble debug commands to complex ensembles like the UE4 level editor or Max/Maya, but for now let’s focus on some approaches that don’t require a broad knowledge of Unreal Engine or Slate to pull off.  Some of these topics deserve a post of their own, but I’m going to start with more of a survey approach and see what people are most interested in digging into.

Programming is all about strategic laziness: the guiding philosophy behind developing tools is to spend time now to save someone time later, improving overall developer efficiency.  This might play out as letting the user finish the same tasks in less time, or as letting them perform more iterations in the same amount of time, increasing polish level and fun factor.  Ideally you also save more time than was spent making the tool, but that is not a hard requirement due to other factors: user morale, making it possible for a different group of people to do the work, reducing mistakes when the cost of failure is high, and so on.  That last point is especially true when it comes to automating key processes where a mistake might cause downtime or otherwise hurt your customers, e.g., releasing a broken update or corrupting a database.


CGA Post-Processing in UE4

I spent some time last weekend prepping for #cgajam by playing around with different materials / effects to constrain a UE4 game to CGA colors, and came up with a drop-in post processing BP.

Preview of the CGA post-processing effect

How to use

Download the template (requires Unreal Engine 4.16, licensed as CC0) and unzip it.  You can either open the project directly and migrate the Content/CGAJam folder into your project or drop the whole folder into the templates directory of your 4.16 installation (e.g., C:\Program Files (x86)\Epic Games\UE_4.16\Templates) and then create a new project (info on how to make a project into a template here).

There’s an example map but all of the magic is wrapped up in BP_CGAPostProcess; just drop one into your level and try playing around with the settings to suit your content. You can pick one of the two palettes, adjust how much color-space dithering and screen space dithering there is, and tweak the overall gamma / brightness. For the hand painted environment I found 50% screen dither + 50% LUT dither at level 3 to give me the best results, but for the example scene with solid shapes and color fills, 100% screen dither with no LUT dither looked better.

Implementation details

CGA had a variety of different modes and palettes, but the most memorable for gaming was the 320×200 graphics mode using either palette 0 (black, red, green, yellow) or palette 1 (black, magenta, cyan, white).  Color 0 (black) is actually adjustable to any of the 16 CGA colors, though that doesn’t appear to be allowed for #cgajam.  The red and green components of colors 1..3 are the same in both palettes; only the blue component differs.
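
For reference, assuming the standard high-intensity CGA values, the two palettes work out to:

Palette 0: black #000000, green #55FF55, red #FF5555, yellow #FFFF55
Palette 1: black #000000, cyan #55FFFF, magenta #FF55FF, white #FFFFFF

Note that the only difference between corresponding colors is the blue channel (0x55 vs 0xFF).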

UE4 has a lookup-table texture (LUT) based color grading system, but there’s no way to control the filtering used to sample it, so it can’t produce ‘crisp’ CGA colors.  It’s still useful to get colors to nearly the right place though. I’ve included 4 LUT textures for each palette with varying color-space dithering (0%, 25%, 50%, 75%); you can adjust the strength of the color grading as well to vary the effect.

Color grading LUTs for palette 0

The post-process material (M_ForceCGA_PostProcess) has a screen-space dither pattern which dims based on brightness prior to matching to one of the four colors in the active palette.  The color matching (MF_ForceToCGA) and dithering (MF_DitherBrightness) are split up into separate material functions to make them easier to reuse, e.g., in the UI material.

The actual color matching is done using a cheap lookup texture with red->U and green->V (the 2×2 texture is set to uncompressed, clamped instead of wrapped UVs, nearest filtering, etc. to give us a clean and exact output value).  We can ignore the input blue entirely as mentioned above, and multiply the output blue by the correct value for the current palette (dim it for palette 0 and keep it as-is for palette 1).
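
As a CPU-side sketch of the same idea (not the actual material graph; the 0.5 quadrant boundary implied by nearest-filtering a clamped 2×2 texture and the exact output values are assumptions here):

// Hypothetical CPU-side sketch of the red->U / green->V lookup trick.
// Nearest-sampling a clamped 2x2 texture is equivalent to thresholding R and G at 0.5
// and picking the palette color stored in that quadrant.
struct FRGB { float R; float G; float B; };

FRGB ForceToCGA(const FRGB& In, bool bUsePalette1)
{
    // Red/green components of the four colors; identical in both palettes.
    // Indexed as [U][V], where U comes from input red and V from input green.
    static const float QuadR[2][2] = { { 0.0f, 0.333f }, { 1.0f, 1.0f } };
    static const float QuadG[2][2] = { { 0.0f, 1.0f   }, { 0.333f, 1.0f } };

    const int U = (In.R < 0.5f) ? 0 : 1;
    const int V = (In.G < 0.5f) ? 0 : 1;

    FRGB Out;
    Out.R = QuadR[U][V];
    Out.G = QuadG[U][V];

    // Input blue is ignored entirely; output blue only depends on the active palette:
    // palette 1 (cyan/magenta/white) keeps it bright, palette 0 (green/red/yellow) dims it.
    const float LookupBlue = (U == 0 && V == 0) ? 0.0f : 1.0f; // the black quadrant has no blue
    Out.B = LookupBlue * (bUsePalette1 ? 1.0f : 0.333f);
    return Out;
}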

MF_ForceToCGA
MF_DitherBrightness

User Interface

For the UI, you’ve got two options:

  • Stick with the CGA colors, which need to be specified differently in the UI than in the world since UI happens in a different color space, e.g., yellow is (1,1,0.092) instead of (1,1,0.333) (the correct final color will show up in the hex sRGB setting of the color picker).
  • Get in the right ballpark and use a retainer box with the Effect Material set to MI_ForceCGA_UI_Palette0 or MI_ForceCGA_UI_Palette1 to guarantee that you only use the correct 4 colors (recommended approach).

    Using a retainer box

Avoiding flickering artifacts

There are a couple of steps in the default UE4 post-processing chain that introduce temporal artifacts in order to improve overall image quality when doing realistic rendering.  These steps become very noticeable and objectionable when you do something like threshold down to 4 colors, so the example project disables them (setting r.Tonemapper.GrainQuantization to 0 and disabling post-process Anti-aliasing entirely, though you could also switch from temporal AA to FXAA instead). The DefaultEngine.ini changes to do this won’t transfer over if you just migrate the content from one project to another, so you may need to paste in:

[/Script/Engine.RendererSettings]
r.DefaultFeature.AntiAliasing=0
 
[SystemSettings]
r.Tonemapper.GrainQuantization=0

Resolution

Want a lower resolution that is a bit more like true CGA 320×200?  Adjust the Screen Percentage on the post-process component and set r.upscale.quality to 0 (nearest filtering) in the [SystemSettings] section of DefaultEngine.ini.
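
Mirroring the earlier snippet, the DefaultEngine.ini addition would be:

[SystemSettings]
r.upscale.quality=0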

Ideas for further improvement

Try combining it with an edge detection material (either on depth or on custom stencil if you want to limit the effect to foreground objects) to put black outlines around things.

Try other spatial variation patterns besides a 2×2 checkerboard; techniques like Real-time hatching would probably work quite well.

Credits

The map used in the example shots is from the “Hand Painted Environment” marketplace pack by Evgeniya Yaremko.

Customizing a UE4 project – launcher thumbnail and splash screens

The Epic Games Launcher shows a thumbnail for each project, looking in two places: first in the root folder of the game for [ProjectName].png (next to the .uproject) and then Saved\AutoScreenshot.png, which is created by the editor automatically. You can create your own thumbnail (192×192 .png) and set it using the “Project Thumbnail” option in the Description category of Project Settings.

You can also customize the splash screen shown when the editor or game is starting up (separately for editor vs game, Windows vs Mac) in the Windows / Mac categories (under Platform) of Project Settings.  The splash screens need to be .bmp (not .png), but they can be whatever size you want within reason (the startup text and project name are overlaid on top, so you’ll probably want to leave space for them).

You can pick application icons and mobile startup screens in the various platform settings pages as well (the rules vary and some platforms require a wide range of sizes, but it’s all listed in the corresponding settings pages and on platform vendor web sites).  Due to the variety of splash screens required for a universal iOS app, their size can start to add up noticeably, so it can be worthwhile to approach them differently: use large fill areas that compress well, run pngcrush on them, etc.

Holographic foil card effect

Earlier this week I saw a really neat effect where @Sererena simulated the view-dependent shimmering of a foil card, fed by a smartphone accelerometer.  This seemed like a good challenge to improve my material creation skills in UE4, so I gave it a go yesterday and came up with something I’m reasonably happy with. The view angle is controlled by WASD rather than the accelerometer since I was more focused on the material.

Download

You can download the assets and test project (built with Unreal Engine 4.15), licensed as CC0.

Construction

The material has 8 layers:

  • Background
  • Distant large stars
  • Small stars / motes
  • Near large stars
  • Character foil outline
  • Character
  • Foil on character accents
  • Card border

Outline of foil card material (M_FoilCard) showing the various layers

The foil character accent layer and the star layers all parallax as the view angle changes (shift around, with a larger shift for nearer layers, faking a sense of depth).
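
Conceptually, the per-layer shift is nothing more than the following (a sketch; the names and the linear falloff are assumptions, not the exact material setup):

// Hypothetical sketch of the parallax offset applied to each layer's UVs.
// Tilt is the current view offset (WASD here, accelerometer on a phone);
// LayerDepth runs from 0 for the farthest layer toward 1 for the nearest.
void ComputeLayerUVOffset(float TiltX, float TiltY, float LayerDepth, float Strength,
                          float& OutOffsetU, float& OutOffsetV)
{
    // Nearer layers shift more than distant ones, faking a sense of depth.
    OutOffsetU = TiltX * LayerDepth * Strength;
    OutOffsetV = TiltY * LayerDepth * Strength;
}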

The foil layers (everything except the character) are colored by a lookup into a ‘holographic plate’ texture, using an ad hoc function of the tex coord, distance, and tilt amount.

The character accents layer is done with a color notch filter and one star layer, directly added to the character color (rather than lerping to it). This is the part I’m least happy with in the current implementation, but I’d need to test it with a real character to see exactly what I’d want to change.

The material is built using a bunch of material functions as building blocks to make it cleaner, but these aren’t meant to ‘stand alone’; there are lots of assumptions built into them about textures and parameter values.

I’m using a live capture of a 3D character instead of a 2D base, but after the capture Blueprint updates the render target, everything else is the same either way. The capture BP is an opaque box containing the character, a capture component, and a controlled light source.  Even with setting the capture component to only capture the self actor and setting an aggressive culling distance, some ‘global’ things like atmospheric fog still render into it, so I had to use the mode that gave me depth in alpha rather than an opacity mask in order to discard it.

The outline is done with a Sobel edge detection filter on the thresholded depth from the character render target (which has depth as alpha). Rather than using a kernel of the adjacent texels, it uses texels 8 away to increase the width of the outline. If I were using 2D characters I’d probably skip the runtime Sobel filter and do that offline once into a mask texture.
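
A rough CPU-side sketch of that outline pass (the real thing is a material graph; the helper names, the clamping behavior, and the threshold parameters here are illustrative assumptions):

// Hypothetical Sobel outline on a thresholded depth buffer, sampling 8 texels away
// instead of adjacent texels so the resulting outline is wider.
#include <cmath>
#include <vector>

// Returns 1 if the character is present at (X, Y), 0 otherwise (depth stored in alpha).
static float ThresholdedDepth(const std::vector<float>& DepthAlpha, int Width, int Height,
                              int X, int Y, float DepthThreshold)
{
    // Clamp to the edge, mimicking clamped UVs.
    X = X < 0 ? 0 : (X >= Width ? Width - 1 : X);
    Y = Y < 0 ? 0 : (Y >= Height ? Height - 1 : Y);
    return DepthAlpha[Y * Width + X] < DepthThreshold ? 1.0f : 0.0f;
}

// True if this pixel should be drawn as part of the outline.
static bool IsOutlinePixel(const std::vector<float>& DepthAlpha, int Width, int Height,
                           int X, int Y, float DepthThreshold, float EdgeThreshold)
{
    const int Step = 8; // kernel offset: 8 texels instead of 1 widens the outline

    // 3x3 neighborhood of the thresholded mask, sampled Step texels apart.
    float S[3][3];
    for (int dy = -1; dy <= 1; ++dy)
        for (int dx = -1; dx <= 1; ++dx)
            S[dy + 1][dx + 1] = ThresholdedDepth(DepthAlpha, Width, Height,
                                                 X + dx * Step, Y + dy * Step, DepthThreshold);

    // Standard Sobel gradients on the mask.
    const float Gx = (S[0][2] + 2.0f * S[1][2] + S[2][2]) - (S[0][0] + 2.0f * S[1][0] + S[2][0]);
    const float Gy = (S[2][0] + 2.0f * S[2][1] + S[2][2]) - (S[0][0] + 2.0f * S[0][1] + S[0][2]);

    return std::sqrt(Gx * Gx + Gy * Gy) > EdgeThreshold;
}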

Planetary Garden

Ludum Dare 38:

For one reason or another I’ve never managed to participate in a Ludum Dare before this past weekend (LD 38). Planetary Garden was made under the compo rules (worked alone, created all new assets, finished within 48 hours of starting). However, I got a late start (3 PM Sat) and did not submit in time for the official deadline, so it’s technically a jam entry.

The theme was “a small world”, and I started riffing on various ideas for many connected small worlds: perhaps a platformer on a circular planet or mini golf where you switch worlds.  These ideas didn’t quite gel or seemed out of scope considering the compo rules, and incremental games have been on my mind as well, so I ended up going in that direction (though Planetary Golf seems like it has legs; I might revisit it in the future).

You can download the compiled version for Windows as well as the project with source code and assets at itch.io, and vote / leave feedback on the LD 38 project page. Unless otherwise noted, assets created for the compo are placed under a CC0 license and game code under a MIT license.

Art Style:

Entering the compo means that you have to make all of your own art and audio (with some very limited exceptions for fonts and brushes/samples, etc. that are suitably transformed).  I’m not a particularly good artist, so I went with a watercolor style that let me leverage materials for most of the impact, and created the actual art as simple RGB masks (outer border, inner border/outline, and fill color).  I wasn’t going for realism, just something evocative of watercolors, with variations in opacity and splotchiness generated by a couple of noise samples operating in screen space so the look stays coherent across objects.

How to play:

  • Repopulate a barren planet (Earth?) with life by seeding new plants
  • When the time is right, enlist animals to help automate the process (one of them knows an ancient secret).
  • Eventually other things happen. No spoilers.
  • Note: Progress is saved between sessions, but time does not advance while the application is closed. Keep it open to continue amassing life.