Don’t Leave the Engine Running

By simply adjusting image output while players were idling, Fortnite was able to save a whole wind farm’s worth of electricity.

The following content was taken from Epic’s Reducing Fortnite’s power consumption documentation. All credit to the authors!

In Fortnite, players can spend a significant amount of time in the front-end screens such as the Lobby and the Item Shop.

We did some profiling and saw that GPU load could be almost as high in the Lobby as it is in game. The front-end scenes are simpler and so require less GPU time at a given resolution, but dynamic resolution increases to compensate for this, keeping the power consumption high.

We decided to implement a solution based on inactivity. If a player is in the front-end screens and hasn’t touched an input device for an extended period of time, we can reduce the resolution and/or frame rate without it being noticeable.

Experiments

Microsoft provides a “power percentage” metric in their PIX profiling tool. This estimates the power draw of key hardware components on an Xbox Series X dev kit. We tried reducing resolution from 75% to 50% and halving the frame rate (to 30 fps) in the Fortnite Lobby and measured the impact in PIX. The adjusted settings reduced the power metric by around half.

The PIX power percentage metric showing different energy saving configurations on an Xbox Series X

It’s worth noting that the metric is not proportional to overall power consumption at the socket. It measures the power draw of certain key components (such as the CPU and GPU) since these are the main components that a game can influence. The metric is a useful tool for iteration, but if you’re interested in measuring overall power consumption, we recommend using a socket-based watt meter.

Energy mode settings

We targeted the biggest savings at the high-end current generation consoles (PS5 and Xbox Series X). These platforms render at 4K output resolution and use TSR for upscaling, which provides some flexibility to reduce the input resolution. We did a visual review dropping to 50% resolution in the Lobby after a period of inactivity and the difference was hard to perceive. This type of scene with a fixed camera and slow-moving animation is an ideal case for TSR upscaling.

The table below shows the final settings we shipped for each console and the corresponding reduction in power consumption.

The inactivity time threshold we used on all consoles was 30 seconds. Energy-saving mode would become active after this time.

As an example, we measured an Xbox Series X at the wall socket: 117 W when running in energy-saving mode, versus a baseline of 184 W. In cases where we were unable to drop resolution, relative savings were lower. For example, Xbox One measured 79 W in energy-saving mode compared to a baseline of 98 W.

*The 75% maximum resolution here predated our change to globally reduce the maximum resolution to 65%, as described in the previous section.

 

Implementation example

We provide some example code for handling inactivity-based energy saving in the front-end screens of a UE5 game. This is intended as a starting point; developers may wish to expand it and incorporate their own game-specific logic.

The inactivity time threshold, frame rate, and resolution can be configured via console variables. This enables per-platform overrides via DeviceProfile ini files.
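As an illustration, such a per-platform override might look like the following in a DeviceProfile ini file. The console variable names here are hypothetical placeholders, not Fortnite's actual variables:

```ini
; Hypothetical per-platform overrides for an Xbox Series X device profile
[XSX DeviceProfile]
+CVars=EnergySaving.InactivitySeconds=30
+CVars=EnergySaving.MaxFPS=30
+CVars=EnergySaving.MaxScreenPercentage=50
```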

We can get the last interaction time using the method FSlateApplication::GetLastUserInteractionTime(). By comparing this to the current time, we can determine how long it’s been since a player interacted with the game. If the elapsed duration is over our configured threshold, we enable energy saving. The code below does this, but also overrides the last interaction time if the player is in game. This ensures that energy saving settings are only enabled in the front-end screens.

LastInteractionTime and bEnergySavingModeEnabled are member variables used to track the timestamp of the last interaction and whether energy saving is enabled, respectively.
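A minimal, engine-agnostic sketch of this check is shown below. The names and structure are assumptions for illustration, not Epic's shipped code; in UE5, the current time and last interaction time would come from FApp::GetCurrentTime() and FSlateApplication::Get().GetLastUserInteractionTime(), but here they are plain parameters so the logic stands alone:

```cpp
// Sketch of the inactivity check (assumed names, not Epic's shipped code).
struct FEnergySavingState
{
    double LastInteractionTime = 0.0;     // timestamp of the last player input
    bool bEnergySavingModeEnabled = false;
};

// Updates the energy-saving state each tick; returns true if the mode changed.
bool UpdateEnergySaving(FEnergySavingState& State, double CurrentTime,
                        bool bIsInFrontEnd, double InactivityThresholdSeconds)
{
    // While in game, override the last interaction time so that energy
    // saving can only ever engage in the front-end screens.
    if (!bIsInFrontEnd)
    {
        State.LastInteractionTime = CurrentTime;
    }

    const bool bShouldEnable =
        (CurrentTime - State.LastInteractionTime) >= InactivityThresholdSeconds;

    if (bShouldEnable != State.bEnergySavingModeEnabled)
    {
        State.bEnergySavingModeEnabled = bShouldEnable;
        return true;
    }
    return false;
}
```

The caller would set LastInteractionTime to the current time whenever the player provides input, and react to the return value by applying or restoring the reduced settings.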

When bEnergySavingModeEnabled changes, we update the frame rate and resolution accordingly, storing the old value when the mode is enabled, and restoring it when it’s disabled again. The member variables MaxScreenPercentageToRestore and MaxFpsToRestore are used to store the old values.

We set the console variable t.maxfps in order to limit the frame rate. This throttles on the game thread, ensuring the frame rate doesn’t exceed the specified value. The console variable r.DynamicRes.MaxScreenPercentage is used to limit the maximum resolution (Note: There is a better alternative to this in UE 5.3—see below).
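The store-and-restore logic can be sketched as follows. This is an illustrative stand-in, not Fortnite's implementation: in UE5, the two members would be written through to the t.MaxFPS and r.DynamicRes.MaxScreenPercentage console variables rather than held as plain floats:

```cpp
// Sketch of applying and restoring energy-saving settings (assumed names).
class FEnergySavingSettings
{
public:
    FEnergySavingSettings(float InMaxFps, float InMaxScreenPercentage)
        : MaxFps(InMaxFps), MaxScreenPercentage(InMaxScreenPercentage) {}

    void SetEnergySavingEnabled(bool bEnabled, float SavingFps, float SavingScreenPercentage)
    {
        if (bEnabled)
        {
            // Remember the current values so they can be restored later.
            MaxFpsToRestore = MaxFps;
            MaxScreenPercentageToRestore = MaxScreenPercentage;
            MaxFps = SavingFps;
            MaxScreenPercentage = SavingScreenPercentage;
        }
        else
        {
            // The saving parameters are ignored when disabling.
            MaxFps = MaxFpsToRestore;
            MaxScreenPercentage = MaxScreenPercentageToRestore;
        }
    }

    float MaxFps;               // would drive t.MaxFPS in UE5
    float MaxScreenPercentage;  // would drive r.DynamicRes.MaxScreenPercentage

private:
    float MaxFpsToRestore = 0.0f;
    float MaxScreenPercentageToRestore = 0.0f;
};
```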

Multiple energy-saving modes

Games may also wish to implement multiple levels of energy-saving support. Fortnite's initial energy-saving release had two modes: a 60 fps low mode, which reduced only resolution, and a 30 fps high mode, which reduced both resolution and frame rate.
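Selecting between such modes reduces to comparing the idle time against escalating thresholds. A minimal sketch, with assumed names and thresholds passed in by the caller:

```cpp
// Sketch of multi-level energy-saving mode selection (assumed names).
enum class EEnergySavingMode
{
    None,  // full frame rate and resolution
    Low,   // reduced resolution only (e.g. still 60 fps)
    High   // reduced resolution and frame rate (e.g. 30 fps)
};

EEnergySavingMode SelectEnergySavingMode(double IdleSeconds,
                                         double LowThresholdSeconds,
                                         double HighThresholdSeconds)
{
    if (IdleSeconds >= HighThresholdSeconds)
    {
        return EEnergySavingMode::High;
    }
    if (IdleSeconds >= LowThresholdSeconds)
    {
        return EEnergySavingMode::Low;
    }
    return EEnergySavingMode::None;
}
```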

 

Adding PC support

PC support was added soon after the console release. This worked in a similar way to consoles, but we also added focus detection, so we could enable energy saving when the game was in the background. This enabled us to reduce power consumption both in the front end and in game.

Note: UE5 developers can call FApp::HasFocus() to detect the current focus state of the game.
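Combining focus and idle detection might look like the sketch below. The function name and parameters are assumptions; in UE5, bHasFocus would come from FApp::HasFocus():

```cpp
// Sketch combining focus detection with front-end inactivity (assumed names).
bool ShouldEnableEnergySaving(bool bHasFocus, bool bIsInFrontEnd,
                              double IdleSeconds, double IdleThresholdSeconds)
{
    if (!bHasFocus)
    {
        // Unfocused (backgrounded): save power both in the front end and in game.
        return true;
    }
    // Focused: only engage after sustained inactivity in the front-end screens.
    return bIsInFrontEnd && IdleSeconds >= IdleThresholdSeconds;
}
```

A real implementation would also gate each branch behind the player's opt-out settings described below, with separate flags for idle and focus detection.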

Idle and focus detection defaulted to on, but to support content-creation workflows we allowed players to opt out, with separate settings for idle detection and focus detection.

Energy efficiency options in the PC settings menu

 

Because PC players have a variety of anti-aliasing options to choose from, we could not guarantee consistent quality when reducing the resolution. For this reason, PC energy saving affects only frame rate, not resolution. However, this is something we may explore in the future for players with TSR enabled.

PCs include a wide variety of hardware, so the total impact on power consumption is much harder to quantify than on consoles. However, the GPU remains one of the main contributors to power draw across all hardware, and so it’s reasonable to assume that these changes will make a significant difference to energy use.

 

Experimenting with the live game

Fortnite’s final settings were reached after a number of iterations on the live game, and we rolled the changes out in a staged manner, starting with Xbox Series consoles.

We wanted to gauge player response and gather data on how long players were spending in energy-saving mode in the front end, so we added telemetry tracking how long the mode was enabled.

The Xbox team at Microsoft provided live telemetry showing the average power consumption of our title across the whole Xbox Series player base. This data is based on a sampling of retail consoles in the live environment. This information was extremely useful in order to guide our efforts. The chart below shows a section of the data from Series X consoles.

Our initial energy-saving settings were relatively conservative and focused purely on front-end inactivity: after one minute in the front end, we dropped the maximum resolution to 60% (from 75%), and after two minutes, we dropped the resolution to 50% and reduced the frame rate to 30 fps. The first drop in power consumption in the chart below corresponds to these settings.

After analyzing the data and reviewing player sentiment, we decided to take a more aggressive approach. We adjusted the front-end energy saving settings to kick in fully after 30 seconds, dropping to 30 fps and 50% resolution; we also reduced the maximum resolution of the game from 75% to 65% (as described in the section Analyzing Dynamic Resolution in Fortnite). These changes correspond to the second drop in power consumption in the chart below.

 

Average power consumption by date on Xbox Series X, showing initial and improved energy-saving configurations

We highly recommend careful testing and measurement to check the impact of your changes in the live environment. We worked closely with Epic’s internal Player Support and Community teams to confirm that the changes were imperceptible to players, as measured by the volume of reports each team received.

 

Conclusion

As a result of these changes, we estimate around 200 MWh per day of savings across our total player base, or 73 GWh per year (equivalent to 14 wind turbines running for a year). And importantly, we helped to reduce the energy bills of our players.

Our main finding was that it’s possible to make a significant difference to a game’s energy efficiency without a large development effort. Even small configuration changes can make a noticeable difference to overall power consumption, and significant savings are possible with some careful tuning and logic. We hope that our work will inspire other developers to make energy savings in their own games.

 

Author
Ben Woodhouse

Contributors
Nicolas Mercier
Nick Penwarden