Temporal Brightness Management for Immersive Content
Università della Svizzera italiana, Switzerland
Abstract:
Modern virtual reality headsets demand significant computational resources to render high-resolution content in real time. Power efficiency therefore becomes crucial, particularly for portable, battery-powered devices. A significant portion of the energy consumed by these systems is attributed to their displays. Dimming the screen can save a considerable amount of energy; however, it may also cause a loss of visible detail and contrast in the displayed content. While contrast may be partially restored by applying post-processing contrast-enhancement steps, our work is orthogonal to these approaches and focuses on optimal temporal modulation of screen brightness. We propose a technique that modulates brightness over time while minimizing the potential loss of visible detail and avoiding noticeable temporal instability. Given a predetermined power budget and a video sequence, we achieve this by measuring contrast loss through a band decomposition of the luminance image and optimizing the brightness level of each frame offline to ensure uniform temporal contrast loss. We evaluate our method through a series of subjective experiments and an ablation study on a variety of content, and we showcase its power-saving capabilities in practice using a built-in hardware proxy. Finally, we present an online version of our approach, which further emphasizes the potential for low-level vision models to be leveraged in power-saving settings to preserve content quality.
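To give a rough feel for the contrast-loss measure described above, the sketch below builds a Laplacian-style band decomposition of a luminance image and sums the band energy that falls below a visibility threshold once the frame is scaled by a brightness factor. The pyramid depth, blur kernel, and threshold are all illustrative assumptions, not the paper's calibrated model.

```python
import numpy as np

def gaussian_blur(img, sigma=1.0):
    # Separable Gaussian blur (assumption: a 3-sigma kernel suffices).
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    k /= k.sum()
    img = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 0, img)
    img = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return img

def band_decompose(lum, levels=3):
    """Laplacian-style band decomposition of a luminance image."""
    bands, current = [], lum
    for _ in range(levels):
        low = gaussian_blur(current)
        bands.append(current - low)   # band-pass detail
        current = low
    bands.append(current)             # residual low-pass
    return bands

def contrast_loss(lum, brightness, threshold=0.01):
    """Toy estimate of detail lost when the display is dimmed:
    energy of band coefficients pushed below a visibility threshold."""
    loss = 0.0
    for band in band_decompose(lum)[:-1]:
        invisible = np.abs(brightness * band) < threshold
        loss += np.abs(band[invisible]).sum()
    return loss
```

Lower brightness scales every band coefficient down, so more detail falls below the threshold and the estimated loss grows; an optimizer can then distribute a power budget over frames so that this loss stays uniform in time.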
Video:
Download:
Supplementary Material (6.0 MB)
Interactive Demo
Please download the package, unzip it, and execute BrightnessManagement.exe.
This version, based on the PID controller described in Section 6 of the paper, is a proof of concept showing that a fast implementation of the method is feasible. With better calibration of the PID parameters, it could lead to effective real-time, content-dependent brightness control. The scene spans diverse lighting conditions, from dark interiors to bright exteriors; feel free to explore it to get a sense of the brightness behaviour of our technique. The brightness factor [0-1] is written every 10 s to BrightnessManagement_Data/brightness.txt, together with the corresponding timestamps. Using the original scene and the brightness log, you can compare our method against your own technique by recording your gameplay session. To test on a specific video, use the offline version.
Important: the perceptual quality of this real-time version has not been validated, as explained in the limitations of the paper. In particular, it may exhibit flickering artifacts, especially when looking at very bright regions (e.g. the sky outside the house).
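For readers curious how a PID loop can steer brightness, here is a minimal, self-contained sketch. The gains, the target loss, and the clamping are illustrative assumptions, not the calibrated values used in the demo.

```python
class BrightnessPID:
    """Toy PID controller driving the brightness factor (in [0, 1])
    so that the measured contrast loss tracks a target level.
    Gains are illustrative, not the demo's calibrated values."""

    def __init__(self, kp=0.5, ki=0.05, kd=0.1, target_loss=0.2):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.target_loss = target_loss
        self.integral = 0.0
        self.prev_error = None
        self.brightness = 1.0

    def step(self, measured_loss, dt=1.0):
        # Positive error: more detail is being lost than the budget
        # allows, so brightness should go up.
        error = measured_loss - self.target_loss
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        delta = self.kp * error + self.ki * self.integral + self.kd * derivative
        # Clamp to the [0, 1] range logged by the demo.
        self.brightness = min(1.0, max(0.0, self.brightness + delta))
        return self.brightness
```

Driven against a toy loss model such as loss = 1 - brightness, the controller settles near the brightness level whose loss matches the target, rather than jumping there instantly; the derivative term is what damps the frame-to-frame changes that would otherwise read as flicker.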
Our technique could also be used, in a less constrained fashion, where temporal consistency is not critical, e.g. in the operating system during everyday tasks. A similar approach is indeed implemented in Windows, which, in addition to adjusting brightness and contrast, also changes colors based on the content being displayed.
Frame rates measured on different workstations / GPUs:
Nvidia RTX 2080 at FHD: ~60 FPS
Nvidia RTX 3090 at 4K: ~40 FPS
Nvidia GTX 960M at UWQHD: ~7 FPS
Shared GPU at WQXGA+: ~5 FPS
Web Demo
This version is similar to the Windows version. The textures have been heavily compressed to produce a web build of reasonable size. In addition, since compute shaders are not supported on WebGL, the final average-pooling step of the method is performed on the CPU; every frame must first be transferred to the CPU before pooling can take place. Consequently, this version is very slow and is intended only as a quick look at the general idea of the technique, without requiring a download. For reliable behavior, please refer to the Windows version above.
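The CPU fallback amounts to simple block averaging of the frame. A minimal NumPy sketch, where the block size and frame shape are illustrative and not the demo's actual values:

```python
import numpy as np

def average_pool(lum, pool=8):
    """Average-pool a luminance frame on the CPU, mirroring the WebGL
    fallback used when compute shaders are unavailable.
    `pool` is an illustrative block size."""
    h, w = lum.shape
    h, w = h - h % pool, w - w % pool  # crop to a multiple of the block size
    blocks = lum[:h, :w].reshape(h // pool, pool, w // pool, pool)
    return blocks.mean(axis=(1, 3))
```

On the GPU this reduction is essentially free, but here the full frame must cross the GPU-to-CPU boundary first, which is why the web build is so much slower than the native one.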
Citation:
@article{surace2025temporal,
  title     = {Temporal Brightness Management for Immersive Content},
  author    = {Surace, Luca and Condor, Jorge and Didyk, Piotr},
  publisher = {The Eurographics Association},
  year      = {2025},
  doi       = {10.2312/sr.20251183}
}
Acknowledgements:
This project has received funding from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation program (grant agreement N° 804226 PERDY).
