[Windows] Support output to HDR monitors #94496

Open · wants to merge 1 commit into base: master

Conversation

DarkKilauea
Contributor

@DarkKilauea DarkKilauea commented Jul 18, 2024

Implements: godotengine/godot-proposals#10817 for Windows.

Overview

This PR enables Godot to output to HDR-capable displays on Windows, allowing it to produce brighter images and more vibrant colors than SDR mode allows.

Testing/Sample project: https://github.com/DarkKilauea/godot-hdr-output

HDR (higher bit depth image, may not display correctly on all browsers):
godot-hdr-output_hdr

SDR:
godot-hdr-output_sdr

Sponza (HDR, higher bit depth image, may not display correctly on all browsers):
godot-hdr-output_sponza_hdr

Sponza (SDR):
godot-hdr-output_sponza_sdr

Supported Platforms:

  • Windows

Supported Graphics APIs:

  • Vulkan
  • D3D12

Supported HDR Formats:

  • HDR10
  • scRGB (Linear)

Features:

  • APIs for fetching HDR display capabilities from attached displays (see the sketch after this list).
  • Request an HDR-capable swap chain for a window at runtime.
  • Automatic luminance matching to the display for the main window on launch.
  • Support for preferring a 16-bit-per-channel swap chain for better blending and reduced color banding.
  • Editor automatically updates to use HDR when changed in project settings.
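
For illustration, the display-capability APIs from the first bullet might be queried like this (a minimal sketch; screen_is_hdr_supported and screen_get_sdr_white_level appear elsewhere in this PR, screen_get_max_full_frame_luminance is the renamed getter mentioned later in this thread, and exact signatures may differ from the final API):

func _print_hdr_caps(screen: int) -> void:
	if DisplayServer.screen_is_hdr_supported(screen):
		print("SDR white level: ", DisplayServer.screen_get_sdr_white_level(screen), " nits")
		print("Max full-frame luminance: ", DisplayServer.screen_get_max_full_frame_luminance(screen), " nits")
	else:
		print("Screen ", screen, " does not support HDR output.")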

Quirks:

  • Getting HDR display information on Windows requires Godot to be compiled with D3D support.

Follow up work:

  • Support Android
  • Support macOS with Metal
  • Add more error checking when requesting HDR output
  • Write official docs going over creating content for HDR displays and how to enable HDR output.
  • Create an official demo application.

Open Questions:

  • How should tonemap settings be configured to keep 3D scenes within the capabilities of the user's display?

Usage

Project Settings

  1. Enable display/window/hdr/enabled:
    image
  2. Enable rendering/viewport/hdr_2d:
    image
  3. Restart the editor as requested to enable the HDR framebuffer. Future changes to display/window/hdr/enabled will not require a restart.
  4. Adjust your tonemap settings to extend the brightness of your scene into the HDR range of your display (switching to Reinhard with a whitepoint of 0.5 and exposure of 3.0 works well). For 2D content, use colors that exceed 1.0 for a channel.
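
As a rough illustration of step 4, the same adjustments from a script might look like this (a minimal sketch; the WorldEnvironment and BrightSprite node names are placeholders for this example, not part of the PR):

func _configure_for_hdr() -> void:
	var env: Environment = $WorldEnvironment.environment
	env.tonemap_mode = Environment.TONE_MAPPER_REINHARD
	env.tonemap_white = 0.5
	env.tonemap_exposure = 3.0

	# 2D content reaches into the HDR range by using color values above 1.0.
	$BrightSprite.modulate = Color(2.0, 2.0, 2.0)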

Runtime

  1. First, check that HDR is available for your platform, render driver, and display:
func _is_hdr_supported(screen: int) -> bool:
	return DisplayServer.has_feature(DisplayServer.FEATURE_HDR) \
		&& RenderingServer.get_rendering_device().has_feature(RenderingDevice.SUPPORTS_HDR_OUTPUT) \
		&& DisplayServer.screen_is_hdr_supported(screen);

func _ready() -> void:
	var screen := get_window().current_screen;
	var hdr_supported := _is_hdr_supported(screen);
	if hdr_supported:
		print("HDR is supported on this screen!");
	else:
		print("HDR is not supported on this screen.");
  2. Next, if HDR is supported, we can enable it on our current window:
if hdr_supported:
	var window := get_window();
	# Enable HDR render buffers for the current viewport, which allow for color values to exceed 1.0f.
	window.use_hdr_2d = true;
	# Request HDR output to the display.
	window.hdr_output_enabled = true;
	# Set the brightness of SDR content to match the desktop.
	window.hdr_output_reference_luminance = DisplayServer.screen_get_sdr_white_level(screen);
  3. Adjust your tonemap settings to extend the brightness of your scene into the HDR range of your display (switching to Reinhard with a whitepoint of 0.5 and exposure of 3.0 works well). For 2D content, use colors that exceed 1.0 for a channel.

Help Needed

Please give this a test, either with the linked sample project or with your own projects, and give feedback. Specifically, I'm looking for input on how easy this feature was to use and whether you encountered any issues with your particular display, OS, or driver configuration.

@Calinou
Member

Calinou commented Jul 19, 2024

I gave this a quick test locally (on Windows 11 23H2 + NVIDIA 560.80 + LG C2 42"), it works as expected. This is encouraging to see, I've been wanting this for a while 🙂

I'll need to look into building more extensive scenes and getting tonemapped screenshots/videos out of this. 2D HDR also needs to be tested thoroughly.

Remember that JPEG XL or AVIF for images and AV1 for videos are a must for HDR, as other formats can only store SDR data. You may need to embed those in ZIP archives and ask users to preview them in a local media player, as GitHub doesn't allow uploading those formats and browsers often struggle displaying HDR correctly.

I noticed some issues for now:

  • Having RTX HDR enabled will mess with the HDR that is enabled in the editor. It will continuously enable and disable itself whenever you make any input in the editor (and disable itself after being idle for a second). This is also an issue on master with HDR disabled.
  • HDR Max Luminance affects both 2D (UI) and 3D rendering. Is that intended?
  • The HDR editor setting is not applied instantly when you change it, even though the demo project shows a working example of it being toggled at runtime. You can update the viewport's status based on editor settings here:
    void EditorNode::_update_from_settings() {
  • There doesn't appear to be a paperwhite setting you can use to adjust UI brightness. This is typically offered in games to prevent the UI from being too bright. Using a paperwhite value around 200 nits is common, since a lot of OLED displays cap out at that brightness level in SDR. Either way, this should be exposed in the project settings and the documentation should recommend exposing this setting to the player (just like HDR peak luminance).
    • There should also be a way for unshaded materials to base themselves on paperwhite, so that Sprite3D and Label3D used for UI purposes are not overly bright in HDR. I suppose this would be a BaseMaterial3D property or a shader render mode.
      • In the interest of compatibility, we may not be able to enable this by default in Sprite3D due to VFX usage (where HDR display can be intended), but for Label3D, we may be able to safely default to this.

See the settings exposed by the Control HDR mod for an example of a best-in-class HDR implementation (related video):

control_hdr_mod_settings.mp4

Interesting, that UI seems to use the term "paperwhite" in a different way, and has a dedicated setting for the brightness of UI and HUD elements.

@DarkKilauea DarkKilauea force-pushed the rendering/hdr-output branch 2 times, most recently from 88beb60 to 8df131d on July 19, 2024 06:30
@DarkKilauea
Contributor Author

I gave this a quick test locally (on Windows 11 23H2 + NVIDIA 560.80 + LG C2 42"), it works as expected. This is encouraging to see, I've been wanting this for a while 🙂

Thanks for taking a look!

I noticed some issues for now:

* Having RTX HDR enabled will mess with the HDR that is enabled in the editor. It will continuously enable and disable itself whenever you make any input in the editor (and disable itself after being idle for a second). This is also an issue on `master` with HDR disabled.

* See [[4.3 Beta 3] Strange editor brightness and colors caused by RTX Dynamic Vibrance affecting the editor #94231](https://github.com/godotengine/godot/issues/94231). We should see if we can forcibly disable RTX HDR and RTX Dynamic Vibrance for the editor using a NVIDIA profile. I haven't seen options for those in NVIDIA Profile Inspector so far.

Odd that NVIDIA's RTX HDR doesn't detect the HDR color space and avoid messing with the final swap chain buffer. Auto-HDR in Windows 11 appears to avoid messing with Godot when HDR is enabled. Updating the NVIDIA profile may be outside the scope of this PR and would be best done in a more focused PR.

* HDR Max Luminance affects both 2D (UI) and 3D rendering. Is that intended?

For the initial draft, yes, everything is mapped using the same tonemapper. However, we should map UI elements to a different brightness to avoid them being too bright. For now, that can be worked around with dimming the brightness of any UI elements via the theme, but I would like to fix that in this PR.

* The HDR editor setting is not applied instantly when you change it, even though the demo project shows a working example of it being toggled at runtime. You can update the viewport's status based on editor settings here: https://github.com/godotengine/godot/blob/ff8a2780ee777c2456ce42368e1065774c7c4c3f/editor/editor_node.cpp#L356

I haven't looked into configuring the editor to use HDR yet. I will do that after I figure out how to properly tone map UI elements; if you enable HDR in the editor now, the UI is a little unpleasant.

* There doesn't appear to be a paperwhite setting you can use to adjust UI brightness. This is typically offered in games to prevent the UI from being too bright. Using a paperwhite value around 200 nits is common, since a lot of OLED displays cap out at that brightness level in SDR. Either way, this should be exposed in the project settings and the documentation should recommend exposing this setting to player (just like HDR peak luminance).

Agreed, UI elements and other 2D elements should probably be mapped to a different brightness curve. I'll probably have to figure out where in the engine 3D and 2D elements are composited together and perform the tone mapping there.

  * There should also be a way for unshaded materials to base themselves on paperwhite, so that Sprite3D and Label3D used for UI purposes are not overly bright in HDR. I suppose this would be a BaseMaterial3D property or a shader render mode.

    * In the interest of compatibility, we may not be able to enable this by default in Sprite3D due to VFX usage (where HDR display can be intended), but for Label3D, we may be able to safely default to this.

That might be outside of the scope of this PR. I'm not sure how I would indicate that certain 3D elements need to be mapped using a different brightness curve once they are all combined into the same buffer. It would be similar to trying to avoid sRGB mapping certain rendered elements.

For now, this can be worked around by decreasing the brightness of the color of these elements.

See the settings exposed by the Control HDR mod for an example of a best-in-class HDR implementation (related video):
control_hdr_mod_settings.mp4

Interesting, that UI seems to use the term "paperwhite" in a different way, and has a dedicated setting for the brightness of UI and HUD elements.

Baldur's Gate 3 and Cyberpunk 2077 also have really nice HDR settings menus. I've been basing some of this work off their approach, though modifying contrast and brightness I'm leaving up to Environment since those effects are already there.

Thanks again for your comments! I'll add some TODO items to the description for tracking.

@Jamsers

Jamsers commented Aug 28, 2024

Can you use any Godot project to test this PR? Bistro-Demo-Tweaked and Crater-Province-Level both use physical light units, with light source luminosities as close to reference values as possible (i.e. the sun at noon is 100,000 lux, the moon at midnight is 0.3 lux).

I'd love to help test this PR but unfortunately I don't have HDR hardware ☹️

@alvinhochun
Contributor

I recently got a monitor that supports fake HDR (DisplayHDR 400), so I thought I could give this a try, but on Intel UHD 620 it prints "WARNING: HDR output requested but no HDR compatible format was found, falling back to SDR." and doesn't display in HDR. I kind of expected this since it is using Vulkan, but I'm a bit surprised it works for you, even in windowed mode no less. I guess there is some special handling in the NVIDIA driver?

Anyway, adding HDR output to D3D12 should be trivial and I might give it a try. (No promises!)


Shall we also consider implementing HDR display for the compatibility renderer? I am not sure if native OpenGL can do HDR, but it is very possible to implement on Windows with the help of ANGLE and some manual setting up.

@fire
Member

fire commented Aug 28, 2024

This needs a rebase on master, but I have a https://www.dell.com/en-ca/shop/alienware-34-curved-qd-oled-gaming-monitor-aw3423dw/apd/210-bcye/monitors-monitor-accessories HDR display.

I can help test.

@DarkKilauea
Contributor Author

Can you use any Godot project to test this PR? Bistro-Demo-Tweaked and Crater-Province-Level both use physical light units, and use as close to reference values for luminosity on light sources. (i.e. the sun at noon is 100000 lux, the moon at midnight is 0.3 lux)

I'd love to help test this PR but unfortunately I don't have HDR hardware ☹️

You should be able to test with any scene, though keep in mind that the realistic light units will not map directly to the brightness of the display. Consumer desktop displays typically don't go much above 1000 nits on the high end, which is far too dim to simulate sunlight. Values from the scene will be mapped to a range fitting within the max luminosity set for the window.

@DarkKilauea DarkKilauea force-pushed the rendering/hdr-output branch from b2bd1a1 to 728912f on August 29, 2024 08:49
@alvinhochun
Contributor

Here are the changes to get Rec. 2020 HDR output on D3D12: master...alvinhochun:godot:hdr-output-d3d12

@alvinhochun
Contributor


HDR (blown out a bit, looks better on an HDR display): image

SDR: image

The over-exposure in your screenshot is expected, but the colours are oversaturated because it is missing a colour space conversion. The colours need to be converted from BT.709 primaries to BT.2020 primaries. This is how it should look with the correct colours:

image

The conversion may be done with something like this:

diff --git a/servers/rendering/renderer_rd/shaders/color_space_inc.glsl b/servers/rendering/renderer_rd/shaders/color_space_inc.glsl
index 3583ee8365..76305a8a3c 100644
--- a/servers/rendering/renderer_rd/shaders/color_space_inc.glsl
+++ b/servers/rendering/renderer_rd/shaders/color_space_inc.glsl
@@ -19,6 +19,15 @@ vec3 linear_to_st2084(vec3 color, float max_luminance) {
        // max_luminance is the display's peak luminance in nits
        // we map it here to the native 10000 nits range of ST2084
        float adjustment = max_luminance * (1.0f / 10000.0f);
+       color = color * adjustment;
+
+       // Color transformation matrix values taken from DirectXTK, may need verification.
+    const mat3 from709to2020 = mat3(
+          0.6274040f, 0.0690970f, 0.0163916f,
+          0.3292820f, 0.9195400f, 0.0880132f,
+          0.0433136f, 0.0113612f, 0.8955950f
+       );
+       color = from709to2020 * color;

        // Apply ST2084 curve
        const float c1 = 0.8359375;
@@ -26,7 +35,7 @@ vec3 linear_to_st2084(vec3 color, float max_luminance) {
        const float c3 = 18.6875;
        const float m1 = 0.1593017578125;
        const float m2 = 78.84375;
-       vec3 cp = pow(abs(color.rgb * adjustment), vec3(m1));
+       vec3 cp = pow(abs(color.rgb), vec3(m1));

        return pow((c1 + c2 * cp) / (1 + c3 * cp), vec3(m2));
 }

@DarkKilauea DarkKilauea force-pushed the rendering/hdr-output branch from 728912f to 56d27a6 on August 31, 2024 02:29
@DarkKilauea DarkKilauea force-pushed the rendering/hdr-output branch from b33f414 to f49a1f6 on March 8, 2025 05:29
@DarkKilauea
Contributor Author

Hey folks, I'm playing a bit of catch up here after a busy week, so I'll be attempting to summarize the points made. Please let me know if I miss anything.

@ArchercatNEO

To accomplish this it seems to me that we should default to SDR output on SDR monitors and HDR output on HDR monitors as HDR output on SDR monitors will look bad due to imperfect tonemapping and SDR output on HDR monitors will look dull. Additionally both of those cases mean there is a performance loss due to the additional superfluous transformations and should be avoided by default.

I'm wary of enabling HDR by default on games that haven't been validated to look correct on those displays. That would be a compatibility breaking change for projects that update Godot with no good way to resolve it without obtaining an HDR capable display. I think we should continue to leave HDR off by default for now and have developers opt-in to the functionality.

I hear your points on making HDR easy to implement and I believe the current defaults do that. In order to enable HDR support, you'd request HDR output for your main window, turn on hdr_2d (though I plan to remove the need to do this before merging), and validate that your game looks correct. use_screen_luminance was added specifically to allow Godot to automatically adjust HDR to the current display so an in-game calibration display is not needed, unless the developer wants to adjust the luminance range themselves or provide a mechanism in their game to do so.

@allenwp

I have not looked at any of the tonemapping yet, but for the "Linear" tonemapper, I can see that this is effective at rendering SDR content in HDR with a higher clipping ceiling.

The "linear" mode works by making absolutely no modifications to the image before passing it to the Godot RD compositor (aside from any adjustments the environment may do). The intent here was to preserve backwards compatibility for games that are using viewports for visual effects (terrain rendering, shadows, particle collision, etc), not for display to a screen. It does have the added bonus though of not compressing the dynamic range of the underlying scene and passing it directly to the display.

I noticed this bit in the compositor's _compute_reference_multiplier:

// Default to 100 nits.
return p_reference_luminance / 100.0f;

With this PR only adding HDR support for Windows, should this instead say something like //TODO: ensure this is correct for other platforms when they add support for HDR? Or is there a certainty around why 100.0 is used on all non-Windows platforms?

In my Android implementation of this PR I noticed that, at least for pixel devices, 100 nits appeared to be the system SDR white point multiplier. I have no idea if this holds for macOS, since Apple only gives you the "headroom" between the current brightness of the display and its max. So far it seems like no platform really maps to the BT.2408 recommendation, but I have no ability to test the console platforms that may fit to broadcast standards closer.

All this to say, I think max_average_luminance should be renamed to max_full_frame_luminance, which is a much more intuitive property name and also matches the source property name.

I think that makes sense. I saw that Android calls it Max Average Luminance so I decided to follow that here, but I can see how that is confusing. I've added it to my TODO list.

And on that subject, I did also notice defaulting to/favouring VK_FORMAT_A2B10G10R10_UNORM_PACK32 over VK_FORMAT_R16G16B16A16_SFLOAT, but according to this talk at around the 29 minute mark (or slide 26), it sounds like 16 bit float is preferred on windows for performance reasons.

I picked A2B10G10R10_UNORM_PACK32 mostly on the advice of the DirectX documentation: https://learn.microsoft.com/en-us/windows/win32/direct3darticles/high-dynamic-range#option-2-use-uint10rgb10-pixel-format-and-hdr10bt2100-color-space

They suggest this format for games which seemed to fit Godot. I also wanted to avoid increasing memory usage for Godot unexpectedly. I'm happy to reconsider though with empirical evidence, I'll add to my TODO list to see if I can get some performance numbers on this.

That said, I find it strange from a usability perspective that rendering/viewport/hdr_2d must be enabled for the display/window/hdr/enabled setting to actually output HDR when it is enabled. Is it possible to force rendering/viewport/hdr_2d to be enabled (and maybe greyed out in project settings) when display/window/hdr/enabled is enabled? This way the user only needs to change one project setting to enable HDR output, rather than keeping the two settings in sync with each other.

Yep, I plan to resolve this. ClayJohn updated Godot to no longer require a restart to change hdr_2d.

@ArchercatNEO, @allenwp

In the implementation of this PR, there are 4 values that together indicate if HDR should be enabled:

  1. DisplayServer.has_feature(FEATURE_HDR) indicates that the DisplayServer implementation supports HDR. It is a feature flag that doesn't have anything to do with whether the display or OS compositor supports HDR. Its purpose is to indicate which platforms have HDR support written.
  2. RenderDevice.has_feature(SUPPORTS_HDR_OUTPUT) indicates if the implementation of the renderer in Godot supports HDR output. Again, a feature flag to indicate which renderers, like Metal, don't support HDR output yet.
  3. DisplayServer.screen_is_hdr_supported indicates that the screen has HDR support. This one value wraps up OS support, display support, and whether HDR is enabled in the OS. All must be true in order for Godot to output HDR for that display.
  4. DisplayServer.window_is_hdr_output_enabled indicates that the window requests that it output in HDR. Windows that do not opt-in to HDR rendering are left as SDR. This is primarily to avoid breaking games that haven't been validated for HDR and things like tooltips that really don't need the extra range.
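
Putting those four values together, a window will actually output HDR roughly when all of them are true. A minimal GDScript sketch (this extends the _is_hdr_supported helper from the PR description; the argument to window_is_hdr_output_enabled is assumed here and the final API may differ):

func _window_will_output_hdr(window: Window) -> bool:
	var screen := window.current_screen
	return DisplayServer.has_feature(DisplayServer.FEATURE_HDR) \
		and RenderingServer.get_rendering_device().has_feature(RenderingDevice.SUPPORTS_HDR_OUTPUT) \
		and DisplayServer.screen_is_hdr_supported(screen) \
		and DisplayServer.window_is_hdr_output_enabled(window.get_window_id())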

As for Wayland, it seems like we should use prefers_hdr for the return value of DisplayServer.screen_is_hdr_supported to only output HDR when the OS compositor requests it. I want to avoid the OS compositor automatically converting between formats to avoid issues with the implementation and avoid the extra performance/memory cost if the result isn't going to be used. I know the spec says that the OS compositor /should/ correctly convert between formats, but I have absolutely no faith that every single OS compositor will. I'm not confident /any/ of them will get it right the first time. I'd like to avoid issues being created in the Godot repo due to Wayland compositor bugs if we can.

@allenwp

I've made a hacky prototype in this branch. It's a work in progress and I've broken some things in the process.

Thanks, I'll take a look at it and some of your results. I've been playing around with some ideas as well, but without much luck.

Min luminance: It might make sense to remove min luminance from Godot entirely. I can’t currently imagine where the engine or any user scripting would need this… For our purposes, using 0.0 as a minimum in calculations makes the most sense, and just let the display clip if it has a high min luminance.

I've been getting that feeling too. I was hoping it might be useful for adjusting the shadows for the lack of low-end on the display, but I'm not sure it is worth it.

(they call it HDR Paper White and it only affects the GUI)

Yep, I'm noticing a ton of games are applying the Paper White / SDR White to only the UI. Unfortunately that is impossible to do within Godot due to how the canvas renderer is interleaved with the 3D renderer, especially when the canvas background mode is used. There is also no good way to separate 2D game elements from 2D UI; both look the same to the canvas renderer.

So, I use the Paper White as a global multiplier to set color values of 1.0 across Godot to that Paper White value, then adjust the high end to make use of the rest of the range. In all reading I've done, it looks like Godot may be unique in this method. So far, it seems to work well in matching the brightness of other UI on my displays.
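
As a rough sketch of that mapping (my own illustration, not code from the PR): SDR white (a value of 1.0) is pinned to the paper-white/reference luminance, and values above 1.0 are squeezed into the remaining headroom up to the display peak instead of clipping at reference:

func _nits_for_value(value: float, reference_luminance: float, max_luminance: float) -> float:
	if value <= 1.0:
		return value * reference_luminance
	# Anything brighter than SDR white lands in [reference_luminance, max_luminance];
	# a simple Reinhard-style squash is used here purely for illustration.
	var headroom := max_luminance - reference_luminance
	var over := value - 1.0
	return reference_luminance + headroom * over / (1.0 + over)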

@ArchercatNEO

Unless we get some more vulkan extensions it seems like vulkan + windows will not support hdr. (Unless we want to risk allowing hdr output on monitors with unknown compatibility).

Yep, DirectX must be compiled into Godot on Windows to get any HDR information about the screen. Without it, or UWP, there seems to be no way to get that information via Windows APIs. The implementation currently allows Vulkan-only builds to turn on HDR even if Godot cannot determine whether the display supports it, so those builds aren't blocked, but it's not a configuration I'd recommend.

The good news is that official builds of Godot have DirectX support compiled in, so most of the user base won't run into this issue.

Am I being overwhelming with my suggestions? Sorry if I am; I just wanted to make sure the Wayland implementation is as good as it could be. Hopefully whatever the final API looks like allows for HDR content on SDR monitors on Wayland and automatic SDR -> HDR on monitors on which it makes sense.

I'm only able to work a few hours a week on Godot, usually during the weekends. Please don't read my lack of reply as a lack of interest. I think we all have the same goal here of making HDR support in Godot as seamless and good looking as possible.

@ArchercatNEO
Contributor

Well, if you are busy at other times of the week there's nothing to be done about that, but it's good to have your feedback on this discussion.

I'm wary of enabling HDR by default on games that haven't been validated to look correct on those displays.

In both of our implementations this is configured by setting display/window/hdr/enabled to true. When it is true there is automatic sdr -> hdr if the other conditions hold. If it is false even with my modifications there would be no automatic sdr -> hdr. The difference is that I would like for the setting to not enable hdr on godot windows.

func _ready() -> void:
  # now
  assert(ProjectSettings.get_setting("display/window/hdr/enabled") == true)

  # stuff to make sure the window supports hdr
  var window := get_window()
  print(window.hdr_output_enabled) # true

  # proposed
  assert(ProjectSettings.get_setting("display/window/hdr/enabled") == true)

  # stuff to make sure the window supports hdr (same checks as above)
  print(window.hdr_output_enabled) # false

In both situations, when display/window/hdr/enabled is false, the window will be SDR unless the developer enables HDR on it manually.
The difference is that automatic SDR -> HDR no longer requires hdr_output to be true on windows. This is specifically for implementations where the difference between prefers_hdr and supports_hdr exists, which means explicit HDR must be handled differently from implicit, automatic HDR.

I know the spec says that the OS compositor /should/ correctly convert between formats, but I have absolutely no faith that every single OS compositor will. I'm not confident /any/ of them will get it right the first time

I accept that this is a reasonable concern. I also admit that I do not have the experience to determine if it is a justified concern. Therefore I will defer to @Riteo and @Zamundaaa: is this a justified concern? HDR on SDR monitors working on KWin is not enough; can we be confident all compositors will have a sufficiently robust implementation? The reason I am asking here and not in the Wayland thread is that allowing HDR on SDR monitors and automatic SDR -> HDR requires changes to this API. Otherwise it seems like our only option is to do automatic (or explicit) HDR on monitors which prefer an HDR colorspace.

use_screen_luminance

Here is probably where my inexperience with HDR game development is most prominent. If a developer has an HDR monitor and calibrates the game to look good on such a monitor (through project settings, let's say), there is no guarantee a user has the same monitor and that the game will look the same on the user's monitor. Therefore I believe screen_luminance should always take priority over static values like project settings and the BT.2408 recommendations. If we do that then we can just use screen_luminance as fallback values (or preferred values).

bt.2408 < project luminance < system luminance < explicit luminance

If a developer has explicitly enabled hdr on a window (through scripting not through the project setting) then explicit luminance values are used and if missing fallback to system luminance values. If implicit hdr has been enabled (the project setting) explicit luminance values are ignored and system luminance is preferred, falling back to project luminance values if the system cannot provide them.

This has many of the properties of use_screen_luminance but with more control. If you want to use explicit values this is still supported. If you want system values this is also supported, just don't set explicit values. If you want some system values this is possible here unlike in the use_screen_luminance case. Unless there is more value in project luminance than I currently suspect this seems like a decent fallback chain and allows for system luminance on systems which don't provide all necessary values.
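
To make the proposed chain concrete, an illustrative sketch (every name here that is not already in this PR - explicit_max_luminance, _system_max_luminance, the display/window/hdr/max_luminance project setting, screen_get_max_luminance - is hypothetical and only restates the priority order described above):

var explicit_max_luminance := 0.0  # set by the developer via scripting; 0.0 means unset

func _system_max_luminance(window: Window) -> float:
	# Would come from the screen/compositor; 0.0 means the system could not provide it.
	var screen := window.current_screen
	if DisplayServer.screen_is_hdr_supported(screen):
		return DisplayServer.screen_get_max_luminance(screen)  # assumed getter name
	return 0.0

func _pick_max_luminance(window: Window) -> float:
	var system := _system_max_luminance(window)
	if window.hdr_output_enabled and explicit_max_luminance > 0.0:
		return explicit_max_luminance  # explicit luminance (scripting)
	if system > 0.0:
		return system  # system/screen luminance
	if ProjectSettings.has_setting("display/window/hdr/max_luminance"):
		return float(ProjectSettings.get_setting("display/window/hdr/max_luminance"))  # project luminance
	return 1000.0  # BT.2408 reference peak as the last resort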

@Jamsers

Jamsers commented Mar 8, 2025

I accept that this is a reasonable concern. I also admit that I do not have the experience to determine if it is a justified concern. Therefore I will defer to @Riteo and @Zamundaaa, is this a justified concern?

This is a justified concern because for gaming especially, ideally we adhere to HGiG guidelines well. If HGiG is set up properly, it basically means the OS shouldn't touch the image, the GPU shouldn't touch the image, and the TV/display shouldn't touch the image: the game engine should have full authority and final say over output, and assumes responsibility for tonemapping towards the display's specific max nits and color space.

If a developer has an hdr monitor and calibrates the game to look good on such a monitor (through project settings lets say) there is no guarantee a user has the same monitor and the game will look the same on the user's monitor. Therefore I believe screen_luminance should always take priority over static values like project settings and the bt.2408 recommendations.

HDR grading for games doesn't work that way; values aren't static. You can try this out in games that have HDR like Cyberpunk - the max nits the game tonemaps for is a changeable value that the user can set in the game options. So is "paper white" or "midpoint", which is basically the average brightness the tonemapper tries to expose for, and is usually also the brightness of UI elements. And so is the color space, which in practice is pretty much just choosing between Dolby Vision and HDR10/10+.

Even in games that don't provide HDR options, the engine will (or should) try to autodetect the display's max nits and adjust, internally, the tonemapper's max nits, and the display's colorspace and adjust, internally, the engine's colorspace.

@ArchercatNEO
Contributor

ArchercatNEO commented Mar 8, 2025 via email

@allenwp
Contributor

allenwp commented Mar 8, 2025

Have you spent much time experimenting with this PR and other modern games with HDR support on Windows 10 or Windows 11 yet @ArchercatNEO?

@ArchercatNEO
Contributor

I must admit my experience with HDR before trying to implement it in Wayland was zero. I haven't actually tested either PR, played an HDR-capable game, or owned an HDR monitor. I just kept hearing about people wanting HDR in Wayland and that Godot hadn't implemented HDR yet, so when the protocol was finally merged I decided I could try my hand at implementing it. My perspective on how developers/users should do HDR, then, is that ideally they shouldn't: instead, Godot and the system compositor should have a solid enough implementation that developer effort and user calibration aren't required. That doesn't mean I think HDR settings menus need to be eliminated; user configuration should always take priority over what we can do at the engine level, but I'd like to have an implementation where it isn't required for Godot games to look good on HDR monitors.

@allenwp
Contributor

allenwp commented Mar 8, 2025

Ok, the reason I asked is that I think it would be most helpful to hear your thoughts in the context of specific behaviours that you have experienced when trying out this PR on Windows, but if you don't have the hardware on hand to test this, then it sounds like this might not be an option.

@DarkKilauea
Contributor Author

When screen luminance is enabled the hdr api is mostly disabled (it just raises an error and returns), but I don’t see an actual fallback for missing values.

The warnings are mostly there to let the developer know that setting luminance values has no effect if they've delegated handling luminance to Godot.

The choice has been distilled down to:

  1. Let Godot handle luminance for you, automatically configuring the tonemapper to expand the range to the user's display. This is enabled by default.
  2. Take control yourself, provide the luminance range and all the responsibility that entails to keep it updated as system config changes or the window moves around. This option enables "mastering" scenarios where the developer does not want Godot messing with the image on different displays.
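
For illustration, the two options above might look like this from a script (hdr_output_use_screen_luminance and hdr_output_max_luminance are placeholder property names for this sketch; only hdr_output_enabled and hdr_output_reference_luminance are confirmed earlier in this thread):

func _ready() -> void:
	var window := get_window()

	# Option 1: delegate luminance handling to Godot (the default).
	window.hdr_output_use_screen_luminance = true

	# Option 2: take control yourself; you are then responsible for keeping these
	# values updated as the system configuration changes or the window moves.
	window.hdr_output_use_screen_luminance = false
	window.hdr_output_max_luminance = 1000.0
	window.hdr_output_reference_luminance = 200.0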

I'm not understanding why this is incomplete. What scenarios are not supported?

@DarkKilauea
Contributor Author

In both situations, when display/window/hdr/enabled is false, the window will be SDR unless the developer enables HDR on it manually.
The difference is that automatic SDR -> HDR no longer requires hdr_output to be true on windows. This is specifically for implementations where the difference between prefers_hdr and supports_hdr exists, which means explicit HDR must be handled differently from implicit, automatic HDR.

Oh, I think I see a source of confusion here. The project setting only affects the main window and some viewports in the editor, to enable seamless authoring. It does not control "HDR" globally or enable it to be used. The developer can always ignore the project settings and enable HDR on their main window, and it will work the same way in game.

Like many project settings in Godot, it is primarily there as a convenience for getting the most common scenarios setup.

@ArchercatNEO
Contributor

It does not control "HDR" globally or enable it to be used

When I read main.cpp I saw a GLOBAL_GET for that setting which, if true, enabled HDR on the main window. Then, at some point after that, created windows inherit their HDR status from the main window. Did I misunderstand what the code change in main.cpp does? If the project setting doesn't actually enable automatic HDR for the actual exported game then I am suggesting that it do that.

@DarkKilauea DarkKilauea force-pushed the rendering/hdr-output branch from f49a1f6 to d68a6be on March 8, 2025 19:39
@ArchercatNEO
Contributor

I'm not understanding why this is incomplete. What scenarios are not supported?

Yeah, that was probably me thinking too fast. There is the footgun of disabling screen_luminance without setting values, which then defaults to the project settings, but that's a stretch. Essentially the only thing my suggestion would add is being a bit simpler (in my opinion), as it allows removing the system_luminance project setting while still keeping the benefits. There is also the case where only some of the settings are set by the developer and others by the system, which I admit seems a bit strange of a use case. Mainly it's doing the same thing but in a way that seemed a bit simpler to me. If it doesn't seem simpler to you then we could probably just drop the change.

@DarkKilauea DarkKilauea force-pushed the rendering/hdr-output branch from d68a6be to fa2fa6d on March 8, 2025 22:45
@DarkKilauea
Contributor Author

A couple quick updates:

  1. Changed default for reference_luminance to 200 as suggested by allenwp
  2. Renamed screen_get_max_average_luminance to screen_get_max_full_frame_luminance for clarity.

@Jamsers

Jamsers commented Mar 9, 2025

My perspective on how developers/users should do hdr then is ideally they shouldn't, instead godot and the system compositor should have a solid enough implementation that developer effort and user calibration aren't required.

This is misguided - unfortunately HDR isn't just some feature that you can flip on to make your SDR game prettier - it's a different presentation that fundamentally requires developer attention. You can literally see this in @allenwp's updates: you can't just take ACES/AgX SDR tonemapping output and map it to HDR - it fundamentally requires a different tonemapper. This, among other things, is a big reason why even AAA studios often get their HDR wrong.

In addition, proper game HDR usually requires the compositor (along with the display itself) to butt out and leave the game engine alone. Any interference tends to cause issues. (especially since the HDR stack is still quite brittle even nowadays)

Trying to make it so that users don't have to configure HDR is a good goal - but even then, that's achieved by making sure the engine can reliably get correct information about the display's colorspace and max nits - so that the developer can then implement functionality to take care of HDR configuration for the user.

You can't divorce the developer's hand from HDR - they're fundamentally going to have to get their hands dirty to implement HDR - so the engine's responsibility is really just to provide the developer with the tools needed for them to do their job.

What you seem to want is an auto-magic SDR to HDR feature. The good news is that does exist. Windows offers it as Auto-HDR, NVIDIA offers it as RTX HDR. However... if Wayland/Linux wants this, it's on them to implement it on their side. This is actually a compositor/driver side feature, not a game engine feature.

You also seem to want an auto-magic HDR to SDR feature. As I mentioned before, no one has implemented this for games because it doesn't make much sense - if the engine knows it's outputting to an SDR screen, why wouldn't it just use its SDR pipeline? The output will be much higher quality than a converted HDR to SDR image. That being said, if for whatever reason you still really want this, this can also be implemented as a compositor/driver side feature. The only existing implementations to reference are in video players like VLC, since auto-magic HDR to SDR is only useful for media with static, baked data, that won't necessarily have SDR data, like video and images.

@ArchercatNEO
Contributor

ArchercatNEO commented Mar 9, 2025

This is actually a compositor/driver side feature, not a game engine feature.

I would have an easier time accepting this if the automatic SDR -> HDR features didn't exist. Auto HDR and RTX HDR are compositor side, yes, but why wouldn't we be able to do precisely what they do ourselves? Also, what about use_screen_luminance? That is explicitly about releasing control of HDR to the engine so that it may do SDR -> HDR. Do you propose we remove this feature because, for good HDR, the developer must have been involved?

You also seem to want an auto-magic HDR to SDR feature.

There are 2 ways I could interpret this. Godot renders something in hdr and then tonemaps to sdr internally, which we already do. When importing hdr image formats I believe godot does just clip the hdr to an sdr range.

I could also interpret it as we send hdr to the compositor and expect the compositor to be able to convert it to sdr for the monitor. This is precisely a compositor feature like you said and the spec for wayland says that we may depend on the compositor to have this feature. We wouldn't automatically do hdr output on sdr monitors because that's not efficient but wayland has this feature so I don't see why we should actively prevent the developer from doing it on wayland.

Automatic hdr wouldn't do it on sdr monitors and we can just add documentation saying that explicit hdr (rather than the implicit, automatic hdr) should be exposed as settings to the user at which point it's up to the user to decide if hdr happens on their sdr monitor or not. If Windows doesn't have this feature, then it doesn't and we aren't required to allow it.

From the original proposal godotengine/godot-proposals#10817

While Godot internally renders in HDR when using the Forward+ and Mobile rendering methods, it currently does not support outputting to an HDR display in a way that preserves the dynamic range.

The way I interpret this is that Godot has been doing HDR rendering for a long time and has been tonemapping HDR to SDR for a long time. Is this the automagic HDR -> SDR you were talking about? Maybe the rendering pipeline changed since the proposal was made?

@Jamsers

Jamsers commented Mar 9, 2025

auto hdr and rtx hdr are compositor side yes but why wouldn't we be able to do precisely what they do ourselves?

Well actually yes, we could do it. Could be a nice feature for devs that don't want to do the legwork for proper HDR.

I will say that the fundamentals of Godot's HDR will have to be finished and released first because this feature will require that to even work in the first place. But after all's said and done, we could create a separate proposal/PR to implement this.

Also what about use_screen_luminance? That is explicitly about releasing control of hdr to the engine so that it may do sdr -> hdr. Do you propose we remove this feature because for good hdr the developer must have been involved?

There are 2 ways I could interpret this. Godot renders something in hdr and then tonemaps to sdr internally, which we already do. When importing hdr image formats I believe godot does just clip the hdr to an sdr range.

The way I interpret this is that Godot has been doing HDR rendering for a long time and has been tonemapping HDR to SDR for a long time. Is this the automagic HDR -> SDR you were talking about? Maybe the rendering pipeline changed since the proposal was made?

I think there's some confusion here... I apologize. This is a common pitfall when talking about HDR. Let me clarify.

There is "HDR" in the "Half Life 2 Lost Coast" sense - this is talking about how games, internally, represent the scene with HDR data, so that they can do fancy stuff (at least for the 2000s) like bloom and eye adaptation. We're not talking about that, although it is related. Regardless of whether the engine supports and outputs HDR or not, internally, the engine represents the scene in "HDR" either way, pre-tonemapping. This is how it's been since the 2000s.

When we talk about HDR in the here and now, we're talking about HDR displays, and the pipeline from game output to display. The HDR10 spec basically. (And potentially HDR10+ and Dolby Vision)

So when I refer to an "auto-magic SDR to HDR feature" like Windows Auto-HDR, what that does is, the compositor takes the SDR output of a game that has no HDR capabilities, or has explicitly turned off its HDR capabilities, and does realtime inverse tone mapping on that output. The compositor then takes the result of that and presents it to the display as HDR data.

When I refer to an "auto-magic HDR to SDR feature", this is taking HDR data and tonemapping that down to SDR in realtime, then presenting that SDR data to the display. This can currently only be found on video players (AFAIK), but theoretically how it could work with games is, let's say we're on a system that doesn't have HDR capabilities. If we have a game that has HDR capabilities, the compositor can "lie" to the game and say the system has HDR capabilities. The game then sends out HDR output to the compositor. The compositor will then take that HDR output, tonemap it down to SDR in realtime, and present that SDR data to the display.

I apologize if it's still not clear after this, I fully admit HDR is honestly a mess and confusing as hell, and developers are only now starting to grok all of it. There's a reason why a whole decade after the standard was introduced, only now are game developers starting to embrace it.

@Jamsers

Jamsers commented Mar 9, 2025

So, on a game that doesn't support HDR:

In engine:

  1. scene is rendered in "HDR"
  2. render is tonemapped to SDR
  3. output to compositor

In compositor:

  1. compositor receives SDR output
  2. compositor outputs SDR data to display

If HDR display is connected and HDR is enabled for the OS:

  1. compositor receives SDR output
  2. compositor composites this SDR output within the HDR desktop/buffer
  3. compositor outputs HDR data to display

(You can just composite SDR images into an HDR buffer with no tonemapping or conversion required, because SDR is a subset of HDR)

On a game that does support HDR:

In engine:

Engine asks compositor/OS if system supports HDR.

If compositor answers no, engine will just do the SDR pipeline.

If compositor answers yes:

  1. scene is rendered in "HDR"
  2. render is tonemapped to HDR.
  3. output to compositor

In compositor:

  1. compositor receives HDR output
  2. compositor outputs HDR data to display

@Jamsers

Jamsers commented Mar 9, 2025

This is where our current work lies:

If compositor answers yes:

  1. scene is rendered in "HDR"
  2. render is tonemapped to HDR.

"But wait a minute, this doesn't make sense! It's already in HDR, so why does it need to tonemap to HDR?"

Ah, therein lies the rub. The "HDR" representation that engines use internally is actually higher range than current HDR display standards! In addition, as I'm sure you know, HDR displays range from 400 nits to 1000+ nits. Tonemapping is still required, and this is where values like use_screen_luminance are used. When the engine asks the compositor/OS if the system supports HDR, if it's answered yes, the engine will also ask for the display's max nits, and the display's color space. The tonemapper will then tonemap specifically for the max nits and color space reported by the system, not to a generic, fixed HDR target like movies or images do.

"Then what's the point of the HDR options offered in the game's settings menu?"

User control over the tonemapper behavior, basically. For example, maybe, even if your TV supports 1000 nits, you only want to use 400 nits of brightness range. Maybe the TV is lying to the system about its actual capabilities (some older cheapo TVs did this). Maybe you prefer the TV's tonemapper - for example, you disable HGiG mode, set the game to output full 1000 nits even if the TV only supports 600 nits, and leave it to the TV to tonemap that down as it sees fit. Maybe you want to choose between HDR10+ and Dolby Vision.

"Where would the auto-magic SDR to HDR feature lie in all this, if we were to implement it?"

In engine:

Engine asks compositor/OS if system supports HDR.

If compositor answers no, engine will just do the SDR pipeline.

If compositor answers yes, and developer enabled the auto-magic SDR to HDR feature:

  1. scene is rendered in "HDR"
  2. render is tonemapped to SDR
  3. SDR render is inverse tonemapped to HDR
  4. output to compositor

In compositor:

  1. compositor receives HDR output
  2. compositor outputs HDR data to display

@Zamundaaa

I know the spec says that the OS compositor /should/ correctly convert between formats, but I have absolutely no faith that every single OS compositor will. I'm not confident /any/ of them will get it right the first time

On HDR displays, Windows 11 still gets sRGB wrong, and many HDR displays are terrible themselves, so I understand your hesitation, but things work differently on Wayland. Colorspaces are very strictly defined, and conversions between them are simple mathematical formulae that so far not a single compositor has gotten wrong.

Preventing conversions is a good thing in general, targeting the compositor's preferred colorspace is absolutely the right move for performance reasons, but assuming that the entire rest of the stack is completely broken and that you need to attempt to work around it before even seeing a single actual issue is a terrible approach.

You won't even always have a choice - in many cases, KDE Plasma will tell you to target a colorspace that uses the native primaries and whitepoint of the display + a gamma 2.2 transfer function with HDR headroom, rather than some standard HDR colorspace. Sometimes this is for compositor-side optimizations, but sometimes it's also just how the display works: The Steam Deck OLED for example uses a conventional gamma 2.2 transfer function.

Unless you add support for that, using Wayland instead of Vulkan to tell the compositor that you're targeting this special color space (which would be cool tbf, and could prevent color conversions on SDR displays as well), you're just gonna have to trust the compositor to convert whatever colorspace you use to what it actually needs.

hdr on sdr monitors working on kwin is not enough, can we be confident all compositors will have a sufficiently robust implementation?

I think the above answers that question, but to expand on that, I would recommend you to detect if you should prefer HDR by checking max_luminance > reference_luminance. If the compositor can make HDR work nicely on a given screen, no matter how that works under the hood, it'll just tell you by sending matching luminance values in the preferred image description.

Allowing the user or developer to manually enable HDR even when that's not the case might be nice to test the compositor's tone mapping capabilities, but isn't something you really need to care about.

Yes we should prefer dynamic metadata to static metadata when setting hdr luminances. Dynamic metadata being system/screen luminance and hdr api. Static metadata being the project settings and bt.2408 recommendations.

That's not right. Static metadata - or more specifically the part of it that's called the mastering display information - describes the display the content was made for.

On Wayland, the preferred image description tells you about the display to target, so assuming you tonemap for that, you should also use those same values as the static HDR metadata, which tells the compositor that you're targeting the display you're supposed to, and that it shouldn't do any tone mapping.

If you don't target the system provided values, then the luminance ranges the developer manually set up should be used - representing the display they optimized the game for - and the compositor will tonemap it to the display it actually gets presented on.

Dynamic metadata would give the compositor more information about the actually used brightness ranges in a given scene, which may be lower than what the target display is capable of. As this is for better tonemapping, which you want to avoid in most cases anyways, and Wayland doesn't even have an extension for dynamic HDR metadata yet, you really don't have to worry about it right now.

The "HDR" representation that engines use internally is actually higher range than current HDR display standards!

I think it's more helpful there to talk about the meaning of the brightness values rather than just their range. The internal data is normally scene-referred, it represents the brightness that things would have in the real world (assuming a "realistic" scene), and the output data takes into account the light adaptation of the viewer, and as you said of course the capabilities of the display.

@DarkKilauea DarkKilauea force-pushed the rendering/hdr-output branch from fa2fa6d to b5a3369 on March 10, 2025 03:19
@DarkKilauea
Contributor Author

Applied some of the tonemapping changes that @allenwp was playing around with since they seem to improve the result significantly.

Also gave me a chance to refactor the tonemap shader to remove parameters I was no longer using.

@allenwp
Contributor

allenwp commented Mar 10, 2025

Applied some of the tonemapping changes that @allenwp was playing around with since they seem to improve the result significantly.

Also gave me a chance to refactor the tonemap shader to remove parameters I was no longer using.

Thanks! I wanted to adapt your glow changes to this approach to review how it would behave with the different glow modes and a "Linear" tonemapper. Sometimes glow is used to simulate a physical effect (fog glow around a bright point of light), other times it is used as a psychological/perception trick to make part of an image appear brighter than the display can reproduce, or it can even be used as a stylistic effect with no physical or psychological basis.

I will reiterate that the intent behind this approach is to demonstrate that it is not possible to adapt any of the existing filmic tonemapper equations to HDR, as the contrast curve is not correctly scaled. Because Reinhard does not have a "toe", it is reasonable to adapt it to HDR using this approach, but it breaks when white < (max_luminance / ref_luminance), so it likely makes the most sense to simply create a new HDR Reinhard tonemapper that has a white = max(white, max_luminance / ref_luminance); statement, alongside the new filmic HDR tonemappers when they are introduced.

(A good example of when this breaks is setting Reinhard white to 1.0, which by definition should be identical to Linear. Another example would be setting white to 2.0, which is a reasonable value if you are targeting the Mobile rendering method: you will notice that the contrast and brightness between HDR and SDR is very different on all of the tonemappers. With higher white values, the difference in contrast is more subtle, but still incorrect in HDR mode.)

@ArchercatNEO
Contributor

Allowing the user or developer to manually enable HDR even when that's not the case might be nice to test the compositor's tone mapping capabilities, but isn't something you really need to care about.

Very well. I still believe separating "explicit" HDR (the actual scripting HDR API) from "implicit" HDR (the project setting) would be more flexible, and would allow us to distinguish monitors which both support and prefer HDR from those which support HDR but do not prefer it (likely because of a compositor feature). But since this is my primary use case, if we decide we really don't want this and nobody has a different use case for making the distinction, then my suggestions can be ignored.

That's not right. Static metadata - or more specifically the part of it that's called the mastering display information - describes the display the content was made for.

Yeah, my bad. I shouldn't have used "static metadata" and "dynamic metadata" here since they already have standard definitions in this context. I was using "dynamic" to mean runtime luminances, i.e. luminances we got while the game was running. The luminance values in this category were the "system/screen" luminances, which for most situations should be the ones we prefer, and values decided either in a settings menu or from something like a colorspace picker in an artistic app. For the settings menu the loss of performance seems fine to me since the user explicitly enabled the setting, and the artistic app needs that kind of control for mastering and such. Really my version of "dynamic" luminances just meant the system luminances in most cases, plus luminances we got from some other source not decided by the developer, the user, or an image's HDR metadata. I also said "static metadata" because it didn't change at runtime, which was also a bad move; it's just the BT.2408 luminances and the project settings. Although thinking about it, maybe we could provide mastering display information somewhere, maybe even from the project settings. But I won't push too much on that; if it turns out mastering display metadata is useful we could revisit it in a later PR.

@ArchercatNEO
Contributor

Well, I have some code examples of what my Wayland impl is, what I'd like it to be, and why it can't be implemented like that in the current state of this PR. Maybe it will make the use case clearer, maybe it won't.

Current

bool DisplayServerWayland::screen_is_hdr_supported(int p_screen) const {
	...
	// as per Zamundaaa's suggestion
	return wayland_thread.supports_hdr() && (screen->color_profile.max_luminance > screen->color_profile.sdr_white);
}

void DisplayServerWayland::_window_update_hdr_state(WindowID p_window) {
	...
	bool hdr_preferred = window->preferred_color_profile.max_luminance > window->preferred_color_profile.sdr_white;
	bool hdr_enabled = rendering_context->window_get_hdr_output_enabled(p_window);
	bool hdr_desired = wayland_thread.supports_hdr() && hdr_preferred && hdr_enabled;
}

(the hdr state is just hdr_desired)

The thing I would like to draw your attention to is that, to actually enable HDR on a window, we check whether the window's preferred profile meets the max_luminance > sdr_white criterion, but for screen_is_hdr_supported we check whether the screen's profile meets it. These are not guaranteed to be the same (and I don't know how close they are, or on how many compositors). In theory they should be pretty close, which is why I decided we could be slightly inconsistent here. The big problem is that if they actually differ we could have something like a broken HDR settings menu that's very hard to track down.

What I would like to use

bool DisplayServerWayland::screen_is_hdr_supported(int p_screen) const {
	...
	// as per the wayland spec
	return wayland_thread.supports_hdr();
}

void DisplayServerWayland::_window_update_hdr_state(WindowID p_window) {
	...
	bool system_hdr = GLOBAL_GET("display/window/hdr/enabled");
	bool hdr_preferred = window->preferred_color_profile.max_luminance > window->preferred_color_profile.sdr_white;
	bool hdr_enabled = rendering_context->window_get_hdr_output_enabled(p_window);

	if (system_hdr && hdr_preferred && wayland_thread.supports_hdr()) {
		// use screen luminances
	} else if (hdr_enabled && wayland_thread.supports_hdr()) {
		// use developer-set luminances
	} else {
		// disable hdr
	}
}

Why can I not do this? It's because the current implementation makes all windows request HDR, turning most of this into just if (true) { hdr }, which clearly is not efficient. If it did work it would solve the consistency problem: it is the developer using screen_is_hdr_supported and setting hdr_enabled, so it's good that the conditions are exactly the same. The other benefit is that we can also automatically enable HDR on certain monitors like we already do. This is why I would like display/window/hdr/enabled to not call window_set_hdr_output_enabled at startup. That change makes this implementation possible.

There are 2 other alternatives for the consistency problem:

  • Use the main window's preferred profile inside screen_is_hdr_supported. This solves the inconsistency if the screen the main window is on is the only one we care about, but isn't really what we should be doing.
  • Use the screen profile in _window_update_hdr_state. This one I'm also very against. When using "screen" luminances the wayland implementation uses the window preferred profile not the screen profile. Checking against screen luminance and using preferred luminance seems even more inconsistent a behavior.

If I still haven't convinced anyone, chances are the inconsistency won't be a problem in practice. I cannot guarantee that it won't be, but I don't expect many compositors to make the preferred profile really different from the screen profile very often (though now that I think about it a bit more, compositors can put windows on several screens, which could make this quite a bit more complicated).

@allenwp
Contributor

allenwp commented Mar 13, 2025

Regarding ACES 2 support: I had a discussion with one of the members of ACES about how things should be handled with an operating system that scales its SDR content when displaying it in an HDR signal, instead of pinning SDR content at exactly 100 nits.

https://community.acescentral.com/t/aces-2-in-game-engines-variable-reference-max-luminance/5734

The summary is: there is no recommendation on how to deal with this and it would need to be researched. Regardless, I think I came up with a way to best handle this by using max_luminance and reference_luminance values, but it is yet to be tested. I first need to write an ACES 2 reference implementation in glsl that could accept these parameters and then figure out how to simplify and approximate it.

I could say that a truly correct ACES 2 implementation would force reference_luminance to 100, but I believe this has a notable downside of creating a game that has a very different brightness between SDR and HDR output.

I could go even further to say that Godot should always force reference_luminance to 100, full stop, and this would allow ACES 2 to be absolutely correct in HDR and more obviously incorrect in SDR (because SDR would appear brighter on most systems). I don't think I agree with this; it's better to just let the developer/player set their reference_luminance to 100 if they want ACES HDR to be absolutely "correct". We simply don't have the ability to make ACES SDR game output correct on Windows when Windows is in HDR output mode.

@DarkKilauea DarkKilauea force-pushed the rendering/hdr-output branch from b5a3369 to 137b53c on March 15, 2025 22:24
Co-authored-by: Alvin Wong <alvinhochun@gmail.com>
@DarkKilauea DarkKilauea force-pushed the rendering/hdr-output branch from 137b53c to e112064 on March 21, 2025 03:07