[Windows] Support output to HDR monitors #94496
base: master
Conversation
I gave this a quick test locally (on Windows 11 23H2 + NVIDIA 560.80 + LG C2 42") and it works as expected. This is encouraging to see; I've been wanting this for a while 🙂 I'll need to look into building more extensive scenes and getting tonemapped screenshots/videos out of this. 2D HDR also needs to be tested thoroughly. Remember that JPEG XL or AVIF for images and AV1 for videos are a must for HDR, as other formats can only store SDR data. You may need to embed those in ZIP archives and ask users to preview them in a local media player, as GitHub doesn't allow uploading those formats and browsers often struggle to display HDR correctly. I noticed some issues for now:
See the settings exposed by the Control HDR mod for an example of a best-in-class HDR implementation (related video): control_hdr_mod_settings.mp4

Interesting, that UI seems to use the term "paperwhite" in a different way, and has a dedicated setting for the brightness of UI and HUD elements.
Thanks for taking a look!
Odd that NVIDIA's RTX HDR doesn't detect the HDR color space and avoid messing with the final swap chain buffer. Auto HDR in Windows 11 appears to leave Godot alone when HDR is enabled. Updating the NVIDIA profile may be outside the scope of this PR and would be best done in a more focused PR.
For the initial draft, yes, everything is mapped using the same tonemapper. However, we should map UI elements to a different brightness to avoid them being too bright. For now, that can be worked around by dimming the brightness of any UI elements via the theme (a small sketch follows this comment), but I would like to fix that in this PR.
I haven't looked into configuring the editor to use HDR yet. I will do that after I figure out how to properly tone map UI elements; if you enable HDR on the editor now, the UI is a little unpleasant.
Agreed, UI elements and other 2D elements should probably be mapped to a different brightness curve. I'll probably have to figure out where in the engine 3D and 2D elements are composited together and perform the tone mapping there.
That might be outside of the scope of this PR. I'm not sure how I would indicate that certain 3D elements need to be mapped using a different brightness curve once they are all combined into the same buffer. It would be similar to trying to avoid sRGB mapping certain rendered elements. For now, this can be worked around by decreasing the brightness of the color of these elements.
Baldur's Gate 3 and Cyberpunk 2077 also have really nice HDR settings menus. I've been basing some of this work off their approach, though I'm leaving modifying contrast and brightness up to Environment since those effects are already there. Thanks again for your comments! I'll add some TODO items to the description for tracking.
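A minimal sketch of the UI-dimming workaround mentioned above, using `CanvasItem.modulate` rather than a full theme override; the 0.6 factor is an arbitrary example value, not something taken from this PR:

```gdscript
extends Control

func _ready():
	# Dim all UI drawn by this Control and its children so it stays below the HDR
	# peak brightness; a theme with darker colors achieves a similar effect.
	modulate = Color(0.6, 0.6, 0.6)
```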
Can you use any Godot project to test this PR? Bistro-Demo-Tweaked and Crater-Province-Level both use physical light units, and use values as close to reference as possible for the luminosity of light sources (i.e. the sun at noon is 100000 lux, the moon at midnight is 0.3 lux). I'd love to help test this PR, but unfortunately I don't have HDR hardware.
I recently got a monitor that supports HDR. Anyway, adding HDR output to D3D12 should be trivial and I might give it a try. (No promises!) Shall we also consider implementing HDR display for the compatibility renderer? I am not sure if native OpenGL can do HDR, but it is very possible to implement on Windows with the help of ANGLE and some manual setting up.
This needs a rebase on master, but I have an Alienware AW3423DW (https://www.dell.com/en-ca/shop/alienware-34-curved-qd-oled-gaming-monitor-aw3423dw/apd/210-bcye/monitors-monitor-accessories) HDR display. I can help test.
You should be able to test with any scene, though keep in mind that realistic light units will not map directly to the brightness of the display. Consumer desktop displays typically don't go much above 1000 nits on the high end, which is far too dim to simulate sunlight. Values from the scene will be mapped to a range fitting within the max luminosity set for the window.
Here are the changes to get Rec. 2020 HDR output on D3D12: master...alvinhochun:godot:hdr-output-d3d12
The over-exposure in your screenshot is expected, but the colours are oversaturated because it is missing a colour space conversion. The colours need to be converted from BT.709 primaries to BT.2020 primaries. This is how it should look with the correct colours: [screenshot]

The conversion may be done with something like this:

```diff
diff --git a/servers/rendering/renderer_rd/shaders/color_space_inc.glsl b/servers/rendering/renderer_rd/shaders/color_space_inc.glsl
index 3583ee8365..76305a8a3c 100644
--- a/servers/rendering/renderer_rd/shaders/color_space_inc.glsl
+++ b/servers/rendering/renderer_rd/shaders/color_space_inc.glsl
@@ -19,6 +19,15 @@ vec3 linear_to_st2084(vec3 color, float max_luminance) {
 	// max_luminance is the display's peak luminance in nits
 	// we map it here to the native 10000 nits range of ST2084
 	float adjustment = max_luminance * (1.0f / 10000.0f);
+	color = color * adjustment;
+
+	// Color transformation matrix values taken from DirectXTK, may need verification.
+	const mat3 from709to2020 = mat3(
+			0.6274040f, 0.0690970f, 0.0163916f,
+			0.3292820f, 0.9195400f, 0.0880132f,
+			0.0433136f, 0.0113612f, 0.8955950f
+	);
+	color = from709to2020 * color;
 
 	// Apply ST2084 curve
 	const float c1 = 0.8359375;
@@ -26,7 +35,7 @@ vec3 linear_to_st2084(vec3 color, float max_luminance) {
 	const float c3 = 18.6875;
 	const float m1 = 0.1593017578125;
 	const float m2 = 78.84375;
-	vec3 cp = pow(abs(color.rgb * adjustment), vec3(m1));
+	vec3 cp = pow(abs(color.rgb), vec3(m1));
 	return pow((c1 + c2 * cp) / (1 + c3 * cp), vec3(m2));
 }
```
Hey folks, I'm playing a bit of catch up here after a busy week, so I'll be attempting to summarize the points made. Please let me know if I miss anything.
I'm wary of enabling HDR by default on games that haven't been validated to look correct on those displays. That would be a compatibility-breaking change for projects that update Godot, with no good way to resolve it without obtaining an HDR-capable display. I think we should continue to leave HDR off by default for now and have developers opt in to the functionality. I hear your points on making HDR easy to implement and I believe the current defaults do that. In order to enable HDR support, you'd request HDR output for your main window, turn on
The "linear" mode works by making absolutely no modifications to the image before passing it to the Godot RD compositor (aside from any adjustments the environment may do). The intent here was to preserve backwards compatibility for games that are using viewports for visual effects (terrain rendering, shadows, particle collision, etc), not for display to a screen. It does have the added bonus though of not compressing the dynamic range of the underlying scene and passing it directly to the display.
In my Android implementation of this PR I noticed that, at least for Pixel devices, 100 nits appeared to be the system SDR white point multiplier. I have no idea if this holds for macOS, since Apple only gives you the "headroom" between the current brightness of the display and its max. So far it seems like no platform really maps to the BT.2408 recommendation, but I have no ability to test the console platforms that may fit broadcast standards more closely.
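A rough sketch of how these per-platform values relate, assuming an Android-style SDR white in nits and a macOS-style headroom multiplier; the helper name and structure are hypothetical and not part of this PR:

```gdscript
# Hypothetical helper, not part of this PR: reconcile what different platforms report.
func derive_luminance(sdr_white_nits: float, headroom: float) -> Dictionary:
	# Android-style: the system reports SDR white directly (e.g. ~100 nits on Pixel).
	# macOS-style: only the headroom above SDR white is reported, so the peak is the product.
	var max_luminance := sdr_white_nits * headroom
	# For comparison, BT.2408 recommends a fixed 203-nit reference white.
	return {"sdr_white": sdr_white_nits, "max_luminance": max_luminance}
```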
I think that makes sense. I saw that Android calls it Max Average Luminance so I decided to follow that here, but I can see how that is confusing. I've added it to my TODO list.
I picked They suggest this format for games which seemed to fit Godot. I also wanted to avoid increasing memory usage for Godot unexpectedly. I'm happy to reconsider though with empirical evidence, I'll add to my TODO list to see if I can get some performance numbers on this.
Yep, I plan to resolve this. ClayJohn updated Godot to no longer require a restart to change. In the implementation of this PR, there are 4 values that together indicate if HDR should be enabled:
As for Wayland, it seems like we should use
Thanks, I'll take a look at it and some of your results. I've been playing around with some ideas as well, but without much luck.
I've been getting that feeling too. I was hoping it might be useful for adjusting the shadows for the lack of low-end on the display, but I'm not sure it is worth it.
Yep, I'm noticing a ton of games are applying the Paper White / SDR White to only the UI. Unfortunately that is impossible to do within Godot due to how the canvas renderer is interleaved with the 3D renderer, especially when the canvas background mode is used. There is also no good way to separate 2D game elements from 2D UI; both look the same to the canvas renderer. So, I use the Paper White as a global multiplier so that color values of 1.0 across Godot map to that Paper White value, then adjust the high end to make use of the rest of the range. In all the reading I've done, it looks like Godot may be unique in this method. So far, it seems to work well in matching the brightness of other UI on my displays.
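A minimal sketch of the mapping described above, assuming output luminance in nits; it mirrors the `linear_to_st2084` scaling shown earlier in the thread but is not the PR's actual code:

```gdscript
# Hypothetical illustration, not the PR's shader code: a linear value of 1.0 maps
# to paper_white nits; values above 1.0 are tonemapped into the remaining headroom
# up to max_luminance (shown here as a simple clamp for brevity).
func scene_to_nits(value: float, paper_white: float, max_luminance: float) -> float:
	return minf(value * paper_white, max_luminance)
```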
Yep, DirectX must be compiled into Godot on Windows to get any HDR information about the screen. Without it (or UWP), there seems to be no way to get that information via Windows APIs. The implementation currently allows Vulkan-only builds to turn on HDR even if Godot cannot determine if it is present, to unblock Vulkan-only builds, but it's not a configuration I'd recommend. The good news is that official builds of Godot have DirectX support compiled in, so most of the user base won't run into this issue.
I'm only able to work a few hours a week on Godot, usually during the weekends. Please don't read my lack of reply as a lack of interest. I think we all have the same goal here of making HDR support in Godot as seamless and good-looking as possible.
Well, if you are busy at other times of the week there's nothing to be done about that, but it's good to have your feedback on this discussion.
In both of our implementations this is configured by setting `display/window/hdr/enabled`.

```gdscript
func _ready():
	# now
	assert(ProjectSettings.get_setting("display/window/hdr/enabled") == true)
	# stuff to make sure the window supports hdr
	var window := get_window()
	print(window.hdr_output_enabled) # true
```

```gdscript
func _ready():
	# proposed
	assert(ProjectSettings.get_setting("display/window/hdr/enabled") == true)
	# stuff to make sure the window supports hdr
	var window := get_window()
	print(window.hdr_output_enabled) # false
```

In both situations when
I accept that this is a reasonable concern. I also admit that I do not have the experience to determine if it is a justified concern. Therefore I will defer to @Riteo and @Zamundaaa: is this a justified concern? HDR on SDR monitors working on KWin is not enough; can we be confident all compositors will have a sufficiently robust implementation? The reason I am asking here and not in the Wayland thread is that, for us to allow HDR on SDR monitors and automatic SDR -> HDR, it requires changes to this API. Otherwise it seems like our only option is to do automatic (or explicit) HDR on monitors which prefer an HDR colorspace.
Here is probably where my inexperience with HDR game development is most prominent. If a developer has an HDR monitor and calibrates the game to look good on such a monitor (through project settings, let's say), there is no guarantee a user has the same monitor and the game will look the same on the user's monitor. Therefore I believe
If a developer has explicitly enabled HDR on a window (through scripting, not through the project setting), then explicit luminance values are used, falling back to system luminance values if missing. If implicit HDR has been enabled (the project setting), explicit luminance values are ignored and system luminance is preferred, falling back to project luminance values if the system cannot provide them. This has many of the properties of
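A rough sketch of the fallback order proposed above; the function and argument names are hypothetical and not from either implementation:

```gdscript
# Hypothetical sketch of the proposed luminance fallback, not actual engine code.
# Each value argument is a luminance in nits, or null when unavailable.
func pick_luminance(explicit_hdr: bool, implicit_hdr: bool, developer_value, system_value, project_value):
	if explicit_hdr:
		# Scripted HDR: developer-set values win; system values fill in gaps.
		return developer_value if developer_value != null else system_value
	if implicit_hdr:
		# Project-setting HDR: system values win; project values fill in gaps.
		return system_value if system_value != null else project_value
	return null # HDR disabled
```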
This is a justified concern, because for gaming especially, ideally we adhere to HGiG guidelines well. If HGiG is set up properly, it basically means the OS shouldn't touch the image, the GPU shouldn't touch the image, and the TV/display shouldn't touch the image: the game engine should have full authority and final say over output, and assumes responsibility for tonemapping towards the display's specific max nits and color space.
HDR grading for games doesn't work that way; values aren't static. You can try this out in games that have HDR like Cyberpunk: the max nits the game tonemaps for is a changeable value that the user can set in the game options. So is "paper white" or "midpoint", which is basically the average brightness the tonemapper tries to expose for, and is usually also the brightness of UI elements. And so is the color space, which in practice is pretty much just choosing between Dolby Vision and HDR10/10+. Even in games that don't provide HDR options, the engine will (or should) try to autodetect the display's max nits and adjust the tonemapper's max nits internally, and detect the display's colorspace and adjust the engine's colorspace internally.
Yes, we should prefer dynamic metadata to static metadata when setting HDR luminances. Dynamic metadata being system/screen luminance and the HDR API; static metadata being the project settings and the BT.2408 recommendations.
The current implementation does not prefer dynamic to static.
With screen luminance disabled:
`bt.2408 < project settings < hdr api`
System luminance is just completely ignored when screen luminance is disabled.
With screen luminance enabled:
`??? < system luminance`
When screen luminance is enabled, the HDR API is mostly disabled (it just raises an error and returns), but I don't see an actual fallback for missing values.
The previous message has what I propose and how it solves fallbacks for missing system luminances.
Have you spent much time experimenting with this PR and other modern games with HDR support on Windows 10 or Windows 11 yet @ArchercatNEO?
I must admit my experience with HDR before trying to implement it in Wayland was zero. I haven't actually tested either PR, played an HDR-capable game, or owned an HDR monitor. I just keep hearing about people wanting HDR in Wayland and that Godot hadn't implemented HDR yet, so when the protocol was finally merged I decided I could try my hand at implementing it. My perspective on how developers/users should do HDR, then, is that ideally they shouldn't; instead Godot and the system compositor should have a solid enough implementation that developer effort and user calibration aren't required. That doesn't mean I think HDR settings menus need to be eliminated; user configuration should always take priority over what we can do at the engine level, but I'd like to have an implementation where it isn't required for Godot games to look good on HDR monitors.
OK, the reason I asked is that I think it would be most helpful to hear your thoughts in the context of specific behaviours that you have experienced when trying out this PR on Windows, but if you don't have the hardware on hand to test this, then it sounds like this might not be an option.
The warnings are mostly there to let the developer know that setting luminance values has no effect if they've delegated handling luminance to Godot. The choice has been distilled down to:
I'm not understanding why this is incomplete. What scenarios are not supported?
Oh, I think I see a source of confusion here. The project settings only affect the main window and some viewports in the Editor to enable seamless authoring. It does not control "HDR" globally or enable it to be used. The developer can always ignore the project settings, enable HDR on their main window, and it will work the same way in game. Like many project settings in Godot, it is primarily there as a convenience for getting the most common scenarios set up.
When I read
Yeah, that was probably me thinking too fast. There is the footgun of disabling screen_luminance without setting values, defaulting to project settings, but that's a stretch. Essentially the only thing my suggestion would add is being a bit simpler (in my opinion), as it allows removing the
A couple quick updates:
This is misguided. Unfortunately, HDR isn't just some feature that you can flip on to make your SDR game prettier; it's a different presentation that fundamentally requires developer attention. You can literally see this in @allenwp's updates: you can't just take ACES/AgX SDR tonemapping output and map it to HDR; it fundamentally requires a different tonemapper. This, among other things, is a big reason why even AAA studios often get their HDR wrong. In addition, proper game HDR usually requires the compositor (along with the display itself) to butt out and leave the game engine alone. Any interference tends to cause issues (especially since the HDR stack is still quite brittle even nowadays).

Trying to make it so that users don't have to configure HDR is a good goal, but even then, that's achieved by making sure the engine can reliably get correct information about the display's colorspace and max nits, so that the developer can then implement functionality to take care of HDR configuration for the user. You can't divorce the developer's hand from HDR; they're fundamentally going to have to get their hands dirty to implement HDR, so the engine's responsibility is really just to provide the developer with the tools needed for them to do their job.

What you seem to want is an auto-magic SDR to HDR feature. The good news is that does exist. Windows offers it as Auto HDR, NVIDIA offers it as RTX HDR. However... if Wayland/Linux wants this, it's on them to implement it on their side. This is actually a compositor/driver-side feature, not a game engine feature.

You also seem to want an auto-magic HDR to SDR feature. As I mentioned before, no one has implemented this for games because it doesn't make much sense: if the engine knows it's outputting to an SDR screen, why wouldn't it just use its SDR pipeline? The output will be much higher quality than a converted HDR to SDR image. That being said, if for whatever reason you still really want this, it can also be implemented as a compositor/driver-side feature. The only existing implementations to reference are in video players like VLC, since auto-magic HDR to SDR is only useful for media with static, baked data that won't necessarily have SDR data, like video and images.
I would have an easier time accepting this if the automatic SDR -> HDR features didn't exist. Auto HDR and RTX HDR are compositor-side, yes, but why wouldn't we be able to do precisely what they do ourselves? Also, what about use_screen_luminance? That is explicitly about releasing control of HDR to the engine so that it may do SDR -> HDR. Do you propose we remove this feature because, for good HDR, the developer must be involved?
There are 2 ways I could interpret this. Godot renders something in HDR and then tonemaps to SDR internally, which we already do. When importing HDR image formats, I believe Godot just clips the HDR to an SDR range. I could also interpret it as: we send HDR to the compositor and expect the compositor to be able to convert it to SDR for the monitor. This is precisely a compositor feature like you said, and the Wayland spec says that we may depend on the compositor to have this feature. We wouldn't automatically do HDR output on SDR monitors because that's not efficient, but Wayland has this feature, so I don't see why we should actively prevent the developer from doing it on Wayland. Automatic HDR wouldn't do it on SDR monitors, and we can just add documentation saying that explicit HDR (rather than the implicit, automatic HDR) should be exposed as settings to the user, at which point it's up to the user to decide if HDR happens on their SDR monitor or not. If Windows doesn't have this feature, then it doesn't, and we aren't required to allow it. From the original proposal godotengine/godot-proposals#10817:
The way I interpret this is that Godot has been doing HDR rendering for a long time and has been tonemapping HDR to SDR for a long time. Is this the automagic HDR -> SDR you were talking about? Maybe the rendering pipeline changed since the proposal was made?
Well, actually yes, we could do it. It could be a nice feature for devs that don't want to do the legwork for proper HDR. I will say that the fundamentals of Godot's HDR will have to be finished and released first, because this feature will require that to even work in the first place. But after all's said and done, we could create a separate proposal/PR to implement this.
I think there's some confusion here... I apologize. This is a common pitfall when talking about HDR. Let me clarify.

There is "HDR" in the "Half-Life 2: Lost Coast" sense. This is talking about how games, internally, represent the scene with HDR data, so that they can do fancy stuff (at least for the 2000s) like bloom and eye adaptation. We're not talking about that, although it is related. Regardless of whether the engine supports and outputs HDR or not, internally the engine represents the scene in "HDR" either way, pre-tonemapping. This is how it's been since the 2000s.

When we talk about HDR in the here and now, we're talking about HDR displays, and the pipeline from game output to display. The HDR10 spec, basically (and potentially HDR10+ and Dolby Vision).

So when I refer to an "auto-magic SDR to HDR feature" like Windows Auto HDR, what that does is: the compositor takes the SDR output of a game that has no HDR capabilities, or has explicitly turned off its HDR capabilities, and does realtime inverse tone mapping on that output. The compositor then takes the result of that and presents it to the display as HDR data.

When I refer to an "auto-magic HDR to SDR feature", this is taking HDR data and tonemapping it down to SDR in realtime, then presenting that SDR data to the display. This can currently only be found in video players (AFAIK), but theoretically how it could work with games is: let's say we're on a system that doesn't have HDR capabilities. If we have a game that has HDR capabilities, the compositor can "lie" to the game and say the system has HDR capabilities. The game then sends out HDR output to the compositor. The compositor will then take that HDR output, tonemap it down to SDR in realtime, and present that SDR data to the display.

I apologize if it's still not clear after this. I fully admit HDR is honestly a mess and confusing as hell, and developers are only now starting to grok all of it. There's a reason why, a whole decade after the standard was introduced, only now are game developers starting to embrace it.
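As a toy illustration of the "realtime inverse tone mapping" idea described above (not anything from this PR, and not the actual algorithm used by Windows Auto HDR or RTX HDR, which are more sophisticated), an inverse-Reinhard-style expansion could look like:

```gdscript
# Toy example only: expand an SDR value in [0, 1] above SDR white.
# `peak` is the assumed HDR peak relative to SDR white (e.g. 4.0 for ~4x headroom).
func inverse_reinhard(sdr: float, peak: float = 4.0) -> float:
	sdr = clampf(sdr, 0.0, 0.999)
	# Inverse of a Reinhard-style curve; an input of 1.0 maps to roughly `peak`.
	return sdr / (1.0 - sdr * (1.0 - 1.0 / peak))
```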
So, on a game that doesn't support HDR:

In engine:

In compositor:

If HDR display is connected and HDR is enabled for the OS:

(You can just composite SDR images into an HDR buffer with no tonemapping or conversion required, because SDR is a subset of HDR.)

On a game that does support HDR:

In engine: Engine asks compositor/OS if system supports HDR. If compositor answers no, engine will just do the SDR pipeline. If compositor answers yes:

In compositor:
This is where our current work lies:
"But wait a minute, this doesn't make sense! It's already in HDR, so why does it need to tonemap to HDR?" Ah, therein lies the rub. The "HDR" representation that engines use internally is actually higher range than current HDR display standards! In addition, as I'm sure you know, HDR displays range from 400 nits to 1000+ nits. Tonemapping is still required, and this is where values like "Then what's the point of the HDR options offered in the game's settings menu?" User control over the tonemapper behavior, basically. For example, maybe, even if your TV supports 1000 nits, you only want to use 400 nits of brightness range. Maybe the TV is lying to the system about its actual capabilities (some older cheapo TVs did this). Maybe you prefer the TV's tonemapper - for example, you disable HGiG mode, set the game to output full 1000 nits even if the TV only supports 600 nits, and leave it to the TV to tonemap that down as it sees fit. Maybe you want to choose between HDR10+ and Dolby Vision. "Where would the auto-magic SDR to HDR feature lie in all this, if we were to implement it?" In engine: Engine asks compositor/OS if system supports HDR. If compositor answers no, engine will just do the SDR pipeline. If compositor answers yes, and developer enabled the auto-magic SDR to HDR feature:
In compositor:
|
On HDR displays, Windows 11 still gets sRGB wrong, and many HDR displays are terrible themselves, so I understand your hesitation, but things work differently on Wayland. Colorspaces are very strictly defined, and conversions between them are simple mathematical formulae that so far not a single compositor has gotten wrong. Preventing conversions is a good thing in general, and targeting the compositor's preferred colorspace is absolutely the right move for performance reasons, but assuming that the entire rest of the stack is completely broken and that you need to attempt to work around it before even seeing a single actual issue is a terrible approach.

You won't even always have a choice: in many cases, KDE Plasma will tell you to target a colorspace that uses the native primaries and whitepoint of the display plus a gamma 2.2 transfer function with HDR headroom, rather than some standard HDR colorspace. Sometimes this is for compositor-side optimizations, but sometimes it's also just how the display works: the Steam Deck OLED for example uses a conventional gamma 2.2 transfer function. Unless you add support for that, using Wayland instead of Vulkan to tell the compositor that you're targeting this special color space (which would be cool tbf, and could prevent color conversions on SDR displays as well), you're just gonna have to trust the compositor to convert whatever colorspace you use to what it actually needs.
I think the above answers that question, but to expand on it, I would recommend detecting whether you should prefer HDR by checking

Allowing the user or developer to manually enable HDR even when that's not the case might be nice for testing the compositor's tone mapping capabilities, but isn't something you really need to care about.
That's not right. Static metadata - or more specifically the part of it that's called the mastering display information - describes the display the content was made for. On Wayland, the preferred image description tells you about the display to target, so assuming you tonemap for that, you should also use those same values as the static HDR metadata, which tells the compositor that you're targeting the display you're supposed to, and that it shouldn't do any tone mapping. If you don't target the system provided values, then the luminance ranges the developer manually set up should be used - representing the display they optimized the game for - and the compositor will tonemap it to the display it actually gets presented on. Dynamic metadata would give the compositor more information about the actually used brightness ranges in a given scene, which may be lower than what the target display is capable of. As this is for better tonemapping, which you want to avoid in most cases anyways, and Wayland doesn't even have an extension for dynamic HDR metadata yet, you really don't have to worry about it right now.
I think it's more helpful there to talk about the meaning of the brightness values rather than just their range. The internal data is normally scene-referred: it represents the brightness that things would have in the real world (assuming a "realistic" scene), and the output data takes into account the light adaptation of the viewer and, as you said, the capabilities of the display.
Applied some of the tonemapping changes that @allenwp was playing around with, since they seem to improve the result significantly. This also gave me a chance to refactor the tonemap shader to remove parameters I was no longer using.
Thanks! I wanted to adapt your glow changes to this approach to review how it would behave with the different glow modes and a "Linear" tonemapper. Sometimes glow is used to simulate a physical effect (fog glow around a bright point of light), other times it is used as a psychological/perception trick to make part of an image appear brighter than the display can reproduce, or it can even be used as a stylistic effect with no physical or psychological basis. I will reiterate that the intent behind this approach is to demonstrate that it is not possible to adapt any of the existing filmic tonemapper equations to HDR, as the contrast curve is not correctly scaled. Because Reinhard does not have a "toe", it is reasonable to adapt it to HDR using this approach, but it breaks when (A good example of when this breaks is setting Reinhard
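For context, the kind of adaptation being discussed resembles the textbook "extended Reinhard" curve with an adjustable white point; this is the generic formula, not allenwp's or this PR's exact implementation:

```gdscript
# Textbook extended Reinhard on luminance; `white` is the luminance that maps to 1.0.
# For HDR output the result would then be scaled by the paper white / max luminance
# of the display rather than being clamped at 1.0.
func reinhard_extended(l: float, white: float) -> float:
	return l * (1.0 + l / (white * white)) / (1.0 + l)
```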
Very well. I still believe separating "explicit" HDR (the actual scripting HDR API) and "implicit" HDR (the project setting) would be more flexible (and would allow us to distinguish monitors which both support and prefer HDR from those which support HDR but do not prefer it, likely because of a compositor feature), but since this is my primary use case, if we decide we really don't want this and nobody has a different use case for making the distinction, then my suggestions can be ignored.
Yeah, my bad. I shouldn't have used "static metadata" and "dynamic metadata" here, since they already have standard definitions in this context. I was using "dynamic" for runtime luminances, i.e. luminances we got while the game was running. The luminance values in this category were the "system/screen" luminances, which for most situations should be the ones we prefer, and values decided either in a settings menu or from something like a colorspace picker in an artistic app. For the settings menu, the loss of performance seems fine to me since the user explicitly enabled the setting, and the artistic app needs that kind of control for mastering and such. Really, my version of "dynamic" luminances just meant the system luminances in most cases, plus luminances we got from some other source not decided by the developer, the user, or an image's HDR metadata. I also said "static metadata" because it didn't change at runtime, which was also a bad move. It's just the BT.2408 luminances and the project settings. Although, thinking about it, maybe we could provide mastering display information somewhere, maybe even from the project settings themselves. But I won't push too much on that; if it turns out mastering display metadata is useful, we could possibly return to it in a later PR.
Well, I have some code examples of what my Wayland implementation is, what I'd like it to be, and why it can't be implemented like that with the current state of this PR. Maybe it will make the use case clearer, maybe it won't.

Current:

```cpp
bool DisplayServerWayland::screen_is_hdr_supported(int p_screen) const {
	...
	// as per zamundaaa's suggestion
	return wayland_thread.supports_hdr() && (screen->color_profile.max_luminance > screen->color_profile.sdr_white);
}

void DisplayServerWayland::_window_update_hdr_state(WindowID p_window) {
	...
	bool hdr_preferred = window->preferred_color_profile.max_luminance > window->preferred_color_profile.sdr_white;
	bool hdr_enabled = rendering_context->window_get_hdr_output_enabled(p_window);
	bool hdr_desired = wayland_thread.supports_hdr() && hdr_preferred && hdr_enabled;
}
```

(the hdr state is just

The thing I would like to draw your attention to is how we actually enable HDR on a window: we check if a window's preferred profile meets the

What I would like to use:

```cpp
bool DisplayServerWayland::screen_is_hdr_supported(int p_screen) const {
	...
	// as per the wayland spec
	return wayland_thread.supports_hdr();
}

void DisplayServerWayland::_window_update_hdr_state(WindowID p_window) {
	...
	bool system_hdr = GLOBAL_GET("display/window/hdr/enabled");
	bool hdr_preferred = window->preferred_color_profile.max_luminance > window->preferred_color_profile.sdr_white;
	bool hdr_enabled = rendering_context->window_get_hdr_output_enabled(p_window);
	if (system_hdr && hdr_preferred && wayland_thread.supports_hdr()) {
		//use screen luminances
	} else if (hdr_enabled && wayland_thread.supports_hdr()) {
		//use developer-set luminances
	} else {
		//disable hdr
	}
}
```

Why can I not do this? It's because the current implementation makes all windows request HDR, turning most of this into just

There are 2 other alternatives for the consistency problem:

If I still haven't convinced anyone, chances are the inconsistency probably won't be a problem. I cannot guarantee that it won't be a problem, but I don't predict many compositors making the preferred profile really different to the screen profile very often (though now that I think a bit more, compositors can put windows on several screens, which could make this quite a bit more complicated).
Regarding ACES 2 support: I had a discussion with one of the members of ACES about how things should be handled with an operating system that scales its SDR content when displaying it in an HDR signal, instead of pinning SDR content at exactly 100 nits. https://community.acescentral.com/t/aces-2-in-game-engines-variable-reference-max-luminance/5734 The summary is: there is no recommendation on how to deal with this, and it would need to be researched. Regardless, I think I came up with a way to best handle this by using

I could say that a truly correct ACES 2 implementation would force

I could go even further to say that Godot should always force
Implements: godotengine/godot-proposals#10817 for Windows.
Overview
This PR enables Godot to output to HDR-capable displays on Windows. This allows Godot to output brighter images than allowed in SDR mode, with more vibrant colors.
Testing/Sample project: https://github.com/DarkKilauea/godot-hdr-output
HDR (higher bit depth image, may not display correctly on all browsers): [screenshot]

SDR: [screenshot]

Sponza (HDR, higher bit depth image, may not display correctly on all browsers): [screenshot]

Sponza (SDR): [screenshot]
Supported Platforms:
Supported Graphics APIs:
Supported HDR Formats:
Features:
Quirks:
Follow up work:
Open Questions:
Usage
Project Settings

- `display/window/hdr/enabled`:
- `rendering/viewport/hdr_2d`:
- Changing `display/window/hdr/enabled` will not require a restart.
- (`0.5` and exposure of `3.0` works well). For 2D content, use colors that exceed 1.0 for a channel.

Runtime

- (`0.5` and exposure of `3.0` works well). For 2D content, use colors that exceed 1.0 for a channel.

Help Needed
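A minimal runtime usage sketch, assuming the window property and DisplayServer method names that appear elsewhere in this thread (`hdr_output_enabled`, `screen_is_hdr_supported`); the final API may differ:

```gdscript
func _ready():
	# Hypothetical sketch based on names used in this discussion; not a confirmed API.
	if DisplayServer.screen_is_hdr_supported(DisplayServer.SCREEN_OF_MAIN_WINDOW):
		var window := get_window()
		window.hdr_output_enabled = true
		print(window.hdr_output_enabled)
```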
Please give this a test, either with the linked sample project or with your own projects, and give feedback. Specifically, I'm looking for input on how easy this feature was to use and whether you encountered any issues with your particular display, OS, or driver configuration.