On X11, if RANDR based scale factor is higher than 20 reset it to 1 #1403
Conversation
I don't think it should be Winit's job to work around bugs in NVIDIA's proprietary driver. Bugs where RANDR misreports values should be filed with NVIDIA.
Well, then all clients would have to do the same job as e.g. alacritty here and filter out bad values from X11. It doesn't happen all the time, and the monitor itself could also report wrong values, so the NVIDIA binary driver isn't necessarily at fault here. RANDR-based calculations aren't common on X11 and I don't think anyone uses them by default; winit, however, enforces them on users by default, so if anything, every downstream client has to work around the problem. Also, you're already filtering out some RANDR metrics, so how is an extra-large DPI different? Just think of it as your monitor returning a bad value, like 16mm x 9mm, instead of something normal. So you wouldn't only be working around bugs in the NVIDIA driver, you'd also be working around bad monitors, etc. Alternatively, don't use the RANDR calculation by default.
@kchibisov The original issue mentions that the GTX1050 ran fine with Nouveau and reported proper values, while the 2050 with the proprietary driver did not. I'm guessing they tested with the same monitor, so it's either a driver issue or a hardware issue that NVIDIA's drivers should be working around. Either way, if this were, for example, the monitor misreporting the data, that still, IMO, wouldn't be Winit's job to correct. If a monitor wants to report a DPI of 720, it can go right ahead. It is up to clients to decide if they trust the number being forwarded by Winit. EDIT: Or alternatively, the NVIDIA driver providing this monitor size data should choose to correct it if it notices something wrong.
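For illustration, here is a minimal sketch of the kind of client-side guard being argued for, assuming the client simply distrusts anything above an arbitrary cutoff. The function name, cutoff, and fallback are illustrative only, not what alacritty actually does:

```rust
/// Hypothetical client-side guard (not alacritty's actual code): distrust
/// implausible scale factors coming from the windowing library and fall
/// back to 1.0. The cutoff of 20.0 mirrors the value proposed in this PR.
fn sanitize_scale_factor(reported: f64) -> f64 {
    if reported.is_finite() && reported > 0.0 && reported <= 20.0 {
        reported
    } else {
        1.0
    }
}

fn main() {
    // 718/720 is the kind of bogus value reported in the linked issues.
    assert_eq!(sanitize_scale_factor(1.5), 1.5);
    assert_eq!(sanitize_scale_factor(718.0), 1.0);
}
```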
I'd agree with you if winit were an X11-only toolkit, but it's a cross-platform one, so this forces alacritty (and not only alacritty, I think) to carry a special case for X11 in its codebase and makes things harder to maintain, because winit is fine with reporting odd values from xrandr. By that logic, also don't use the other RANDR quirks, like the check where you filter out 0mm sizes; maybe the monitor is fine with reporting 0. We also already had a similar issue about xrandr in the past, and some distros are already referring to it, because our default behavior isn't normal for X11 apps: https://wiki.gentoo.org/wiki/Alacritty So if you really don't want to filter things out, just don't use RANDR by default, like Qt does with its environment variables.
@kchibisov I think there is a major difference between filtering out 0x0mm monitors and filtering out numbers larger than an arbitrary cutoff. The former is a physical impossibility, as monitors have to be larger than 0x0mm, and so winit can confidently say that a 0x0mm sized monitor is a bug. The cutoff, on the other hand, is something arbitrary that should be decided by clients based on their use cases. I'd like to point out that the proposed cutoff point of 20 would not have resolved any of the problems mentioned by users in alacritty/alacritty#1501.
Afaik, we already check for the 0mm case. To get a scaling factor of 720 on a 5k monitor, NVIDIA's proprietary drivers would have to be reporting a monitor surface area of 1.39mm^2. I've never seen a monitor with a faulty EDID reporting something that small. If NVIDIA's proprietary driver is shitting itself at the simple task of reporting semi-realistic monitor sizes, users, as I mentioned at the start, should file a bug with them. Until then, users can use the options above to work around their broken driver. Downstream libraries, like winit, shouldn't have to work around NVIDIA's proprietary driver's inability to provide realistic screen sizes.
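As a rough sanity check on those numbers, the sketch below assumes the scale factor is derived from horizontal DPI (pixels per reported inch divided by 96); the exact surface-area figure above depends on the precise formula and resolution assumed, but the order of magnitude is the same:

```rust
// Back-of-the-envelope check, assuming the scale factor comes from the
// physical width RANDR reports: how narrow would a 5120-pixel-wide panel
// have to be for the factor to reach ~720? Roughly two millimetres.
fn scale_factor(px: f64, mm: f64) -> f64 {
    (px / (mm / 25.4)) / 96.0
}

fn main() {
    // Solve for the reported width: px * 25.4 / (factor * 96).
    let mm = 5120.0 * 25.4 / (720.0 * 96.0);
    println!("reported width for factor 720: {:.2} mm", mm); // ~1.88 mm
    println!("round trip: {:.1}", scale_factor(5120.0, mm)); // ~720.0
}
```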
I'm using this as an argument to switch the default behavior on X11, so that the RANDR computations are used only with WINIT_X11_SCALE_FACTOR=randr, like Qt does. That way users won't be surprised when the app blows up, since they would have had to toggle this behavior manually.
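A minimal sketch of that opt-in behavior, using a hypothetical helper rather than winit's actual code:

```rust
use std::env;

// Sketch of the proposed opt-in default (an assumed helper, not winit's
// actual implementation): only trust the RANDR-derived factor when the
// user explicitly asks for it via WINIT_X11_SCALE_FACTOR=randr, and
// report 1.0 otherwise, similar to how Qt gates auto-DPI behind
// environment variables.
fn x11_scale_factor(randr_factor: f64) -> f64 {
    match env::var("WINIT_X11_SCALE_FACTOR") {
        Ok(value) if value == "randr" => randr_factor,
        _ => 1.0,
    }
}

fn main() {
    println!("effective factor: {}", x11_scale_factor(1.5));
}
```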
I'm trying to say that I don't like it when users have to tweak environment variables just to make apps run. That's why I keep referring to Qt here: with Qt you opt in to auto DPI, instead of opting out as with winit.
I know, insane thing, but then clients have to work around values from winit. Though I'm not insisting on filtering; I'm fine if winit switches the default behavior to report a DPI of 1. I don't have that much to say on this issue, we can just filter out things in alacritty; it's not the end of the world if winit decides to pass broken values to us.
Just to make it clear: I'm fine if you decide to avoid filtering DPI values from RANDR, but in that case I feel like winit should follow Qt here and switch its default behavior to match Qt's, i.e. make the RANDR computation opt-in as described above.
IIRC some systems could be without RANDR, so maybe it's not a bad idea to follow Qt here? However, I'm not that familiar with X11. Anyway, feel free to close this PR if you don't like the suggested idea; I don't have many arguments left.
Any system that has updated its Xorg since 2010, hell, since 2008, should reasonably support RANDR. I will note that I have added Xinerama and multihead support here, though, for those interested.
I'd like to hear @murarth's opinion on this matter first; they are the one maintaining the platform, after all.
@murarth Do you have any opinion on the problem brought up here? Sorry for pinging, I just don't want this to stall for long.
I usually stay out of discussions about DPI matters because I don't use any high DPI displays and am not familiar with the expectations of users or application developers with regard to high DPI monitors. So, I'll try to speak generally about my perspective on this issue and leave room for the input of others.
If this is something application developers may prefer, I think an issue should be opened to propose this change and get further feedback from application developers, with special consideration for how this may affect platform parity.
I'm not opposed, in principle, to accepting a workaround such as this. winit has a few workarounds for bugs or unusual behavior of window managers, so I don't think it's unreasonable to expect a workaround for buggy behavior from some other software. However, I agree that using an arbitrary cutoff value is questionable. The only other option appears to be checking for a very small reported screen size, and in one of the above mentioned issues a user reported values that such a check would not handle cleanly either. If there's no reliable way to identify this buggy behavior that won't be prone to both false positives and false negatives, it may have to remain unresolved, and application developers will have to continue instructing users to override the scale factor with an environment variable.
I've pinged you because you were asked to give an opinion on this by @goddessfreya, no more, no less. The issue isn't really about DPI, but rather about strange behavior of RANDR in some cases, so it can happen even on low-DPI monitors, since RANDR can report whatever it wants.
Fair enough, I'll open an issue about it at some point.
This is what I'm trying to say here: you shouldn't push platform-specific bugs of a cross-platform toolkit onto application developers, since that isn't going to end well. As an app developer you'd then need a special check on some platforms that the DPI makes sense instead of using it right away, which likely leads to wrapping the call and things becoming overly complex; you can call scale_factor more than once in your app, and you'd have to work around every call to it...
The value 20 was picked because the DPR for 8k @ 5 inch is ~18.36. So it does the same thing you described about filtering out some small screens, but in a slightly different way: you don't match by screen size, but by the DPR of a theoretical small screen with a large DPR, since screens with a bigger DPR tend to have a smaller diagonal or a larger resolution. Screens with dimensions like 160mm x 90mm are also strange for X11, so maybe filtering by some physical metric could work as well (say, anything smaller than 200mm x 200mm). This won't solve the issue in general, but it lets winit work around some completely odd values.
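A short worked example of where the ~18.36 figure comes from, assuming a 16:9 panel (a sketch for illustration, not winit code):

```rust
// Worked example behind the proposed cutoff of 20: the scale factor of a
// hypothetical 8K (7680x4320), 5-inch, 16:9 panel comes out around 18.36,
// so anything above 20 is treated as a reporting error rather than a
// plausible screen.
fn main() {
    let diagonal_in = 5.0_f64;

    // Split the diagonal into width/height for a 16:9 aspect ratio.
    let diag_units = (16.0_f64 * 16.0 + 9.0 * 9.0).sqrt(); // ~18.36
    let width_in = diagonal_in * 16.0 / diag_units;
    let height_in = diagonal_in * 9.0 / diag_units;

    let dpi = 7680.0 / width_in;
    println!("DPI: {:.0}, scale factor: {:.2}", dpi, dpi / 96.0); // ~1762, ~18.36
    // The vertical direction gives the same factor: 4320 px over height_in.
    println!("vertical: {:.2}", 4320.0 / height_in / 96.0);
}
```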
Filtering by screen size strikes me as a really bad idea. There are many monitors, laptops, and touchscreens smaller than 10".
I feel like I misspoke a bit here: I was voting for 20mm x 20mm and then followed up with 160mm x 90mm, so I ended up with bad metrics, since the original issue involved metrics like 16mm x 9mm.
There are definitely Android devices with screens smaller than 20mm x 20mm.
I doubt you'd run X11 on such a thing, but that's why I initially suggested DPR-based filtering rather than filtering by physical metrics.
(Review request removed: the reviewer isn't available anymore.)
Covers #1931.
Some video drivers can set display metrics to odd values, which can result in extra large scale factors (e.g. winit reports 720 for a 5k screen on NVIDIA binary drivers), so let's just drop them to prevent clients from using them. The value 20 was picked because the DPR for 8k @ 5 inch is ~18.36.
- `cargo fmt` has been run on this branch
- `cargo doc` builds successfully
- `CHANGELOG.md` updated if knowledge of this change could be valuable to users

This PR is more of a proposal. The original issue is alacritty/alacritty#3214; as you can see in the log, the DPR from winit is 718, which is literally insane. I see two options for how to fix such issues:
1 - Make winit compute DPI via RANDR only when it is explicitly requested (e.g. WINIT_X11_SCALE_FACTOR=randr), otherwise return just 1 all the time. Qt is probably doing something similar, IIRC.
2 - Just try to ignore odd values (which is what this PR does).
The proposed option 1 is a breaking change, so I guess option 2 should fix some problems right now; however, I feel like we should also implement option 1 at some point to be more native, since auto DPI adjustments aren't something common for X11.
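For illustration, here is a minimal sketch of what option 2 amounts to. The helper, threshold constant, and example values are assumptions for the sake of the example, not winit's actual implementation:

```rust
// Sketch of option 2 (roughly what this PR proposes): compute the
// RANDR-based scale factor from the reported pixel and millimetre width,
// and if it comes out above an implausible threshold, assume the driver
// reported garbage and fall back to 1.0.
const MAX_PLAUSIBLE_SCALE_FACTOR: f64 = 20.0;

fn randr_scale_factor(width_px: u32, width_mm: u32) -> f64 {
    if width_mm == 0 {
        // 0 mm is physically impossible and is already filtered out today.
        return 1.0;
    }
    let dpi = width_px as f64 / (width_mm as f64 / 25.4);
    let factor = dpi / 96.0;
    if factor > MAX_PLAUSIBLE_SCALE_FACTOR {
        1.0
    } else {
        factor
    }
}

fn main() {
    // A sane ~600 mm wide, 2560 px monitor: factor around 1.13.
    println!("{:.2}", randr_scale_factor(2560, 600));
    // Driver garbage (a few millimetres wide): clamped to 1.0.
    println!("{:.2}", randr_scale_factor(5120, 2));
}
```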