Fractional scaling, by definition, will make stuff blurry. Clean bitmap graphics and fractional scaling are incompatible requirements.
Proof: consider a pattern of alternating black and white pixels: 0, 255, 0, 255, 0, 255, etc. This pattern cannot be scaled to a non-integer factor without introducing gray colors (i.e., by blurring out the pattern). If your app produces this pattern, then fractional scaling will blur your app.
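To make that concrete, here is a minimal sketch of what linear resampling does to that exact pattern at 1.5x. The interpolation is illustrative, but any non-integer resampler that neither drops nor duplicates pixels has to produce intermediate grays like these:

```python
# Resample an alternating 0/255 pixel row by 1.5x using linear interpolation.
# Samples that land between two source pixels become gray values.
src = [0, 255] * 8          # alternating black/white pixels
scale = 1.5
dst_len = int(len(src) * scale)

dst = []
for i in range(dst_len):
    x = i / scale                       # position in source coordinates
    x0 = min(int(x), len(src) - 1)
    x1 = min(x0 + 1, len(src) - 1)
    t = x - x0                          # blend weight between neighbors
    dst.append(round(src[x0] * (1 - t) + src[x1] * t))

print(dst[:9])  # [0, 170, 170, 0, 170, 170, 0, 170, 170] -- grays appear
```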
That is not the case on Wayland. Wayland applications with support for the fractional scaling protocol can render without any blur at any fractional scale.
This is because the protocol negotiates the size of the underlying buffer. If the client and the compositor agree on the scale, then no scaling of the buffer happens in the compositor, because the client has attached a buffer that is the exact pixel size of the window on the physical display.
It is up to the client how it implements the case you described. E.g. it could alternate between 1 black and 1 white pixel in the physical buffer or it could sometimes make two adjacent pixels the same color.
Source: I was involved in the design of this protocol and we had example clients that used this exact pattern. Chromium also supports this protocol without any blur.
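For illustration, a rough sketch of the buffer-size arithmetic behind that negotiation: in wp-fractional-scale-v1 the compositor advertises the preferred scale as an integer numerator over a fixed denominator of 120 (so 1.5x arrives as 180). The function names below are invented for the example; only the arithmetic reflects the protocol:

```python
# Hedged sketch of wp-fractional-scale-v1's buffer sizing. The client
# allocates a buffer matching the window's exact physical pixel size,
# so the compositor never has to resample it.
SCALE_DENOMINATOR = 120

def physical_buffer_size(logical_w, logical_h, preferred_scale):
    """preferred_scale is the integer the compositor sent, e.g. 180 for 1.5x."""
    w = round(logical_w * preferred_scale / SCALE_DENOMINATOR)
    h = round(logical_h * preferred_scale / SCALE_DENOMINATOR)
    return w, h

# A 640x480 logical window at 1.5x: the client attaches a 960x720 buffer
# and sets the wp_viewport destination to 640x480 logical units; buffer
# pixels then map 1:1 onto display pixels with no compositor-side scaling.
print(physical_buffer_size(640, 480, 180))  # (960, 720)
```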
This is not true in any sense, but it is what some people think when they don't know much about the underlying principles and just see fractional pixels.
A blur lowers the frequency content of an input signal; anti-aliasing represents that signal more accurately when quantizing it into discrete values.
Do some animated aliased 3D renders, then try to blur them to get the same result as an anti-aliased version; it won't match.
Look at a checkerboard pattern as it goes into the distance. The pattern eventually converges into grey if it is antialiased because the integral of everything under the pixel is grey as the squares end up smaller than a pixel. Blurring the entire frame gives a much different result.
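A toy model of that checkerboard argument, assuming a simple box filter for the anti-aliased case and a small [1,2,1] kernel for the blur (both choices are illustrative): integrating under each pixel converges to gray, while blurring a point-sampled image cannot recover it:

```python
# Checkerboard stripes smaller than a pixel (half-period 0.125px).
# Anti-aliasing integrates the signal under each pixel and converges to
# gray; point-sampling aliases the pattern to solid white, and no amount
# of blurring afterwards recovers the gray.
def checker(x, half=0.125):             # sub-pixel checker stripes
    return 255 if int(x / half) % 2 == 0 else 0

# Anti-aliased: average 16 samples per pixel (a box filter).
aa = [round(sum(checker(px + s / 16) for s in range(16)) / 16)
      for px in range(8)]
print(aa)  # [128, 128, 128, ...] -- the correct gray

# Aliased: one point sample per pixel, then a [1,2,1]/4 blur afterwards.
pt = [checker(px) for px in range(8)]
blurred = [round((pt[max(i - 1, 0)] + 2 * pt[i] + pt[min(i + 1, 7)]) / 4)
           for i in range(8)]
print(blurred)  # [255, 255, 255, ...] -- blur can't undo aliasing
```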
IMO, it looks like exactly what you'd see if you took pixel perfect rendered text and applied a small-radius Gaussian blur to it. It might look different on your screen, however (monitor settings can affect rendering quite a lot).
Not everything is a bitmap. Ordinary drawing operations operate on coordinates, so fractional scaling should not lead to any blur (although it may break some pixel-perfect designs).
In other words, vectors may be scaled with little precision loss, or they may be scaled naively (rendered to a bitmap, with the bitmap then scaled).
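As a sketch of that difference (a toy 1-D rasterizer with invented helper names): scaling the coordinates of a one-pixel line before rasterizing keeps the edge sharp, while rasterizing first and resampling the bitmap smears it:

```python
def rasterize_span(x0, x1, width):
    """Rasterize a 1-D span [x0, x1) with per-pixel coverage 0..255."""
    return [round(255 * max(0.0, min(px + 1, x1) - max(px, x0)))
            for px in range(width)]

def resample_linear(src, scale, width):
    """Naive linear resampling of an already-rasterized row."""
    out = []
    for i in range(width):
        x = i / scale
        a = min(int(x), len(src) - 1)
        b = min(a + 1, len(src) - 1)
        t = x - a
        out.append(round(src[a] * (1 - t) + src[b] * t))
    return out

# Vector path: scale coordinates first (a 1px line at x=2 becomes [3.0, 4.5)),
# then rasterize: one solid pixel plus one half-covered edge pixel.
print(rasterize_span(2 * 1.5, 3 * 1.5, 12))  # [0, 0, 0, 255, 128, 0, ...]

# Bitmap path: rasterize at 1x, then resample the bitmap by 1.5x.
# The line smears across three pixels: [0, 0, 85, 255, 85, 0, ...]
print(resample_linear(rasterize_span(2, 3, 8), 1.5, 12))
```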
Yes, but the fonts, boxes, and lines are all fine. And on the web the image issue is usually resolved by starting hi-res to begin with and sampling down, not up.
When people complain about desktop apps not scaling, they aren't complaining about the odd icon; it's the whole app looking like a smeary mess because it's not using vector-based graphics. Fonts, grid lines, etc. become blurry.
Also, don't you get the same effect (readable large fonts but alignment changes a bit) when you increase the size of your UI fonts, even without changing overall UI scaling?
Why are people (apparently) so attached to pixel alignments for OS-native GUIs?
Fractional scaling works perfectly well on Xorg Linux and Windows, but looks blurry on Wayland Linux. Maybe it's not micrometer-scale crisp, but I can't see that. Text on Wayland is very visibly blurred.
And it's not just text, but UI controls too. It looks like Wayland just renders the lower integer scale factor and then stretches the resulting bitmap image. That's bullshit.
The macOS way is better: text is still blurry, but much less so. I still don't like it, but I can use it in an emergency.
But there apparently is another way that Xorg and Windows use. I have perfectly crisp (as far as my eyes can tell) UI and text on both systems at 150% scale (27" 4K display).
> for tech debt reasons
I thought Wayland was supposed to fix the tech debt - so now it has introduced debt of its own that makes basic features impossible?
It does for any app that can't scale; all modern OSX apps can scale natively. I've been using that trick for integer-like scaling for years to deal with fractional scaling while preserving the quasi-aliasing ("crispness") of the source image.
However, Wayland does not prescribe any method for non-integer scaling. Any Wayland WM could choose to do the same thing, and it would be hardware accelerated essentially for free.
Both X11 and Wayland WMs typically don't use this trick, and neither does Windows.
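For reference, a sketch of the arithmetic behind that trick, assuming the usual description of it (render at the next integer scale, then downsample the finished frame); the function name and example sizes are illustrative:

```python
import math

def macos_style_plan(logical_w, logical_h, fractional_scale):
    """Render at the next integer scale, then downsample to the target size.
    Downsampling spreads the error as mild softness instead of aliasing."""
    render_scale = math.ceil(fractional_scale)          # e.g. 2 for 1.5x
    fb_w, fb_h = logical_w * render_scale, logical_h * render_scale
    out_w = round(logical_w * fractional_scale)
    out_h = round(logical_h * fractional_scale)
    return (fb_w, fb_h), (out_w, out_h)

# 1512x982 logical at 1.5x: render a 3024x1964 framebuffer, then scale
# the finished frame down to 2268x1473 for the display.
print(macos_style_plan(1512, 982, 1.5))
```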
It is exactly (some) Xorg apps that render blurry on Wayland. You are blaming the wrong party for “bullshit”. Xorg scaling sucks, whereas Wayland’s is great.
I don't know why this guy is being downvoted, he is correct.
You cannot scale things to non-integer amounts without incurring some damage.
Blurriness/fuzziness, ringing, screen-door effect, etc.: you cannot avoid these no matter how smart your scaling algorithm is, as long as you're upscaling to something in between 100% and 200%. Nyquist-Shannon is a bitch.
Wayland made a decision way back that non-scaling apps can only be integer scaled to avoid this defect. This was the correct decision, objectively. Unfortunately, people still choose to own monitors that have weird resolutions that do not approximate 96dpi after integer scaling.
Thankfully, 200% DPI screens (e.g. 3840x2160 at 24", a size that would normally be 1080p) are starting to become the norm, so someday this problem will go away: you will always be scaling by at least 200%, making non-integer scaling artifacts a lot less visible.
Also, I think the parent comment that enriquto replied to might be confused and is merely asking for nearest-neighbor scaling. That is not part of Wayland (which is just a protocol); it is managed entirely by the WM being used. Given that Wayland tries to enforce integer scaling, WMs that let users choose nearest-neighbor scaling would be preferable in many cases.
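To illustrate why nearest-neighbor is no free lunch at fractional factors either (a toy sketch): at 1.5x each source pixel maps to alternating one- and two-pixel runs, so a regular pattern comes out uneven even though no gray is introduced:

```python
# Nearest-neighbor upscaling at a fractional factor distorts geometry:
# source pixels get duplicated irregularly instead of being blended.
def nearest_neighbor(src, scale, width):
    return [src[min(int(i / scale), len(src) - 1)] for i in range(width)]

src = [0, 255] * 6                       # regular 1px checker
print(nearest_neighbor(src, 1.5, 18))
# -> [0, 0, 255, 0, 0, 255, ...]: equal black/white stripes become 2:1
```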
> Unfortunately, people still choose to own monitors that have weird resolutions that do not approximate 96dpi after integer scaling.
It’s not always a choice unfortunately. I buy displays that are capable of a clean 1x or 2x when I can, but there’s a ton of laptops that still need fractional scaling.
Take my Thinkpad X1 Nano. Great laptop in a lot of ways, including the screen (~500 nits brightness, excellent backlight consistency, color, and contrast, no glare) except that it runs at a resolution that requires 1.5x scaling to be usable.
Looking at replacement candidates, the only laptops with 2x screens that aren't a downgrade somehow destroy battery life (e.g. the 3000x2000 OLED panel in the Dragonfly Elite G4, which docks 3-4 hours of battery life). 1x screens in this category are for some reason all kinda crappy, with dim 350-nit backlights that start to struggle in a moderately well-lit room, which is just goofy in a portable machine that's likely to get used in bright environments.
This is one thing that MacBooks objectively do consistently better.
It's not really as necessary on PCs because of how Windows does scaling; it's only a problem in programs that just straight up don't support it. And Apple has routinely shipped, and still ships, laptops with non-integer scaled resolutions as the default (e.g., the 12" MacBook, the 13" MacBook Air).