
I’m way out of my depth here, but aren’t certain things still faster to render on the CPU? I believe it’s mostly vector graphics: I have seen a few papers showing that it can be done fast(er?) on the GPU, but that may not be widespread yet.


> but aren’t certain things still faster to render on the CPU?

It really depends on the context.

If you look at it in a certain light, fonts are just "vector graphics", and there are ways to do that without using SDFs (signed distance fields) and get really high-quality, precisely positioned fonts rendered entirely on the GPU.
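To illustrate the SDF idea the parent contrasts with, here is a toy sketch (hypothetical names, plain Python rather than a shader, and a circle standing in for a glyph, nothing like GLyphy's actual arc-based representation): a signed distance is sampled per pixel and thresholded at zero, roughly what a fragment shader does with an SDF texture.

```python
# Toy sketch of SDF rasterization: negative distance means "inside
# the shape", so thresholding the field at 0 gives the filled glyph.

def circle_sdf(x, y, cx, cy, r):
    """Signed distance from (x, y) to a circle: negative inside."""
    return ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5 - r

def rasterize(size=16):
    """Threshold the field at 0 per pixel, as a shader would."""
    rows = []
    for y in range(size):
        row = "".join(
            "#" if circle_sdf(x + 0.5, y + 0.5,
                              size / 2, size / 2, size / 3) < 0 else "."
            for x in range(size)
        )
        rows.append(row)
    return rows

for line in rasterize():
    print(line)
```

A real SDF text pipeline stores the distances in a texture atlas; quality degrades under magnification at sharp corners, which is one reason for non-SDF approaches like the one described below.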

An example of that is the GLyphy project, and we have a branch that can render fonts with it in GTK 4. It's a bit slower, but higher quality, and still easily handles 144 Hz full-screen text scrolling. It requires some preprocessing before uploading glyph curves to a data atlas on the GPU.

For other shapes there are options, but given that GTK has so few upstream developers, they aren't quite ready for prime time; they'll likely be part of GTK 5 in the form of "GskPath", if you feel like browsing the branches.


Windows 95 rendered its UI on the CPU and it was fast. You need a GPU for big bitmaps and useless effects. For a square with a border and text, the CPU is more than enough. Then you send this small square bitmap to the GPU to add shadows and whatnot.
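The workflow this comment describes can be sketched in a few lines: rasterize a small widget entirely in software, then hand the finished bitmap to the GPU for compositing. This is a hypothetical toy (names and the flat RGB-int bitmap format are made up for illustration), not any toolkit's real API.

```python
# Toy software rasterizer: fill a bitmap for a bordered square on the
# CPU. In the scheme described above, this buffer would then be
# uploaded as a texture for the GPU to composite (shadows, blending).

def draw_bordered_square(size, border, fill=0xFFFFFF, edge=0x000000):
    """Return a size x size bitmap as a list of rows of RGB ints."""
    bitmap = []
    for y in range(size):
        row = []
        for x in range(size):
            on_border = (
                x < border or y < border
                or x >= size - border or y >= size - border
            )
            row.append(edge if on_border else fill)
        bitmap.append(row)
    return bitmap

bmp = draw_bordered_square(8, 1)
```

The per-pixel loop is exactly the kind of work that stays cheap while the surface is small; the cost argument changes once you scale to full-screen surfaces at high refresh rates.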


Windows 95 doesn't even support Unicode. The level of UI you can draw there is a very small subset of what you can do in, say, Chrome.

Indeed, correct Unicode text rendering is one of the difficult problems. It's the same thing that is missing from basically all games. So when people say things like "games can run at 120fps, how hard can UI be?", it's basically the same Windows 95 argument: the UI work being done is a small subset.
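To make the difficulty concrete, here is a small stdlib-only Python illustration (not tied to any particular renderer) of why Unicode text is harder than drawing ASCII bitmaps: code points don't map 1:1 to what the user sees.

```python
# Why "one character, one glyph" breaks down with Unicode.
import unicodedata

composed = "\u00e9"      # "é" as one precomposed code point
decomposed = "e\u0301"   # "e" followed by a combining acute accent

# Both must render identically, but a naive per-code-point renderer
# would draw the second as two separate glyphs unless it shapes text.
assert composed != decomposed
assert unicodedata.normalize("NFC", decomposed) == composed

# An emoji family is five code points joined by zero-width joiners,
# yet should render as a single grapheme cluster.
family = "\U0001F468\u200D\U0001F469\u200D\U0001F467"
print(len(family))  # 5 code points, one visible "character"
```

And this is only the start: real text rendering also needs bidirectional layout, contextual shaping for scripts like Arabic, and font fallback, which is why toolkits lean on dedicated shaping libraries rather than per-glyph blitting.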


Not supporting Unicode is a feature for me. I have absolutely no use for Unicode, and it is a constant source of security vulnerabilities.


Yeah, let’s ditch all those other pesky languages and just stick with English /s


You’re in the minority with that. Even Americans love emoji.


What does that have to do with GPU rendering? The GPU does not support Unicode either.


Modern GPUs and CPUs are both Turing complete. Both are capable of supporting Unicode in software, and indeed do.



