To me, one of the most influential pieces of writing about subpixel rendering, and in particular an exploration of the ways Microsoft got it wrong, is the work of the late Maxim Shemanarev (R.I.P.), developer of Anti-Grain Geometry:
https://agg.sourceforge.net/antigrain.com/research/font_rast...
Though to be fair to this article, Microsoft did improve things with DirectWrite, and yes, the situation on Linux is unfortunately quite bad.
As a bonus, here's a pretty great article about gamma correctness in font rendering, an issue that is often overlooked even when it is acknowledged.
https://hikogui.org/2022/10/24/the-trouble-with-anti-aliasin...
Just some additional reading materials if you're interested in this sort of thing.
Subpixel rendering works completely fine on Linux. I'm using it right now, with "full" hinting and "RGB" subpixel rendering. It even works with non-integer scaling in KDE, including in Firefox when "widget.wayland.fractional-scale.enabled" is enabled.
On the other hand, subpixel rendering is absent from MacOS, which makes it very difficult to use regular ol' 1920x1080 screens with modern MacOS. Yes, those Retina displays look nice, but it's a shame that lower-res screens don't, because they work perfectly fine except for the font rendering.
My first (and last) 1920x1080 monitor was a 50 lb CRT I picked up on the side of the road, in 2003.
I haven't owned a smartphone with a screen resolution that low in over 10 years.
I think it's an amazing feat of marketing, by display companies, that people still put up with such low resolutions.
It's still a perfectly serviceable resolution.
Of course 16:9 pushed down display costs, leading to the demise of 1920x1200, which is unforgivable ;-)
Those 120 pixels were sorely missed.
You can still get 16:10, they're just classed as "business professional" models with a matching price tag.
Buy them refurbished instead, then.
Hm... I am reading this on the 1600x900 screen of my T420s Frankenpad while sitting at dusk at a German campsite. I ordered the screen some 10 years ago off Alibaba or something, and it is exactly the resolution and brightness I need. I hope I will die before this Frankenpad does, because contemporary laptops are awful in so many aspects.
You know... as you age, you really can't read all those tiny characters anyway.
It sounds like you have a proper computer anyway; do you really care about non-fixed-width fonts? Those are office suite and web browser fonts.
If something needed to be rendered in some particular way, it should have been a PDF. For everything else there’s vim.
It's true that they aren't interested. But I still like such screens. I used one quite recently, and it worked just fine for my needs.
- [deleted]
A Full HD CRT from the roadside in 2003? As if this was just a thing people had happen to them? Is this some elaborate joke I'm missing?
> I haven't owned a smartphone with a screen resolution that low
Smartphone in italics, because smartphones are known for their low pixel densities, right? What?
Did you own a smartphone at all in the past 10 years? Just double checking.
> I think it's an amazing feat of marketing, by display companies, that people still put up with such low resolutions.
And how did you reach that conclusion? Did you somehow miss display companies selling and pushing 1440p and 4K monitors left and right for more than a handful of years at this point, and yet the Steam Hardware Survey still bringing out 1080p monitors as the king month to month?
Sometimes I really do wonder if I live in a different world to others.
> As if this was just a thing people had happen to them?
No, literally, on the roadside, out for trash. Disposing of CRTs has always been expensive since they can't fit in the trash and taking them to the dump has a fee for all the lead. At the transition to LCD, they were all over the place, along with projection TVs. There was also a lot of churn when "slimmer" versions came out that mostly halved the depth required. Again, it was literally 50 lbs, and about 2 ft in depth. It took up my whole desk. It was worthless to most anyone.
> Smartphone in italics, because smartphones are known for their low pixel densities, right? What?
Over 10 years ago I had an iPhone 6 plus, with 1080p resolution. All my phones after have been higher. Their pixel densities (DPI) are actually pretty great, but since they're small, their pixel counts are on the lower side. There's nothing different about smartphone displays. The display manufacturers use the same process for all of them, with the same densities.
I think the italics are because it's so weird that most people have more pixels on the 6" display in their pocket than on the 24" display on their desk.
CRT resolution was limited more so by GPUs than the monitor itself. They don't have fixed pixels like LCDs/OLEDs do.
> GPUs than the monitor itself.
No, it was limited by the bandwidth of the beam-driving system, which the manufacturers, obviously, tried to maximize. This limit is what set the shadow mask and RGB subpixel/stripe widths. The electron beam couldn't produce different colors itself; differently colored phosphor patches were used.
But, since bandwidth is mostly resolution * refresh, you could trade between the two: more refresh, less resolution. More resolution, less refresh. Early on, you had to download a "driver" for the monitor, which had a list of the supported resolutions and refresh rates. There was eventually a protocol made to query the supported resolutions, straight from the monitor. But, you could also just make your own list (still can) and do funky resolutions and refresh rates, as long as the drive circuit could accommodate.
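For what it's worth, you can still hand-roll a mode on X11 today. A rough sketch (the output name VGA-1 is a placeholder, check `xrandr -q`, and the modeline is just what cvt prints for 1920x1080@60):

    cvt 1920 1080 60
    # copy the Modeline numbers that cvt prints, e.g.:
    xrandr --newmode "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync
    xrandr --addmode VGA-1 "1920x1080_60.00"
    xrandr --output VGA-1 --mode "1920x1080_60.00"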
This monitor could do something like 75Hz at 800x600, and I think < 60 at 1080p.
Yeah, DDC and EDID were standardized in '94, and were widely available and working well by '98 - if you were on Windows at least, running fresh hardware.
> This monitor could do something like 75Hz at 800x600, and I think < 60 at 1080p.
Assuming both modes were meant with 24-bit color ("true color"), that'd mean 17.36 Hz tops then for the FHD mode, ignoring video timing requirements. I don't think you were using that monitor at 17 Hz.
Even if you fell back to 16 bit color, that's still at most 26 Hz, which is miserable enough on a modern sample-and-hold style display, let alone on a strobed one from back in the day. And that is to say nothing of the mouse input feel.
- [deleted]
I got a 21" Hitachi Superscan Elite or Supreme around that time from a gamer.
Because that thing could only do the BIOS text modes, and standard VGA at 640x480 at 60 or 70Hz. Anything else just showed OUT OF SYNC on the OSD, and then switched off.
Except when you fed it 800x600@160Hz, 1024x768@144Hz, 1280@120Hz and 1600x1200@70 to 80Hz, or anything weird in between.
I could easily do that under XFree86 or early X.Org. A gamer under DOS/Windows rather couldn't, not even with SciTech Display Doctor, because most games at that time used the hardware directly, with only a few standard options to choose from.
OUT-OF-SYNC zing
It was nice for viewing two DIN A4 pages side by side at original size :-)
Fortunately a Matrox I had could drive that, as could a later Voodoo3 which also had excellent RAMDACs and X support.
That sounds weird and fun, although I can't seem to find the pattern that would have resulted in those numbers. 1024×768@144 (8bpc) works out to 340 MB/s, while 800×600@160 (8bpc) works out to just 230 MB/s, should have been able to refresh even faster. Or is that some other limitation that's not bandwidth? [0]
Was a bit surprised by that double A4 thing btw, but I did the math and it checks out - paper is just surprisingly small compared to desktop displays. Both size and resolution wise (1612×1209 would have put you right up to 96 ppi, with regular 1600×1200 being pretty close too). It's kind of ironic even, the typical 1080p 24" 60 Hz LCD spec that's been with us for decades since just barely fits an A4 height wise, and has a slightly lower ppi. Does have some extra space for sidebars at least, I guess.
[0] Update: ah right, it wasn't the pixel clock being run to its limit there, but the horizontal frequency.
Those are the resolutions and frequencies I remember having tested without trouble. Indeed I could go even faster on the lower ones, but didn't dare to for long, because they sometimes produced very weird high-frequency noises, sometimes unsharpness, and I didn't want to break that wonderful piece of kit.
I mostly cared about 1600x1200 at 75Hz back then. All that other stuff was just for demonstration purposes for other people coming by, or for watching videos fullscreen in PAL.
It really seemed to be made for that resolution at a reasonable frequency, with the BIOS & VGA modes implemented just so you could see startup and change settings, and all the rest just a 'side effect'.
They still had very real limitations in terms of the signal they accepted, and color CRTs specifically had discrete phosphor patches forming a fixed number of triads.
> I think it's an amazing feat of marketing, by display companies, that people still put up with such low resolutions.
Stereo audio is still fine even though 5.1 exists
300 dpi printers are still useable even though 2400 dpi printers exist
double-glass windows are still fine even though triple-glass windows exist
2-wheel drive cars are still fine even though 4-wheel drive cars exist
Just because something new appears on the market, that new thing does not need to take over from all predecessors when those predecessors are good enough for the intended purpose, especially not when the new thing comes with its own costs - power use and higher demands on GPUs, in the case of displays with higher resolution than really needed.
And feet are fine, even though shoes exist.
Fire is fine, even though ovens exist.
We're animals that are perfectly fine living naked in the wild (some still do today). It's all complete excess. Feel free to abandon the progression of tech, but I challenge you to use a modern panel for a couple of months, then try to go back to 1080p. It's like the console players who claimed 30fps was serviceable, fine. Sure, but nobody wants to go back to 30fps after they've used 60Hz, or 144Hz, for a non-negligible amount of time.
I also use a 1080p screen from time to time; it's serviceable, but it's not comfortable, and it provides a far, far inferior experience.
It works in xterm for me; I didn't enable anything special except using a vector font.
That's something that OSX doesn't even have now.
The article is from 2018 and that should be mentioned in the title
I would love to see an update on what has improved and what is the same
Yeah even the flag they are talking about doesn't exist in Chrome anymore. Skia is the only text rendering I ever suffer under Linux, so whether or not Skia works properly is the only thing that makes a difference to me.
On hi-res screens most of it is irrelevant
Yes definitely. I stopped reading after the OS X section because it was clearly talking about a different era.
One important detail is that fonts themselves carry their own hinting tables, so authors can decide how fonts will be rendered on low-DPI screens. This is tedious and expensive. And, you guessed it, many libre fonts simply don't do it right or don't have the capacity to do it at all.
That's why there can be quite a big quality jump when you compare them to fonts from the big design teams at Apple or Microsoft. Not only might the font itself be a bit worse, the rendering/hinting is often way worse.
Exactly. This is why whenever the release of a new font is being posted here (usually coding fonts), I end up being not interested due to the lack of pixel-level hinting.
Windows has the worst font rendering of all modern operating systems. Wanting anything like Windows font rendering is insane. Windows 10 makes it near impossible to properly turn off subpixel hinting without also turning off all anti-aliasing, which on a QD-OLED screen makes for horrific color fringing. Windows 11 is better, but still pretty weak. Linux is roughly as good as Mac OS, both of which are miles better than Windows.
Mac OS dropped the subpixel garbage (it really is garbage if you're at all sensitive to fringing or use anything other than a standard LCD) in favor of high pixel density screens. Sharp, readable text and zero color fringing. This is the way.
> (it really is garbage if you're at all sensitive to fringing or use anything other than a standard LCD)
Human eyes have higher spatial brightness resolution than spatial color resolution. At the cost of software complexity and mild computation overhead, a screen with a Bayer-matrix-like (or technology-appropriate) subpixel layout, together with software that properly anti-aliases content by clamping brightness and color resolution separately to appropriate values (so the screen remains capable of showing the limit frequency, and the two limits are close enough not to disturb the viewer), will give better viewing than lazily clamping brightness and color resolution to the same value, as Apple did.
If you have a non-"default" screen subpixel layout, then you need to remain able to drive each subpixel individually from the computer, and the antialiasing algorithm needs to be aware of the specific arrangement you have.
And no, until you can point me to a sub-$2000 (and at that price and that poor contrast, a minimum of 120 Hz) 35~55" screen with at least 2500:1 static contrast, a vaguely 16:9 aspect ratio (though I'll accept 4:3 with the same pixel count and density and accordingly scaled dimensions), and at least 10k individually addressed (and anti-aliased onto by the font rendering) horizontal pixels, I'll happily stay with my 11520 horizontal (sub-)pixels that I paid about $700 for (43", 5000:1 static contrast, 60Hz).
With OLEDs with funky pixel layouts starting to become more popular, I hope Windows starts making their system less crap...
I suppose this is a subjective area. I would rank Windows on top, Mac as a close second, and Linux ... well, I love Linux for reasons other than UI.
While the font rendering method matters, the differences between operating systems are typically much smaller than the differences in quality between typefaces.
Linux looked very bad in the past with a default configuration, and it still does not look good in most distributions, but that is not due to bad rendering algorithms; it is due to the fact that the default free typefaces are usually not very good.
For several decades, the first thing I have done after installing Linux is delete all the default typefaces and replace them with high-quality typefaces, most of which I have bought, plus a couple taken from a Mac OS and a Windows that I bought in the past (which I stopped using many years ago, except for the few typefaces that I kept from them).
Because of this policy, any text on my Linux computers has always looked much better than on any of the Windows or Mac OS computers that I have used at work.
It's not subjective if you use an OLED screen.
I have an OLED desktop monitor and have the same preference order as OP
Yeah I don't understand this difference of opinion here - Linux looks fine to me, Mac looks pretty and Windows looks like it's been driven over a few times.
- [deleted]
Heavily disagree as a longtime Linux user. I don't know about MacOS but Windows has always had better font rendering than Linux in my experience.
I feel exactly the same; the font rendering in Linux drives me absolutely insane. The number of hours I've spent over the past 15 years tweaking fontconfig, having to compile special patched packages, etc. doesn't bear thinking about.
I don't even have to edit anything on Windows now, and when I did a few years ago, it was only on a clean install, going through what I think was called the ClearType configuration: you get a panel of 6 images/samples to choose from, and after going through it all, everything looked pretty damn perfect.
I'm currently using a Mac for private stuff, used to use Linux for work stuff, and am currently forced to use Windows. My font rendering ranking is MacOS > Linux > Windows.
I rate it macOS > Windows > Linux. iOS is pretty good too, mobile Windows wasn't. But I'm only experiencing Linux graphically on a rather old monitor or through a terminal emulator, and macOS and Windows on a nice monitor, so that probably skews my perception. I wonder how many people observe the three OS'es through the same (or very similar) monitors.
> in favor of high pixel density screens
I wish I could download an OS update that gave me a high pixel screen! But, uhhh, that’s not how it works.
Do yourself a favour and go out and purchase a high-pixel-density screen. I don't understand people who spend a good part of their days in front of a computer (or employ others to do so) but refuse to get quality hardware. Is it worth destroying your eyes, posture, and joints to save this money? Not to speak of just the personal pleasure of using good hardware.
Restaurants spend tens or hundreds of thousands of dollars on equipment so that their staff can work more efficiently. Why don't IT people take inspiration from them and get a bit better equipment? It's not a luxury item.
> Is it worth destroying your eyes, posture, and joints to save this money?
Not high enough pixel density causes which one of these again?
> Why don't IT people take inspiration from them and get a bit better equipment?
Are we talking about the same field where people are having a competition about who can build the most overpriced and obnoxious sounding keyboards imaginable?
Bad screens and bad illumination are bad for your eyes and bad for your sleep. And they're simply not as nice as screens with full pixel density.
Some IT people and computing enthusiasts are total consumers, I don't disagree. But it's also the only profession / serious hobby where it's common for people who are doing it all day to insist on having the cheapest equipment, or half-broken hardware. That's not the case with the commenter above, whom I misunderstood in some way.
I’m currently running a 4K 32” monitor. What exact monitor do you recommend I buy for my Windows desktop?
- [deleted]
Then why are you saying you want a high pixel screen, if you already have it?
4K @ 32” is not particularly high density. It’s waaaay lower density than a phone or MacBook.
Then buy a high pixel density screen if you want to have one. They're really good for working with text and other computing related stuff, and they're not expensive.
Philips have a very good one at 27 inches.
> exact monitor
> 27 inches
Bwahahaha, and you are talking about eye strain. The number one thing I do if I want to improve QoL for my eyes is get a larger monitor so I can see more without looking like the last guy in the famous "human evolution culminating in sitting in front of a computer" pic.
[flagged]
> The only way for me not to accuse you of just straight up lying is to just wrap my thoughts into the cushion of subjectivity.
This was unnecessary.
I really don't mean it as an insult; it legitimately just comes off like people are lying. The difference in experience is so stark that I honestly cannot fathom these claims actually being truthful, certainly not in the absolute sense they are presented. So, subjectivity remains.
I can't really agree with this at all. I am a very design-heavy person who has been using a Mac professionally since the x86 transition, and have very strong opinions about font rendering, color accuracy, etc. About 2 years ago, I built a beast of a Linux workstation and use it with a 5K Apple Studio Display. Everything looks flawless and pixel perfect.
Yeah, I'm happy with my Linux font rendering for sure, and I would say generally it's more Apple-esque, so some reasonably large degree of this is just opinion and preference.
There was definitely a time when, to get good results, you had to do a lot more tweaking: setting things in fontconfig, using patched FreeType. But I haven't really experienced that for quite some time now. I do still carry a fonts.conf around to my machines basically out of habit... the only relevant thing it does now is probably disabling embedded bitmaps.
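For reference, a minimal ~/.config/fontconfig/fonts.conf that does just that (ignoring the bitmap strikes embedded in otherwise scalable fonts) looks roughly like this:

    <?xml version="1.0"?>
    <!DOCTYPE fontconfig SYSTEM "fonts.dtd">
    <fontconfig>
      <!-- skip embedded bitmap strikes, always rasterize from outlines -->
      <match target="font">
        <edit name="embeddedbitmap" mode="assign"><bool>false</bool></edit>
      </match>
    </fontconfig>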
It's always a little bit of whiplash seeing how different this very site looks when I occasionally am on a Windows machine, with Verdana rendering quite differently there, and to my mind, worse.
Could be that it changed in Linux a lot in the last few years. The article is from 2018.
One thing that used to be possible with Freetype was configuring how "heavy" hinting was: I remember the time when autohinted fonts looked the best with "light" hinting. They were smooth, non-bold and I couldn't see colour fringing either.
You could also set the RGBA vs whatever pixel layout in the same Gnome settings dialog. Easy-peasy adjustment for a VA panel.
Afterwards, it was available only in gconf/dconf or a tool like gnome-tweaks or similar.
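On recent GNOME the same knobs live under org.gnome.desktop.interface and can be set from the command line; the key names below are from current GNOME releases and may not exist on older ones (which used the settings-daemon xsettings schema):

    # hinting: none | slight | medium | full
    gsettings set org.gnome.desktop.interface font-hinting 'slight'
    # antialiasing: none | grayscale | rgba
    gsettings set org.gnome.desktop.interface font-antialiasing 'rgba'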
MacOS is definitely terrible today, but I prefer Linux over Windows still.
I think Linux font rendering looks fine (although it has noticeably gotten better since this post was last updated in 2019), but I absolutely agree that MacOS has the worst-looking font rendering. And I was using it on a genuine MacBook Pro! Discussions otherwise have convinced me that apparently font rendering just isn't objective but is opinion-based.
I haven't used displays with under ~215ppi in over 10 years. I find these subpixel opinion discussions still ongoing very... quaint. :)
So you haven't used a 32 inch 4K monitor which is ~135 ppi? What do you get at that size, a 5K or 6K monitor? Not many of those available and they have specific requirements like higher display port or thunderbolt bandwidth.
There's also an entire world of users still on 720p and 1080p displays. They deserve better font rendering even if it doesn't affect us personally.
I haven't. I have 5K and 6K monitors. That's indeed privileged, but only for a bit, until that becomes commonplace and cheap. So this all sounds like a very temporary problem at this point for something so subjective. Every time this topic comes up it's the same "but I like the look of $OS rendering the best" comments.
High PPI screens have been around for 10 years or so, and they still cost about twice as much as a standard PPI screen the same size.
Put yourself in the shoes of the average computer purchaser: Would you rather buy a high PPI monitor, or two standard PPI monitors? To me this is a no-brainer.
Maybe you can afford such a display. But I still like regular HD displays because they are cheap and functional.
Almost MacOS-tier font rendering, for free:

    FREETYPE_PROPERTIES="cff:no-stem-darkening=0 autofitter:no-stem-darkening=0"

Probably only good on high-DPI monitors though.
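If you want it to stick across sessions, one simple way (assuming your login shell sources ~/.profile) is:

    echo 'export FREETYPE_PROPERTIES="cff:no-stem-darkening=0 autofitter:no-stem-darkening=0"' >> ~/.profile

FreeType reads FREETYPE_PROPERTIES when the library is loaded, so apps need to be restarted to pick it up.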
Looking at the comments, it seems that it is very subjective. People seem to prefer what they are used to the most.
Fits with me - as a long-time Mac user I like Mac rendering; Linux feels very similar and I like it. Windows feels like somebody is burning the fonts into the LCD. It is probably more legible at tiny sizes on low-pixel screens, but it is too strong and not very elegant everywhere else.
> The traditional way of achieving this is through installing ttf-mscorefonts-installer or msttcorefonts. The msttcorefonts package looks like some Shenzhen basement knockoff that renders poorly and doesn’t support Unicode. I suspect that these fonts have gone through multiple iterations in every Windows release and that the versions available through the repositories must be from around Windows 95 days.
This is because these font files originate from a Microsoft initiative called "Core Fonts for the web" that ran between 1996 and 2002. Before web fonts became a thing, Microsoft wanted to make a set of broadly available fonts that web designers could assume everyone had on their computers. Because Microsoft cancelled the initiative, the redistributable versions of those fonts are stuck in time. They were last updated around 2000, and any updated versions with further improvements or added characters aren't freely redistributable.
This is OK for me because I use these with full (or medium) hinting and anti-aliasing off in some apps, and greyscale anti-aliasing in other apps with the v35 interpreter.
v40 with slight hinting and greyscale or subpixel works, but I don't tend to use the fonts that are meant to be used with slight hinting and later fonts can't handle anti-aliasing off at all.
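The v35/v40 choice maps onto FreeType's truetype interpreter-version property, so as a rough sketch you can pin it with the same environment variable mechanism:

    # 35 = classic bytecode interpreter, 40 = the newer minimal one
    export FREETYPE_PROPERTIES="truetype:interpreter-version=35"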
You can easily do this on a per-app basis with Flatpaks: you can set an environment variable with Flatseal, and you can drop a fontconfig folder with a custom fonts.conf into the Flatpak's var directory.
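A sketch of both, using the Flatpak CLI directly (the app ID and the source fonts.conf path here are just examples):

    flatpak override --user --env=FREETYPE_PROPERTIES="truetype:interpreter-version=35" org.mozilla.firefox
    mkdir -p ~/.var/app/org.mozilla.firefox/config/fontconfig
    cp ~/my-fonts.conf ~/.var/app/org.mozilla.firefox/config/fontconfig/fonts.conf

The second part works because Flatpak points XDG_CONFIG_HOME at ~/.var/app/<app-id>/config, which is where fontconfig looks for a per-user fonts.conf.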
What’s wrong with the v35 freetype picture? He writes like it is immediately obvious, but it seems fine.
See the jump from 17pt to 18pt? That's wrong. (Also, the small sizes are just completely obliterated IMO.) Font outlines are scalable; they should have the same relative weight no matter what pt/px size you render them at, and they should have the same proportions. Non-scalable rendering is incorrect (although techniques like hinting and gridfitting do intentionally sacrifice scalability for better legibility, but I argue you can do better in most cases.)
Rendering vector fonts to a fixed grid of pixels leads to incorrect results in principle. Introducing blur where there is a sharp edge in the vector data is also a wrong result. You can just choose which kind of wrong is more annoying: distortion due to grid-fitting, or blur due to naive rendering and antialiasing.
There is no objectively "best" way to render vector typefaces to a raster, but that's not because all of the options are equally correct; it's because options that are more accurate to the font might look subjectively worse. There's nothing "incorrect" about the fact that a raster rendering of a shape can't convey the signal with perfect fidelity, but that doesn't mean that all renderings of vectors to raster are equally correct.
Like fine, let's put aside somewhat intentional things like hinting and grid-fitting with accumulating error for a minute. Some FreeType configurations dramatically fuck up the visual weight of fonts, making the regular style in a typeface look fairly bold. The damn font looks wrong. It's not "wrong" as in I disagree with what the designers intended for the typeface; it's wrong as in it looks nothing like the designers intended and it looks nothing like the parameters you put in to render the font. There is basically no perspective where this output is desired, it's just a bad rendering.
There's definitely a bit of subjectivity in exactly where to draw the line, but there is definitely still a line you can cross that just goes into blatantly wrong territory. The relative visual weight of a glyph is not supposed to be influenced by its size on screen.
Who cares? That only matters if you have a bizarre document that is incrementing through all the font sizes.
Well, because it literally distorts the glyphs and thus doesn't actually look right, it would be like if some of the pixels on your screen were inexplicably the wrong color due to a color management issue. In some cases the distortion is really bad and doesn't even really improve legibility at all, so it's just a plain lose/lose. If you don't give a shit about typography in the least and don't care about the visual weight of text then fine, but not caring doesn't mean the behavior is correct by any means. (And keep in mind, you will often have more than one font size of text on screen at once, so this distortion will change the relative weight of fonts incorrectly, aside from also distorting the actual shape of glyphs.)
But OK, other than just being incorrect, does it matter? Many people don't have proper color management in their software and it's usually fine. Well, yes, sometimes it matters. For one thing, this issue really screwed up scaling in Win32 and even GTK+2, because if you tried to render dialogs with different font sizes it would completely change the UI and screw up some of the component sizing. OK, though, you can fix that by just not using a fixed layout. However, you still run into this problem if you want to render something that actually does have a specific layout. The most obvious example of how this can be a serious problem is something like Microsoft Word that is meant to give you a WYSIWYG view of a document on paper, but the paper is 300+ DPI and the poor screen is only 96 DPI.
Maybe most importantly, this is all pointless! We don't actually have to settle for these concessions for Latin script text on 96 DPI screens. Seriously, we really don't. I recommend this (old) article for a dive into the problems caused by non-scalable font rendering and how it could've probably been solved all along:
https://agg.sourceforge.net/antigrain.com/research/font_rast...
(Though to be fair, there are still problems with the approach of vertical-only hinting, as it does cause distortion too.)
The art of drawing pixels generally appears to elude free software. It's always kind of sucked. If you're talking about compute shaders, it's OK, but the moment it hits the screen, ouch!
Rendering can become extremely nuanced and finicky. Sometimes it can be solved with better, harder to implement algorithms and sometimes it requires tradeoffs that lead to good design. All of this needs time and time is a resource that is scarce in open source.
Related. Others?
The sad state of font rendering on Linux - https://news.ycombinator.com/item?id=41812358 - Oct 2024 (18 comments)
The sad state of font rendering on Linux - https://news.ycombinator.com/item?id=19312404 - March 2019 (167 comments)
- [deleted]
I prefer to use non-Unicode bitmap fonts on Linux. It works fine in programs that support them; unfortunately many programs don't support them in all contexts (in some cases, bitmap fonts work in some places but not others). When I write my own programs, I try to ensure that non-Unicode bitmap fonts work.
Font rendering isn't the bottleneck on Linux; IMO it's the terrible default font choices of distro and desktop environment makers.
I still don't know why MacOS can't render sharp text on 1440p external monitor.
Because Apple doesn't sell 1440p monitors and they don't care about Non-Apple hardware.
Well see Apple doesn't sell 1440p external monitors, so the answer is you should stop using that trash and move to one that brute forces the sharpness problem and costs way too much because it has an Apple logo on it
Cute, but irrelevant. I just searched Amazon for "4k monitor 27 inch" and there are any number of highly rated screens from recognizable brands for around $200.
You don't have to splurge to get a 2160p monitor that a Mac will love.
Those monitors kind of suck. You can only get $200 if it's a crappy VA panel at 60hz. My 1440p monitor is a lot better, but apparently not supported by Mac for ideal rendering.
- [deleted]
Because they want to sell you a Retina(tm) monitor. I wish I was making a joke.
Roll back to a version of OSX that predates Retina, and _all_ of your monitors get the expected Mac-like font rendering, Retina or not. Go to 10.7 or newer, and all monitors are run using the Retina tuning for font rendering, which makes it very smeary and blurry on normal monitors, but looks great on anything that triggers Retina rendering.
So, what I've been advising to the fewer and fewer Mac owners I know that want multi-monitor: only buy 4k monitors, OSX thinks they're HiDPI and won't fuck over your font rendering. At least, they won't today.
Because they dumped subpixel rendering from their OS.
Because 1440p is not a sharp external monitor.
Ever heard of subpixel rendering? You can have pretty sharp text at even 90 ppi if your OS supports it. MacOS doesn't, probably because they didn't want the complexity of supporting it throughout their compositing/graphics stack, and also likely because Apple doesn't sell any low ppi displays.
This is excellent. I know what they mean about Windows, because not all Linux distros support ClearType-like font rendering out of the box.
(2019)
In the era of 4k screens, modern Linux distros have great font rendering and I won't take Windows as an example of "good rendering", unless font distortion because of strong hinting is a metric of quality. It is just atrocious to my eyes.
It's pretty easy, to be honest: have a high-enough-resolution screen, enable greyscale mode (instead of subpixel), and turn off hinting. Usually only the latter has to be changed in the settings, as many Linux users still use 1080p screens that benefit from font hinting.
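If your desktop doesn't expose those settings, a minimal fontconfig sketch for that combination (greyscale anti-aliasing, hinting off) in ~/.config/fontconfig/fonts.conf would be roughly:

    <?xml version="1.0"?>
    <!DOCTYPE fontconfig SYSTEM "fonts.dtd">
    <fontconfig>
      <match target="font">
        <edit name="antialias" mode="assign"><bool>true</bool></edit>
        <edit name="hinting" mode="assign"><bool>false</bool></edit>
        <!-- "none" = greyscale anti-aliasing instead of subpixel -->
        <edit name="rgba" mode="assign"><const>none</const></edit>
      </match>
    </fontconfig>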
It's 2025, and the font rendering I can achieve on Linux is best for my purposes: clarity, readability, especially of small fonts. So yes, on a web page, or in a text editor, I do prefer small distortions due to hinting over fuzzy "exact" shapes any day. Text is here to be read, not marveled at.
If I need to do typographic work, I can zoom in enough for this to not matter, or just print a proof on a laser printer; no screen is going to have 1200dpi any time soon anyway.
Windows 10 takes the second place (I have no Win 11 machines around to compare), and macOS is still only usable on retina screens.
https://en.wikipedia.org/wiki/Infinality_bundle took care of that.
Until it got more or less resolved by upstream with modernized components, and made redundant.
I keep hearing people talk about how Linux font rendering is supposedly so bad, but I simply haven't noticed any issues with it since switching from Windows on my home machine over 3 years ago.
> There is not even a hint of any consistency in the rendering either, thickness is all over the place even within a single glyph, with different strokes “sticking” together because of the lack of pixels:
Only the "H" looks even a little bit wrong to me.
> As you can see here, indeed, OS X had sub-pixel anti-aliasing in High Sierra, which provided less boldness and bluriness with somewhat better consistency in glyph thickness. However, colour fringing on High Sierra is rather apparent. Rendering is still rather blurry, closer to the FreeType auto-hinter than to what I would consider an optimal result.
The "H" looks just as wrong with and without this feature to me. Overall the new version without the feature is indeed a little bit "bolder" and "blurrier" - and given that it's white on dark grey, I'm pretty sure I prefer it that way.
> Thickness linearity between font sizes on the second image is fantastic. But compare the overall thickness at standard web font size (16px) to any other option and you will see that this comes at the cost of making everything bold by default:
Only the v35 version looks noticeably "less bold" to me here, although the autohinted version is perhaps a bit more blurry. But it's hard to imagine how "thickness linearity" could ever be accomplished without causing this sort of blurriness.
But maybe I'm just unbothered because I grew up with "luggable" Mac displays and bitmap fonts....
> With current state of Linux, it does not matter which engine you pick. They all are broken in the same way:
I searched the page for matching words (or at least letter combinations) and the actual rendered text isn't showing the same issues for me as in the screen capture.
> By the time I was updating this post in August 2019, Cairo received support for sub-pixel positioning in both xlib and image compositors. This means that GTK will soon have it too, as well as Pango and basically anything that relies on Cairo to render text. I am looking forward to the next Ubuntu LTS and might make a separate post about compiling this into the current Ubuntu LTS.
... Ah, I guess that must have happened, then.
----
On the other hand, GTK has caused me all kinds of problems. The default scrollbar theming is obnoxious, and if you fix it, it still doesn't seem to be consistently applied. Firefox does its own thing unless you look up an obscure about:config setting, and even then it still seems to mess up. GTK offers this really weird default style for window tabs(?) that required complete relearning, and that's with Cinnamon being supposedly designed for maximum Windows-alike-ness. Then there's the continued battle from GNOME to try to deny proper notifications and/or a system tray to everyone else. And don't even get me started on the file chooser.
@dang, the post is from 2018, so adding (2018) to the title may help, as the current state of font rendering on Linux is pretty fine.
Added, thanks!