
MacOS Mojave removes subpixel anti-aliasing, making non-retina displays blurry (reddit.com) 215 points by laurentdc 6 hours ago | 188 comments


ridiculous_fish 2 hours ago [-] ex-MacOS SWE here. Subpixel antialiasing is obnoxious to implement. It requires threading physical geometry up through multiple graphics layers, geometry which is screen-dependent (think multi-monitor). It multiplies your glyph caches: glyph * subpixel offset. It requires knowing your foreground and background colors at render time, which is an unnatural requirement when you want to do GPU-accelerated compositing. There's tons of ways to fall off of the subpixel antialiased quality path, and there's weird graphical artifacts when switching from static to animated text, or the other way. What a pain! Nevertheless there's no denying that subpixel-AA text looks better on 1x displays. Everyone notices when it's not working, and macOS will look worse without it (on 1x displays). reply
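There's a compact way to see the "multiplies your glyph caches" point. A minimal sketch, assuming glyph positions are quantized to four horizontal subpixel offsets; the names and numbers are illustrative, not Apple's actual renderer:

    # Sketch: subpixel positioning multiplies the glyph cache.
    # Hypothetical names/values; not Apple's actual implementation.
    from functools import lru_cache

    SUBPIXEL_STEPS = 4  # quantize horizontal pen position into quarter pixels

    @lru_cache(maxsize=4096)
    def rasterize(glyph_id: int, size_px: int, subpixel_bucket: int):
        # Each (glyph, size) now needs up to SUBPIXEL_STEPS cached bitmaps,
        # one per fractional offset, instead of a single bitmap.
        return f"bitmap({glyph_id}, {size_px}, offset={subpixel_bucket}/4)"

    def draw_glyph(glyph_id: int, size_px: int, x: float):
        bucket = int((x % 1.0) * SUBPIXEL_STEPS)  # fractional pixel position
        return rasterize(glyph_id, size_px, bucket)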

omarforgotpwd 0 minutes ago [-] "Listen guys the code for this is a mess. Can you just buy the new Macbook?" (love my latest gen Macbook Pro, but this is the classic problem with Apple's approach to product design)

reply

captainmuon 1 hour ago [-] Personally, I think that is moving in the wrong direction. This is the reason text looks bad on transformed HTML elements in browsers, or when you click a button on Windows and it tilts slightly. Text rendering should be aware of the (sub)pixel grid, even post-transform. While we're at it, this can apply to vector art, too. If we ever want to have a hope of computers looking like, e.g., this Fluent Design mockup [1] -- which is clearly window-based, but with the windows tilted slightly and shaded to give them a physical appearance -- we'd have to go in that direction. If we stay in the model where everything is a texture that gets composited, the only thing we can do is use higher resolution / retina displays. Also, one thing I've never seen discussed is virtual displays inside a VR world. The VR headset has plenty of pixels, but you can't simulate a full HD screen at a few feet of distance, let alone a retina screen. [1] https://cdn0.tnwcdn.com/wp-content/blogs.dir/1/files/2017/05...

reply

akvadrako 1 minute ago [-] Obviously the right direction is replacing sub-pixels with pixels. Maybe you meant this is premature?

reply

SCdF 47 minutes ago [-] > Subpixel antialiasing is obnoxious to implement

Totally, but

> there's no denying that subpixel-AA text looks better on 1x displays.

I get that it's hard, but at the end of the day what users are going to notice is, when they dock their laptop at work, their display is going to look like fuzzy crap. Maybe most people won't notice? It's hard to tell without seeing a side-by-side comparison.

reply

eloisant 34 minutes ago [-] I guess Apple just wants everyone to buy retina displays, and doesn't care about people using what it believes is "legacy" hardware.

reply

ksec 3 minutes ago [-] This wouldn't be a problem if 1) they had a Retina MacBook Air, and 2) they actually had a monitor line with Retina+ displays, super great colour accuracy, etc. On the 1st point Apple doesn't seem to care, and on the 2nd, Apple should at least have a list of monitors that work best or are up to their "standards", or on sale at the Apple Store. These lapses in attention to detail across its ecosystem weren't as obvious when Steve was here, but recently cracks have started to show everywhere.

reply

akvadrako 0 minutes ago [-] This is a good indication they will remove the Air from the lineup this fall and replace it with a lower-end MacBook.

trumpeta 2 hours ago [-] I have a 2015 rMBP with an nVidia 750M, and recently I switched from an Apple Cinema Display to a 4K screen. At any of the graphical modes between the full 2x retina and 1x, the whole OS gets extremely sluggish to the point of being unusable. Some people from JetBrains have looked into it and concluded it was the font rendering in IDEA bringing down the entire system. Do you think this change can improve this? Because it certainly sounds like it.

reply

Arkanta 56 minutes ago [-] Sorry for the rant, but given the price of the JetBrains Toolbox, it infuriates me that they've been giving us HiDPI users the middle finger for a couple of years now.

First of all, switching to grayscale helps A LOT. Mojave does this system-wide, but the JVM (or IntelliJ, I don't really remember) has its own font rendering system, so you need to change that in IntelliJ directly.

I was hurt terribly by this issue. My problems have been solved by three things:

- An IDEA update that reworks the status bar's indeterminate progress bar, which maxed my CPU just for a dumb animation

- Going from an iGPU to a dGPU Mac (you seem to have one, so congrats; it would be way worse with only your iGPU)

- Using an experimental JVM from this thread: https://youtrack.jetbrains.com/issue/IDEA-144261

That said, JetBrains messed up with their 2018.1 release again. I'm mostly using Android Studio, which lags behind IntelliJ's versions. 3.1 merged the stupid progress bar fix, but I'm not looking forward to when 3.2 hits stable, with 2018.1 merged in.

Bottom line: there are multiple factors -- inefficiencies in JetBrains' custom OpenGL UI combined with Oracle's JVM bugs/inefficiencies -- that take down the whole system, but it's ultimately not only a font smoothing issue. JetBrains doesn't really care, or at least doesn't show that they care. They first said "we don't notice that", and then released some JVM patches in the comments of an issue. Of course, Oracle also shares part of the blame: none of this would have happened had they kept Apple's JVM Quartz support.

reply

davnn 2 hours ago [-] > Subpixel antialiasing is obnoxious to implement. Isn't it completely abstracted away from 99%+ of macOS development?

reply

jacobolus 2 hours ago [-] Yes, but if it dramatically increases complexity for all the lower levels of the rendering stack, it ends up creating extra work for those OS/framework developers, requiring separate versions of the code targeting different displays which must be maintained separately with different sets of bugs in each, preventing them from making large-scale architectural changes they want to make, restricting some types of significant optimizations, etc.

reply

sova 1 hour ago [-] Yeah but isn't it already a solved problem with 1x monitors? Who is implementing things multiple times for no reason? I'm glad in the future it will be easier for Apple because they just won't hire devs to do that part then, but does this imply that 1x monitors will simply look inferior no matter what is done with newer versions of OSX+?

reply

systoll 9 minutes ago [-] There's a broader change that is an improvement to the abstraction devs work with, which simplifies coding and results in more GPU acceleration. Essentially, this article: http://darknoon.com/2011/02/07/mac-ui-in-the-age-of-ios/ will be obsolete. If it's possible to realise those benefits while retaining subpixel antialiasing at all, it would entail a complete reimplementation. On the other side of things: light-on-dark subpixel antialiasing has always been a little wonky, and Apple disables it in most dark UIs. Without significant changes, dark mode would've been greyscale either way.

reply

rbanffy 1 hour ago [-] This complexity is why we can't have windows that spread between 1x and 2x monitors since the introduction of Retina displays. Ideally everything should be rendered (and cached) to the maximum resolution available (minimum trace widths considered) and then downsampled to the specific screens with subpixel data applied, but that's a lot of processing to do and battery life would suffer.

reply

brigade 38 minutes ago [-] Eh? Before Yosemite was released in 2014, one window could span between a 1x and 2x monitor. I forget if it was always rendered at 1x, or if it was rendered at 2x and scaled for the 1x, but yeah it looked like crap on one or the other. Actually I think it switched depending on which monitor had more of the window? But they only eliminated windows spanning across multiple monitors when they made spaces per-display.

reply

jacobolus 1 hour ago [-] It’s already a solved problem if you want to keep using the software you already wrote, and never do anything different. (Nobody is stopping people from running OS X 10.13 indefinitely on their 1x displays.) If you want to rearchitect your whole text / vector graphics rendering pipeline to take proper advantage of GPUs, then you are going to be much happier with your life if you don’t have to double your work to support legacy hardware. I would love to see Apple implement this kind of thing http://w3.impa.br/~diego/projects/GanEtAl14/

reply

tobr 2 hours ago [-] This and the document @gnachman posted have me intrigued. Are you saying that subpixel AA is not just like a multichannel mask? I would expect the amount of R/G/B foreground and background color for a given pixel to be the same regardless of what those colors actually are?

reply

jgh 1 hour ago [-] (Note: I haven't worked on subpixel AA for font rendering, but I have for something different.) The amount of color you apply to each subpixel has different levels of intensity depending on how deeply into the pixel the shape goes, so you would need to blend with whatever is behind/in front of it. Also, you can't just do something like write (128,0,0) to a pixel and leave it at that; you'll have a red pixel. It would need to be blended with what's already there.
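A rough numpy sketch of that blending step (my own illustration, not code from any shipping renderer): the coverage mask carries one value per channel, so compositing has to read back what's already in the framebuffer:

    import numpy as np

    # dst:      (H, W, 3) framebuffer contents
    # fg:       (3,) text color
    # coverage: (H, W, 3) per-subpixel coverage in [0, 1] -- note one value
    #           per channel, unlike a single ordinary alpha
    def blend_per_channel(dst, fg, coverage):
        return fg * coverage + dst * (1.0 - coverage)

    # Writing coverage straight into the framebuffer instead of blending is
    # exactly the "(128,0,0) gives you a red pixel" failure described above.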

reply

hn0 1 hour ago [-] I expect this would be the case with alpha compositing at least... Afaict subpixel rendering is supersampling the width by a multiple of 3, then taking the horizontal coordinate modulo 3 to map samples into RGB triplets, and dividing by the supersampling factor. I guess if kerning and position are also subpixel it might become more tricky to cache results efficiently.
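For what it's worth, that scheme fits in a few lines of numpy. A toy sketch, assuming an RGB-striped panel and ignoring the low-pass filtering real implementations add to tame color fringing:

    import numpy as np

    def subpixel_downsample(mask3x):
        # mask3x: (H, 3*W) glyph coverage rendered at 3x horizontal resolution.
        # Columns 3i, 3i+1, 3i+2 become the R, G, B coverage of output pixel i.
        h, w3 = mask3x.shape
        return mask3x.reshape(h, w3 // 3, 3)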

reply

xenadu02 1 hour ago [-] Subpixel-AA was not removed from macOS Mojave so the title is completely wrong anyway. This was covered at WWDC. It uses grayscale-AA now which is much easier to hardware accelerate and gives a more consistent experience across different types of displays.

reply

systoll 55 minutes ago [-] If it's greyscale, it's not subpixel antialiasing.

reply

iamaelephant 1 hour ago [-] Oh gee we didn't realise it was haaaaard.

reply

gnachman 3 hours ago [-] I released a build of iTerm2 last year that accidentally disabled subpixel AA and everyone flipped out. People definitely notice. But the reason I'm sad is that this wonderful hack was short-lived: https://docs.google.com/document/d/1vfBq6vg409Zky-IQ7ne-Yy7o... My pet theory is that macOS is going to pull in a bunch of iOS code, and iOS has never had subpixel AA. reply

saagarjha 23 minutes ago [-] Informative as always, George. > I released a build of iTerm2 last year that accidentally disabled subpixel AA and everyone flipped out. People definitely notice. On a Retina display at least I find that the most striking difference is that most text appears to have a lighter weight. Maybe that's the difference people are perceiving, rather than color fringing or whatever?

reply

1897235235 1 hour ago [-] They give justification for it in the video linked somewhere else in the comments. They say "it works better on a wider variety of displays" in reference to the grayscale approach. I am guessing they are probably switching to it across all of their products.

reply

mortenjorck 4 hours ago [-] This is the software equivalent of moving exclusively to USB-C. 18 months after the 2016 MacBook Pro, the vast majority of hardware currently in use still requires a dongle, and 18 months from Mojave's release, the vast majority of external monitors in use will still look blurry without subpixel AA. Someone at Apple really seems to believe in pain as a motivator. And it may work in the short term, but in the long term, it always drives customers to other platforms. reply

TeMPOraL 3 hours ago [-] > Someone at Apple really seems to believe in pain as a motivator. I think they simply believe they're a large enough player to make such calls unilaterally, and that the rest of the industry will follow. And I fear they might be right - such decisions enter into the calculations of more "commoditized" vendors like peripheral manufacturers when they design the next iterations of their products. I mean, that's why removing the headphone jack is such a big deal even for people not using iPhones. Not just because we sympathize with iPhone owners - but because Apple is big enough to make this stupidity a new standard everywhere.

reply

melling 3 hours ago [-] USB-C still hasn’t arrived as the definitive standard. Apple isn’t quite big enough to force an industry change in personal computers.

reply

sjwright 1 hour ago [-] Actually Apple is definitely big enough to force change—they've done it multiple times in the past. Apple helped push the original USB standard into common use with the original iMac. Obviously it would have happened eventually, but the iMac made the transition occur much faster than it would have otherwise. The problem with the USB-C standard is that USB-A now transcends far beyond the personal computer industry. It's now a power adapter standard for mobile phones and all sorts of gizmos, gadgets and accessories for the home and car. USB-A isn't going anywhere for a long time.

reply

chrischen 2 hours ago [-] If you want to use USB 3.1 Gen 2 speeds, you pretty much have to use Type C. The only other alternative is a full USB Type A connector or the awkward Micro B connector (which is rare).

reply

tinus_hn 2 hours ago [-] USB-C clearly is the way forward.

reply

carolosf 2 hours ago [-] Recently switched to the Manjaro Deepin operating system. Everything looks nice and quite polished once inside the OS. I have a great pull-down terminal. I can snap windows side by side. Almost everything is configurable without digging into configuration files.

It's a rolling release, so you don't get stuck on a particular version of a package until the next major release. You get the latest stable release of all software monthly or sooner using the built-in package manager. Most of my development tools I install using the package manager: git, docker, docker-compose, virtualbox, openjdk, node, go, python (don't use pip; use pacman to install Python packages), (sdkman for kotlin, gradle and maven), visual studio code, postman, (dbeaver as a GUI for SQL, Cassandra, mongo, big table, redis, neo4j). Download and install IntelliJ and Sublime.

WPS Office is better than Microsoft Office in my opinion. There's also the AUR repository, with community-built packages, which can also be enabled in the package manager settings. Everything is really stable. Built-in screen recorder and screenshots with annotations. You can map your keys to work like a Mac or a Windows computer, even on a Mac. The latest Gimp is actually really good (not yet available on ). There's also Krita for image editing, but it doesn't handle large PSD files well.

Overall the development experience is much less frustrating, and docker runs natively, which means everything is much faster. I've had a lot of luck running music production DAWs and VSTs and games under Wine. There's a great guide on the techonia website about how to dual boot Manjaro Deepin safely on a MacBook Pro.

The best part is I can have this environment on any computer, so I can have the latest Intel processor if I want and the OS won't make my company-issued external monitor look like crap. Unfortunately Apple runs more like a fashion company than a technology company, making people feel compelled to buy the latest product.

reply

sulam 2 hours ago [-] Attempting to take this idea seriously, I just catalogued the apps I have open right now that I know/suspect have no good equivalent on Linux:

* Safari - might seem unnecessary, but it's a lot of the web * Discord - I hear they have a Linux version in alpha, but...alpha * iTerm - best there is after years, why the hell can't Linux win at this?! * 1Password - guess I use the CLI? ugh. * iTunes - largely used for watching movies * Messages - This is surprisingly nice to have on the desktop * Microsoft Outlook - guess you are forced to use the web client * Xcode - heh

Stuff I didn't expect to see but was pleasantly surprised to see:

* Spotify * Hipchat * Zoom * Skype

I could live without Discord, Messages, Safari, iTunes and Outlook if I got motivated (Outlook I have to have, but the web client is halfway decent). That leaves Xcode, iTerm and 1Password as dealbreakers. We know one of those isn't going to change! I'm of course not including the apps I use less often but really like when I need them, like Lightroom, Photoshop, Illustrator, Keynote, Excel/Numbers, OmniGraffle, Sketch, and Things. I think Linux is safe from me, except as the system I keep next to my daily driver, which I use for Windows (when I need to look at something there) and dual-boot for Tensorflow.

reply

zeta0134 1 hour ago [-] Okay, I'll take a stab at this:

Safari - Either Google Chrome or Chromium (depending on how open you like things) can keep pace with Safari in terms of general usability and extension ecosystem. At worst I think you'd find the experience on par, but when I owned a Mac, I found Google Chrome to be more performant than Safari on the regular, and its Linux build is just as quick. A lot of Linux folks recommend Firefox as well, but I still am not convinced it beats Chrome in terms of out-of-the-box magical "Just Works" factor.

Discord - The client may be alpha, but it works just fine honestly. I've never had a problem. This is largely because the desktop app is Electron based, so it's running on a web browser anyway; it's very, very cross-platform friendly. Slack too, if that's your thing.

iTerm - There are MANY good terminal clients for Linux. I personally use Terminator, which I find has a good balance of power features and stability. I particularly enjoy its terminal broadcasting implementation, but I'm in the unusual position of working on many parallel servers in tandem during my day job, so this feature is very important to me.

iTunes - If it's just for movie watching, I've found VLC to be perfectly serviceable. MPlayer is also quite popular and there are many frontends.

Messages, Outlook - Here you've got a fair point. Outlook in particular is a pain point; there are workarounds to get it working in Thunderbird and Evolution, but they're just that - workarounds. Anything beyond basic email will need the web app; fortunately the web app isn't _terrible_, but yeah. Fair complaint.

Xcode - If your goal is to build Mac / iOS apps, there is no substitute, thanks to Apple's EULA. For everything else, pick your poison; there are more code editors and IDEs on Linux than one can count, many of them excellent. Personally I'm happy with Sublime Text (paid, worth every penny) and a Terminator window, but I hear VSCode is also excellent, which is odd considering that's a Microsoft endeavor. (Now if we could just get them to port Outlook...)

reply

saagarjha 17 minutes ago [-] > Google Chrome or Chromium (depending on how open you like things) can keep pace with Safari in terms of general usability and extension ecosystem

Google Chrome's extension ecosystem is undoubtedly far, far ahead of what Safari has. As for usability, and…

> I found Google Chrome to be more performant than Safari on the regular

People use Safari because it integrates so well with macOS, is performant, and doesn't kill resources (CPU, RAM, battery, you name it). No other browser comes close, even on other platforms. Apple's just spent so much time optimizing their browser that nobody else can match it (maybe Edge on Windows?).

> There are MANY good terminal clients for Linux.

iTerm just has everything and the kitchen sink. Like, it has some flaws, but it just does so many things that I haven't seen any other emulator do. Plus the author is really smart (he's at the top of the comments currently, if you want to check his stuff out).

> Xcode

Xcode can be surprisingly nice for C/C++ development, when it decides it wants to work.

reply

praseodym 31 minutes ago [-] I recently switched my primary machine to Linux and was pleasantly surprised to learn that 1Password has a web version (which does require a subscription) and the 1Password X extension for Chrome and Firefox to go with that.

reply

carolosf 2 hours ago [-] For passwords I use Enpass. It stores and encrypts passwords in your own personal cloud storage account, e.g. Dropbox or Google Drive. The desktop version is free; the mobile version has a once-off fee to buy the app.

For office I use WPS Office (Word/Excel/PowerPoint equivalents). It has tabs for documents and I have never had document compatibility issues (like I have had with LibreOffice). I believe Office 2013 runs under Wine too, if you really need Office. I prefer the web clients for email anyway. I use Wavebox to integrate web-based chat and email with the operating system for notifications.

Safari uses WebKit, so it's very similar to the engine Chrome uses - though Google forked it at some point. Deepin Terminal is an amazing terminal. Can't help you with Xcode unless you switch programming languages :)

It would be great if Adobe released Linux versions of their products. I don't do much video editing, but I think there are good options on Linux. Try it as a challenge - if you use all the above apps, I think you might be pleasantly surprised how far Linux has come. Most of these apps work on Mac as well, so you may find a few new good tools.

reply

urathai 1 hour ago [-] I have been using Discord on Ubuntu for some time without experiencing any problems.

reply

icebraining 1 hour ago [-] 1Password 4 runs fine on Wine (the Windows version, of course).

reply

noja 1 hour ago [-] Safari and iTunes?!

reply

IloveHN84 2 hours ago [-] The funny thing is that with macOS, they're NOT a big player at all..

reply

blackhaz 2 hours ago [-] With macOS, I believe, they still know they're the best. As much as I'd want to move away from Apple because of their recent maneuvers, there's nothing that comes close to macOS - it just works great. Other Unix-like operating systems will see more appreciation from users if they catch up on their GUIs. Once a BSD or Linux (I'd really want it to be a BSD!) gets a great UI, I'm switching.

reply

KerrickStaley 2 hours ago [-] GNOME is great, I personally like it as much as macOS. Try running it on Fedora. The latest Ubuntu also ships GNOME by default but I haven't tried it and I'm not sure if it's a tweaked version (if it's tweaked it's surely for the worse).

reply

noja 1 hour ago [-] Gnome really is great. It's very simple though. If you transition to a search based launcher then you won't go back.

reply

carolosf 1 hour ago [-] Try out Deepin - I prefer it to any gui. I hear good things about the latest KDE too.

reply

some_account 1 hour ago [-] First time I've heard about Deepin. Looked at a few vids and it looks good. Another sign that a lot of innovation comes from China currently... The MateBook X Pro looks great also.

reply

wingerlang 15 minutes ago [-] I use a USB-C only macbook pro and I absolutely love it, I would not want to go back to a laptop with a lot of IO. The main reason is that when I have to move somewhere, it is only one thing to remove (and then add back in) compared to the 5 minute shuffle I used to do. There have only been one or two situations in a year where it caused a slight annoyance.

reply

mehrdadn 3 hours ago [-] Microsoft has been having similar font rendering issues for years, so I guess Apple felt left out of this club. They put all this awesome work & research into creating ClearType, and then when Vista(?) happened they suddenly decided screw everything, let's switch back to grayscale antialiasing for no reason... now everything on the taskbar (or seemingly anything DWM-rendered) is grayscale-smoothed, which makes me want to tear my eyes out.

reply

Jasper_ 3 hours ago [-] > let's switch back to grayscale antialiasing for no reason... The actual reason is that switching to hardware-accelerated font compositing means that you lose out on subpixel antialiasing, because of complications related to blending. The classic way of doing subpixel text rendering means you need three distinct alpha channels, and most hardware only lets you supply just one. There are ways of hacking around this with dual-source blending, but such a feature was not supported at the time of Vista. In Windows 8 and up, DirectWrite can now support hardware-accelerated subpixel AA, and the OS does so by default.
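In other words: fixed-function blending bakes in a single alpha shared by all three channels, while subpixel text needs an independent alpha per channel, which is what dual-source blending (e.g. GL_SRC1_COLOR from OpenGL's ARB_blend_func_extended) finally exposes. A sketch of the two blend equations, in plain Python standing in for the GPU blend stage:

    # Single alpha: one coverage value 'a' shared by R, G and B.
    def blend_single_alpha(dst, src, a):
        return tuple(s * a + d * (1.0 - a) for s, d in zip(src, dst))

    # Subpixel text: an independent alpha per channel. Fixed-function
    # blending can't express this, hence the dual-source blending workaround.
    def blend_three_alphas(dst, src, a_rgb):
        return tuple(s * a + d * (1.0 - a) for s, d, a in zip(src, dst, a_rgb))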

reply

ygra 3 hours ago [-] UWP for the most part doesn't seem to use subpixel AA, though (Edge only does so in the content, but not the window chrome).

reply

0xFFC 3 hours ago [-] And that was the exact reason why I left Windows as a workstation for Ubuntu. It tortured my eyes every time I looked at a UWP app.

reply

fireattack 3 hours ago [-] > then when Vista(?) happened they suddenly decided screw everything What? ClearType was basically introduced in Vista and refined in Win 7. Win 8 is when they started to remove it for various applications (UWP, and some others) due to its poor performance on rotatable devices (tablets mainly).

reply

pix64 2 hours ago [-] ClearType definitely existed for XP

reply

Dylan16807 2 hours ago [-] USB-C is great. Putting only a single port on a laptop is stupid. Retina screens are great. Disabling the better font-rendering tech is stupid. And doubling the resolution doesn't make subpixel AA obsolete either.

reply

chmars 2 hours ago [-] I am actually quite happy with USB-C: charging has become easier since I can use the ports on both sides of my MacBook. Dongles are necessary; however, they have been necessary for as long as I can remember with Macs. Of course, a native HDMI port would still be great …

reply

katbyte 2 hours ago [-] MacBook Pro user of 8 years here; aside from ethernet/DVI at an office, not once have I used a dongle, and tbh those two dongles aren't all that bad because I would just leave them at my desk. I love my MagSafe charger + 2 Thunderbolt + 2 USB + HDMI ports and can't imagine life with fewer.

reply

chewz 2 hours ago [-] Just install ChromeOS on your Macbooks, forget about macOS and you will be fine.

reply

vesinisa 2 hours ago [-] Or any GNU/Linux distribution, which actually offers a superior developer experience in my opinion. Posts complaining about Macs constantly top HN. Usually the comments are filled with further moaning about how things have been so bad for so long with Apple etc., usually accompanied by threats of moving to another platform.

reply

EduardoBautista 7 minutes ago [-] I complain about the Mac from time to time. I just think the alternatives are much worse.

reply

crehn 3 hours ago [-] Not as long as the competition lags behind.

reply

SCdF 4 hours ago [-] I hope we are all somehow misunderstanding how terrible this is. The idea that they would downgrade the display support so that non-retina monitors -- and let's be serious, that is nearly all monitors that people dock into at work or at home -- are going to look worse in Mojave is almost too absurd to be true. I want to see this for myself. Does anyone know if you can simulate this now, without installing the beta? Or if you can install or somehow use the beta without nuking what you already have? Edit: I am still on Sierra, because the buzz around High Sierra made it sound like it was a bad idea to upgrade. This is sounding more and more like Mojave is a no-go as well. reply

djsumdog 2 hours ago [-] I fucking hated Lion when I got it on a new MacBook in 2012. I had been running Snow Leopard on my Hackintosh. I decided to buy a new Mac when I went overseas, and Lion removed Exposé and replaced it with that (still) terrible Mission Control garbage. I spent a few weeks trying to downgrade it to 10.6 and it was impossible.

Final Cut X was ... something else. Anyone who claims you just need to get used to it must not like having good tools. It's a terrible video editor and Apple killed off their only pro video tool.

Around that time I just ran Linux in a VM on my Mac and got into tiling window managers. I tried a bunch and liked a lot of them, but eventually settled on i3 (which I still use today). I ran OpenSUSE at work. Someone at work was throwing out an old IBM dual Xeon and that became my primary box, and I used my MacBook as a Windows gaming laptop, up until it was stolen: https://khanism.org/faith/fate-and-destiny/

At my most recent job, I asked for a PC laptop (Dell or HP) and they gave me a terrible MacBook. I run Gentoo on it: https://penguindreams.org/blog/linux-on-a-macbook-pro-14-3/ but I still hate the hardware, and it's more difficult to get Linux on a MacBook than any other x86 laptop I've owned, including two Dells, an HP and an MSI.

I don't want Apple to go away, because we do need competition in the market, but their operating system and hardware seriously need to stop being a pile of poop.

reply

sjwright 1 hour ago [-] I won't have anything bad said about FCPX. Yes, they botched the launch by releasing an unfinished, beta-quality first version. But every criticism of it was addressed in short order and it's solid as hell now. The only reason anyone continues to hate it today is because they're used to the terrible, horrible and stupid way that all the other NLEs work (including FCP6). It can be disconcerting to use FCPX because everything seems like a fancier version of iMovie, so it therefore must be not for "pros" or must be missing lots of "pro" features. But it's not. Terrible, horrible and stupid NLE design isn't a feature. FCPX is the first one to get it right, but the rusted-on Premiere / Avid "professionals" refuse to see it.

reply

saagarjha 13 minutes ago [-] > Or if you can install or somehow use the beta without nuking what you already have? Yes, I'm running the beta right now. If you're scared about your data, create a new partition or virtual machine and install macOS to that.

reply

spiralganglion 3 hours ago [-] You can simulate this by adding the `-webkit-font-smoothing: antialiased` CSS rule to any website when using Safari or Chrome. Lots of websites already do this though, so you can use the value `subpixel-antialiased` to revert the text to that mode instead. https://developer.mozilla.org/en-US/docs/Web/CSS/font-smooth To really appreciate the difference, I recommend turning on screen zoom with ctrl-scroll in the accessibility system prefs. You’ll see the effect of SPAA as colouring on the left and right edges of text.

reply

drinchev 2 hours ago [-] Wow, that's terrible. Would be really sad. Even on my non-retina MacBook's display it drastically reduces readability.

reply

speeq 40 minutes ago [-] I tried the beta when it came out and immediately downgraded because of the font, it looked worse even on the 5k iMac display.

reply

quipper 4 hours ago [-] You can simulate this by going to System Preferences-->General-->Use LCD Font Smoothing When Available and unchecking it. It also makes Retina-class devices look far worse. The fonts on my 5K iMac became much less readable after this change.

reply

saagarjha 14 minutes ago [-] This turns off smoothing across pixels, rather than subpixel antialiasing. Not the same thing.

reply

zaroth 4 hours ago [-] I do not think this is an accurate way to simulate the issue reported. This simulation will look worse than the actual issue, which is not present on 4K+ displays.

reply

comex 3 hours ago [-] Yeah – on Mojave that checkbox still exists and unchecking it still makes fonts look different (less bold).

reply

pmontra 3 hours ago [-] > that is nearly all monitors that people dock into at work or at home And 1080p projectors in conference rooms.

reply

nxc18 2 hours ago [-] That would be a dream. So many are still on 720p or lower, especially in schools etc.

reply

garmaine 4 hours ago [-] The quality of sub-pixel AA depends A LOT on the physical parameters of the display. It's basically the OS saying "I know that pixels aren't actually square, and I'll use that knowledge to better digitize the font's curves." That worked fine when everything was CRTs or the same technology of LCD displays. However, the faults of sub-pixel AA show through pretty strongly on the variety of modern displays (OLED vs AMOLED vs. ...), and with the rotation capability of tablet displays. Apple is in the process of switching LED technologies across their products. That said, if there's anyone in the position to exploit their knowledge of the ACTUAL physical display and its sub-pixel layout, it's Apple. I'd expect Windows or Android to drop sub-pixel AA... but iOS/macOS? I'm mystified.

reply

blt 2 hours ago [-] Subpixel AA was never about CRTs; they do not have a consistent mapping between logical pixels and the physical RGB holes in the shadow mask. It was always about LCDs.

reply

garmaine 51 minutes ago [-] Having actually been in the graphics industry before LCDs were prevalent: we did sub-pixel rendering back then. It just wasn't built in as a standard feature in OS font rendering until LCDs made it necessary.

reply

iforgotpassword 3 hours ago [-] I think Windows has a feature to configure the subpixel arrangement; what about macOS? Sure, they know about the internal screen, but not the one you plug in.

reply

ygra 3 hours ago [-] Don't screens tell the OS about their physical characteristics? At least the native resolution and the physical screen size (or dpi) are available to the OS somehow. I'd guess the sub-pixel arrangement might also be.

reply

iforgotpassword 19 minutes ago [-] That's EDID you're talking about, and AFAIK it doesn't support that. I'm on my Android now, but if anyone wants to fiddle around with what their screen provides, on Linux there are read-edid and parse-edid to query and interpret the raw data directly. It's not terribly much, since it's 256 bytes max IIRC.
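For the curious: the base EDID block does carry the display's physical size, but as far as I know there's no standard field for subpixel order, which is exactly the gap being discussed. A minimal sketch using the standard EDID 1.x byte offsets:

    def parse_edid_size(edid: bytes):
        # EDID 1.x blocks start with a fixed 8-byte header.
        assert edid[:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00", "not an EDID block"
        # Bytes 21 and 22: max horizontal/vertical image size in cm (0 = unknown).
        # There is no standard field for RGB vs BGR subpixel order, which is
        # why the OS can't reliably know it for an external monitor.
        return edid[21], edid[22]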

reply

pasquinelli 3 hours ago [-] i thought that subpixel aa smoothed by using the individual channels (r, g, b) of a pixel, not by taking into account the physical shape of a pixel.

reply

garmaine 50 minutes ago [-] Look at your screen with a magnifying glass. Then look at your phone's screen. And your TV screen.

reply

tzahola 3 hours ago [-] Yes. But how do you know if it’s RGB or BGR or GBR etc? Sometimes there’s a fourth subpixel for white too. On Windows there used to be a wizard where you had to choose which one looks better to let the system figure out your display’s subpixel arrangement.

reply

nickflood 1 hour ago [-] The wizard is still there in the latest Win10

reply

wodenokoto 3 hours ago [-] One guy's Dell monitor doesn't - subjectively - render input from a beta OS as prettily as it did the non-beta, and suddenly everybody is bringing out the pitchforks. If you read through the comments, you'll find this link buried, which shows a developer showcasing the difference in the new AA. Basically font-smoothing is grayscale-only in Mojave. * https://developer.apple.com/videos/play/wwdc2018/209/?time=1... If I take a screenshot of some text and zoom in, I can see the colours, just like in the video. If I zoom back to normal size and change the saturation to zero (greyscale only), I cannot see a change in sharpness of the text on my 2009 MacBook Pro. reply

saagarjha 10 minutes ago [-] I'm on the Mojave beta and the difference is noticeable–it's just not the end of the world as many here are complaining.

reply

1897235235 1 hour ago [-] >Basically font-smoothing is grayscale-only in Mojave. This is actually better for me. I wrote to Steve Jobs a long time ago and asked him if he could switch to grayscale font smoothing, because the tricks they use with colours don't actually work for people like me who are red-green colour blind. The end result was that text on Apple products looked terrible to me and I couldn't use any of their products until the retina displays came out. In Windows, you can use grayscale font smoothing easily. Anyway, he replied "we don't have any plans to do this" or something like that. Turns out I won in the end.

reply

ghusbands 5 hours ago [-] Subpixel AA only works when you know exactly what the output pixel layout is, when you aren't being scaled by some other technology and when you aren't showing the same thing on disparate displays, and it requires you to adapt whenever the screen rotates, which is common on tablet-style devices. Also, some people (like me) can see the discoloration it imbues. So I'll be glad to see it go. (Attempts to disable it tended to get varying results in various operating systems, but rarely consistently disabled it.) reply

Demiurge 4 hours ago [-] Really? Somehow that 'only' has included every LCD monitor I've used for more than a decade. Windows AA is more aggressive, but it is still a huge improvement over the jagged over-hinted font rendering you get without ClearType. No AA makes sense for DPI greater than about 300, but for anything less, with visible pixels, which is probably most non-4K monitors, I would bet most people prefer AA on. In general, people always prefer AA for 3D and all sorts of other rendering, at lower DPI. So, this is a typical Apple decision to abandon anyone who doesn't have the latest device from them.

reply

agildehaus 4 hours ago [-] > So, this is a typical Apple decision to abandon anyone who doesn't have the latest device from them. And those who bought a Macbook Air from an Apple Store yesterday.

reply

erikpukinskis 2 hours ago [-] What about today? Are they updating all the Airs with retina displays?

reply

eric-hu 3 hours ago [-] There's a two week return window for MacBooks.

reply

1897235235 1 hour ago [-] > but it is still a huge improvement over the jagged over-hinted font rendering you get without ClearType. There is no font smoothing at all (the jagged fonts), then there is grayscale font smoothing, and then there is coloured font smoothing. Apple is switching from coloured to grayscale, which, like the OP said, is great for me personally, as I can see all the colours in the other approach.

reply

badsectoracula 3 hours ago [-] Personally i prefer to disable antialiasing and, if possible, use bitmap fonts designed for clarity. Sadly almost nothing properly supports bitmap fonts these days (assuming it supports them in the first place - i think only GDI and X11 core do, but those are increasingly ignored by programs), and even when you can find support, finding good bitmap fonts is hard because it looks like most bitmap fonts are converted from outline fonts - but with jaggies. As for what i consider a good bitmap font: in Windows, MS Sans (the one you could find in Win9x; it was removed in later versions of Windows and aliased to MS Sans Serif, which is an outline font), plain Courier and Fixedsys. On Linux i like the console font that you get when booting Debian (not sure what it is called) as well as the Terminus fonts. On X11 i like the helv* fonts (usually registered as adobe helvetica - honestly, why is font registration on Xorg such a clusterfuck?) and some of the Fixed fonts (the smaller ones mainly; the larger are a bit too jaggy). On Mac... pretty much all of the original Macintosh fonts. I also like the "WarpSans" font that OS/2 (and eComStation) has; it is very compact and i like compact fonts (although i'm not a fan of a few details, like the w being a bit too jaggy).

reply

chungy 2 hours ago [-] > On Linux i like the console font that you get when booting Debian (not sure what is called) That's probably Unifont: http://unifoundry.com/unifont.html The exact font used is configurable but Debian uses Unifont by default.

reply

pvg 4 hours ago [-] This isn't 'no AA'.

reply

peterkelly 5 hours ago [-] I don't see why they couldn't make it configurable though

reply

ghusbands 5 hours ago [-] They did, but it was inconsistent [1]. Microsoft had the same problem [2]. In both cases, poorly-documented APIs meant many applications still used subpixel antialiasing. [1] https://apple.stackexchange.com/questions/110750/how-do-i-di... [2] https://social.technet.microsoft.com/Forums/en-US/768bd013-c...

reply

xucheng 4 hours ago [-] In addition to the MacBook Air and normal non-4K external displays, it seems that this will also affect projectors. Therefore, presentations given from macOS Mojave will only make your slides/keynote look bad and less professional. Moreover, the projector is not something you can upgrade. It is usually preinstalled in whatever venue you give your presentation in, and totally outside your control. reply

saagarjha 9 minutes ago [-] I doubt that many people will be able to tell the difference in your presentation if they're sitting twenty feet away.

reply

tandav 4 hours ago [-] MacBook Air. PDFs look very blurry and dirty since High Sierra (Mojave too), in Preview.app and in Safari. You can compare by opening a PDF in Chrome (it looks sharp in Chrome): https://imgur.com/a/TRpk1Oi Fonts on some websites look blurry too: https://imgur.com/a/o450Mlr reply

peterbecich 3 hours ago [-] Seems similar to this issue with reading PDFs in Emacs (predates High Sierra, though): https://github.com/politza/pdf-tools/issues/51

reply

colinjoy 3 hours ago [-] So are you saying the High Sierra PDF rendering catastrophe is not a bug but a precursor of this "feature"? If that is true and the blurred text is going to be the new normal, I do not see how any professional user looking at type all day long can stay on their platform.

reply

tandav 2 hours ago [-] Every time I open a PDF I think about installing El Capitan again. One thing that stops me is picture-in-picture mode. I use it heavily on YouTube and it is only in Sierra and later.

reply

galad87 2 hours ago [-] El Capitan had the same issue on my Mac. It was fixed in Sierra, and came back in High Sierra.

reply

galad87 2 hours ago [-] No, Preview's awful PDF rendering is unrelated to antialiasing.

reply

galad87 3 hours ago [-] Here's a comparison, so we can comment on the actual difference: http://www.framecompare.com/image-compare/screenshotcomparis... reply

pygy_ 29 minutes ago [-] Thanks. On my MacBook Air, the Mojave capture looks blockier. I won't update (I'm still on Sierra, actually; I usually wait one year before updating anyway).

reply

eddieh 3 hours ago [-] Or slide 125 from this PDF: https://devstreaming-cdn.apple.com/videos/wwdc/2018/209pydir... Which isn't a perfect comparison because of image compression, but at least on my retina MB they look identical to me. EDIT: You can also see the side-by-side at 28:03 in this video https://developer.apple.com/videos/play/wwdc2018/209

reply

kevingadd 3 hours ago [-] Unfortunately on my high-dpi monitor I can't get that page to display the screenshots at 1:1, so I can't see the difference due to filtering :/

reply

spiralganglion 3 hours ago [-] Enlarge the page using browser zoom. Subpixel AA will make the text look red on one edge and cyan on the other. Without it, the text is perfectly greyscale. This is visible even with extensive bilinear filtering.

reply

pvg 3 hours ago [-] You aren't going to capture the details of subpixel AA in a screenshot.

reply

spiralganglion 3 hours ago [-] You most certainly can. The effect is applied when shading the text, so it does appear in screenshots, as would any other in-software AA technique (AFAIK). You can test this yourself in the browser. For instance, toggle subpixel AA using the CSS -webkit-font-smoothing property in Safari. There is a clear visible difference that you can screenshot. On some websites with a light-on-dark theme, disabling subpixel AA with CSS was an important part of improving visual consistency between browsers. It's also important to explicitly control it when working with animations that change text scale or rotation. It's easier to see with an enlarged screenshot, but you can totally see the difference at 1x.
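An easy way to check any screenshot for subpixel AA is to look for pixels where the channels disagree: grayscale AA keeps R=G=B everywhere, subpixel AA doesn't. A quick sketch with Pillow and numpy (the filename is a placeholder):

    import numpy as np
    from PIL import Image

    img = np.asarray(Image.open("text_screenshot.png").convert("RGB"), dtype=float)
    # Grayscale AA keeps this spread at ~0; subpixel AA leaves nonzero
    # red/cyan fringes along the vertical edges of glyphs.
    spread = img.max(axis=2) - img.min(axis=2)
    print("max channel spread:", spread.max())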

reply

pvg 3 hours ago [-] I don't know what you can toggle with CSS but I'm pretty sure you can't replicate 'thing rendering a font with knowledge of your physical display' with 'bitmap made of pixels and some CSS'.

reply

spiralganglion 3 hours ago [-] The “knowledge” is applied when the text is rendered to the bitmap, not when the bitmap is pushed to the display. You’d be right if the effect happened quite late in the rendering process. It doesn’t — it happens really early.

reply

rl3 2 hours ago [-] Screenshots aren't very useful for meets-the-eye visual quality comparisons of subpixel AA unless your display has the same pixel arrangement and resolution as the display the screenshot was rendered on. Even then, there's probably other issues. I agree that they work for technical comparisons of rendered output.

reply

pvg 2 hours ago [-] Maybe I'm confused about something then but you seem to be telling me the subpixel AA is positionally independent. I can take the subpixel AA-ed bitmap and plop it in a page, open it on some other browser and OS and be certain everything is the same down to each coloured subpixel. You'd think every, say, vertical line would have the same colour artifacting, if that were the case. But they don't.

reply

blt 2 hours ago [-] It's visible in any Windows screenshot

reply

chx 4 hours ago [-] The intent is to provide users with a sense of pride and accomplishment for buying new devices. reply

Hamuko 3 hours ago [-] As long as it's not a monitor.

reply

valine 5 hours ago [-] Apple said during WWDC that the change was made to support a wider variety of display technologies and scaling modes. It’s an odd choice considering they still sell the non-retina MacBook Air. reply

wvenable 4 hours ago [-] It's kind of a funny definition of support.

reply

derefr 4 hours ago [-] Forcing app developers to live with the same lowest-common-denominator aesthetics that most users see will make them more likely to raise the bar for the aesthetics on lowest-common-denominator devices. It's the same reason that video game developers shouldn't be allowed to have powerful GPUs. They need to be testing how the game looks (and works) for most people, not for a group of people only slightly larger than themselves.

reply

infogulch 4 hours ago [-] Do you think most developers are using a non-retina screen? Compared to the general population I bet they skew very heavily towards newer, retina machines. Which means they'll barely notice anything. This is just plain lowering the bar, not raising the average.

reply

jedberg 3 hours ago [-] I'd say most developers are docking with an external screen, and there are very few external screens that are retina. So yeah, I'd say most developers are using a non-retina screen.

reply

cerberusss 3 hours ago [-] > I'd say most developers are using a non-retina screen I don't see this as a problem with 4K screens starting at 260 euros (~ US$ 306).

reply

jedberg 2 hours ago [-] 4k screens mostly run at 60hz, making them terrible for anything but watching movies. I don't like coding on a screen where I can see the refresh as I scroll.

reply

cerberusss 1 hour ago [-] Yes, that is currently a problem for those who want that 120 Hz goodness.

reply

Hamuko 3 hours ago [-] Why in the world would I want to use a 4K screen when I can get a 2560x1440 screen?

reply

Brian_K_White 2 hours ago [-] That's a silly question. More stuff fits in more pixels. Even if you have to compensate for readability by scaling fonts larger to make text the same size as it was on the lower dpi screen, the end result isn't actually equivalent to the lower dpi. It still works out that the higher dpi screen is on average more functional due to more things fitting on screen. Especially on dual/triple 27" external monitors on my desk. They are large and I am close to them, and a mere 1440 vertical pixels would be tolerable but crap when better is available. I usually have 4 to 10 terminal windows open where I want to see as much code and documentation and log output as possible, not a movie.

reply

djsumdog 2 hours ago [-] Hmm, I can't reply to Symbiote for some reason. But yes, 2560x1440 screens and other wide variants are really preferred by gamers or people who want high refresh rates. 4K is limited to 60Hz unless you use DisplayPort 1.4 and some very new monitors. There are quite a few users who prefer the wider-format screens for a variety of reasons.

reply

Symbiote 2 hours ago [-] A 4K screen is 3840 × 2160, why would you not prefer that?

reply

Dylan16807 2 hours ago [-] Because "retina" can imply pixel doubling, making it more like an extra-crisp 1920x1080. The typical 1440p screen is nice and big, and replacing it with such a setup at the same physical size would needlessly cut down on what you can fit. If you could trust non-integer scaling to work properly then 4K would resoundingly beat 1440, but that's a pipe dream right now. You can get the best of both with a 5K-6K screen, but that's currently a very expensive price point not on most people's radar.

reply

djsumdog 2 hours ago [-] If you work as a dev in a company, they're not going to be issuing 4k screens.

reply

saagarjha 6 minutes ago [-] I know of many companies that issue their developers 4K screens, so…

cerberusss 1 hour ago [-] So get your own monitor. I understand that many employees decide not to pay for their own equipment, to be sure. However, it's still a decision.

reply

djsumdog 2 hours ago [-] I don't think most game shops do. Their devs typically are constantly comparing rendering on mid-range nVidia and AMD cards, as well as Intel integrated. The indie titles have to do so because they know their games go for $9~$15 and they need to look consistent, or at least decent, on a lot of off-the-shelf laptops. Triple-A titles can afford huge testing teams that take on various Intel/AMD/nVidia cards and report consistency issues. Sure, if the title has a huge AMD or nVidia logo when it starts, it's going to use a lot of specific functionality from that card, whatever those shops pay them specifically to advertise. But devs need to ensure titles at least look okay and run at at least 60fps to get the largest sales audience.

reply

JackCh 4 hours ago [-] Are they forcing app developers to not use retina screens?

reply

dragonshed 4 hours ago [-] I still use a non-retina 13-inch from 2012, as well as an ultrawide monitor (3440x1440), both of which I apparently won't be using with macOS Mojave. I can't help but think this is just another in a long line of paper cuts endured by professional users, causing them to switch to a Windows machine. reply

zbraniecki 4 hours ago [-] Or you know... Linux. Whatever. It's 2018.

reply

glhaynes 5 hours ago [-] This seems certain to be related to integration of UIKit-on-macOS ("Marzipan") apps. (UIKit has never had subpixel antialiasing.) reply

mpweiher 3 hours ago [-] The reason being that it tends to want to push pre-rendered bitmaps around, and sub-pixel anti-aliased pre-rendered bitmaps don't work well under rotation...

reply

xenadu02 1 hour ago [-] No.

reply

ghostcluster 4 hours ago [-] High Sierra's Metal rewrite of windowserver still has serious bugs nearly a year after shipping, even on a brand new MacBook Pro. They managed to ship 10.13 with bugs like this that made it into the Ars Technica review: http://cdn.arstechnica.net/wp-content/uploads/2017/09/Sep-23... reply

hyko 1 hour ago [-] Subpixel font rendering was a cool hack back in the day, but it's no match for high dpi monitors. Personally I could never get past the color fringing artefacts. Bill Hill himself talked about it as an interim measure to get people reading on screen. Here he is talking about the original implementation of ClearType: https://channel9.msdn.com/Blogs/TheChannel9Team/Bill-Hill-Ho... Joel Spolsky on the different sub-pixel AA approaches taken by Apple and Microsoft: https://www.joelonsoftware.com/2007/06/12/font-smoothing-ant... reply

quipper 1 hour ago [-] I just installed the Mojave beta on my Macbook Pro. The fonts are just as blurry as if I toggle off "Use LCD font smoothing when available" in High Sierra. So the contention that it won't be the same in Mojave is disproven. Even on Retina displays, disabling sub-pixel anti-aliasing leads to blurry, indistinct fonts. reply

saagarjha 4 minutes ago [-] Mojave doesn't ship with subpixel antialiasing. The "LCD font smoothing" checkbox disables greyscale smoothing.

reply

ivirshup 2 hours ago [-] Am I misreading this thread, or was this one person's complaint on reddit about beta software (with a single commenter saying they've seen the same behavior)? Are there any more cases, or could this be a bug? Cause it seems like a bug to me. reply

mpweiher 2 hours ago [-] The change was also announced at WWDC. https://developer.apple.com/videos/play/wwdc2018/209/?time=1...

reply

some_account 1 hour ago [-] Apple strikes again... they recently destroyed having two external displays with a recent update, and now they mess up single-screen external displays too. I have a MacBook Air connected to a 2560x1440 screen and fonts look good. I was looking forward to Mojave's dark mode. Now I'm afraid to install Mojave. I understand that Apple wants to simplify their code, but they can't just remove things before having an alternate solution to a problem that exists. Font rendering is really super important... I really hope they reconsider. reply

blt 2 hours ago [-] I never upgraded past Mavericks. It seems that the OS has been going downhill. Thinking about installing Ubuntu on my MacBook. reply

quipper 4 hours ago [-] Signed up for HN (again) just to vent about this terrible news. This is a change that will be hard for me to live with, and I will probably end up selling all my Apple gear. This is the same change/downgrade that occurs if you go to System Preferences-->General-->Use LCD Font Smoothing When Available. I did this and the fonts on my 5K iMac display looked horrible. Just atrocious. My plan is to not upgrade to Mojave, and then within a year or so sell all my Apple gear and move back to Linux. I don't understand Apple's thinking, but I believe a lot of people will do the same. reply

jen729w 4 hours ago [-] > but I believe a lot of people will do the same No they won’t. People don’t notice this stuff. I pointed out the weird font smoothing issue in Finder to my best mate – a professional photographer – and even then she barely had any idea what I was on about.

reply

sod 1 hour ago [-] They don't remove font smoothing. They switch it from subpixel to grayscale only. Technically subpixel should produce better results than grayscale. But IMO the subpixel antialiasing in macOS looks horrible, so you might not even see the difference; it looks exactly as horrible as before. I'd be more outraged if Windows removed ClearType, because their subpixel font antialiasing actually works and is configurable.

reply

quipper 48 minutes ago [-] Greyscale anti-aliasing is far inferior to sub-pixel. The fonts look much worse. BTW, I did install Mojave beta and the fonts look just terrible even on a Retina display. Shockingly bad. I can't believe Apple is doing this. I just bought a new 5K iMac a few months ago. I wish it were still in the return period...but in other news, I am now selling all of my Apple equipment as it's pointless to have it. Without the great fonts I purchased it for, it's so much junk to me.

reply

pixard 4 hours ago [-] This doesn't affect retina devices, so your 5K iMac will look the same as it does now...

reply

quipper 4 hours ago [-] It does affect Retina devices. That's the whole point. I made the same change that Apple is going to make and the fonts looked far worse (almost unreadable). You can observe the same if you have an Apple device by going to: System Preferences-->General-->Use LCD Font Smoothing When Available. Apple is removing sub-pixel anti-aliasing for all devices, not just non-Retina ones.

reply

eknkc 3 hours ago [-] Try the beta before venting. It’s not the same thing.

reply

quipper 1 hour ago [-] I installed Mojave beta on my MacBook Pro. It is the same thing. The fonts look identical to toggling off the setting "Use LCD font smoothing when available" in High Sierra. And by the same thing, I mean they look very bad in Mojave.

reply

quipper 3 hours ago [-] Downloading the beta now, but have seen screenshots of both (here and elsewhere).

reply

read_if_gay_ 1 hour ago [-] I've been on the beta for a couple days (using a 2016 12" MacBook) and I haven't noticed anything. There's a visible difference when I disable LCD font smoothing in settings, but not much changes besides fonts appearing somewhat thinner.

reply

galad87 3 hours ago [-] http://www.framecompare.com/image-compare/screenshotcomparis...

reply

quipper 3 hours ago [-] Thanks. The fonts are noticeably blurrier/less defined in the Mojave capture.

reply

galad87 4 hours ago [-] No it’s not the same as disabling lcd font smoothing. Try the public beta if you don’t believe it.

reply

quipper 1 hour ago [-] Mojave beta installed, and the fonts look equally atrocious. They are blurry, indistinct...and just bad. One of the reasons I bought a 5K iMac (three of them, actually) was to have great fonts. I am beyond angry that I will now have to sell them.

reply

galad87 5 minutes ago [-] Can you post some screenshots? I find it hard to believe that fonts on a retina display can be blurry and indistinct, when the antialiased text weight is almost the same.

reply

quipper 39 minutes ago [-] In other news, how is Linux's sub-pixel anti-aliasing lately? I haven't used it in years, and never on a 4K or above display. Since I will now be switching off Mac, Linux is looking likely again. It will probably be two 27" 4K monitors. reply

livebsd 4 minutes ago [-] As of "lately", in the sense of "in the last 10 years": it was always great. Freetype has always had great rendering, on par with and often superior to both ClearType and OS X to my eyes. I attribute the bad perceived performance of Linux font rendering to the lack of good and/or commercial fonts. The default configuration in most distributions is decent, but with a little tuning everything can be changed to your taste. I've run with grayscale AA since the beginning because I find the color fringing annoying. I also used to like the bytecode hinter, but in recent years, with my laptop having 120+ dpi, I find the freetype autohinter to be actually superior, as it better preserves the letterforms and provides sharper results even without subpixel AA. The settings can also be tailored per-font, and apply system-wide, with the exception of some stupid QML and Electron apps.

reply

pibefision 31 minutes ago [-] Could this be a measure to combat the hackintosh installed base? reply

saagarjha 3 minutes ago [-] How so?

reply

update 4 hours ago [-] So what's the solution here? Buy a new monitor if we want to use up-to-date versions of macOS? Anyone who uses Apple's FCPX or Logic knows we'll have to update to Mojave, because eventually we'll try to open the latest version of said software and be told "You must be running Mojave to open this". I'd really ditch Apple because of stunts like this, but I really love Logic and FCPX, so I'm stuck with it. reply

kuon 1 hour ago [-] That must be personal preference, but I always preferred grayscale AA. reply

copperx 30 minutes ago [-] When's the last time you got an eye exam?

reply

PakG1 4 hours ago [-] Oh my goodness. And here I was thinking that I'd love the next macOS due to its alleged focus on stability. What the heck.... reply

the_mitsuhiko 4 hours ago [-] Not cool at all. I don’t want a high dpi external monitor because it’s too laggy. Very annoying change. reply

rbanffy 1 hour ago [-] I'm betting Apple will launch a family of Retina desktop monitors. Also, remember it's still beta software. People do public betas to surface issues like these. reply

cJ0th 2 hours ago [-] I so knew it! I even put a reminder in my calendar some time ago (not knowing what the next update holds) to not upgrade until I know for sure that my experience won't get worse. reply

hn0 1 hour ago [-] I must be the odd one out in that I actually prefer my fonts to look “soft”? reply

frou_dh 3 hours ago [-] Nothing like some frothy sight-unseen outrage on a Saturday morning. reply

mark-r 5 hours ago [-] I wonder what panel technologies this is supposed to improve? The only one I can think of is a portrait rotated LCD. reply

ken 4 hours ago [-] PenTile, RGBW Quad, etc -- basically everything except plain old RGB stripes. Of all the possible display layouts, SPAA really only works with one (or it has to be redesigned for each, which realistically will never happen). The stats I've seen are that most Mac users, as of this year, are on @2x displays.

reply

Lio 1 hour ago [-] The real question is whether they're on @2x displays exclusively. I have a Retina MacBook Pro, but I mostly use it plugged into a 27" Dell 1440p monitor. Every office I've worked in that provided Macs also provided a 1x monitor of some kind. Never 4K or 5K.

reply

Demiurge 5 hours ago [-] It works (worked :/) just fine when switching to portrait mode. I use a secondary Dell monitor for that.

reply

fireattack 3 hours ago [-] It doesn't surprise me, since even Microsoft is slowly dropping ClearType. reply

jbverschoor 2 hours ago [-] Since retina I disabled it myself reply

Angostura 3 hours ago [-] It’s a beta. Report it as a bug. reply

Lio 1 hour ago [-] It's not a bug. It was announced as a change at WWDC. It's believed to be a consequence of the move to merge iOS code into macOS. iOS doesn't support sub-pixel antialiasing.

reply

deevolution 5 hours ago [-] Planned obsolescence reply
