Posts Tagged ‘browsers’

Unicode, Browsers, Python, and Kvetching

May 28, 2008

My HTML/unicode character utility is now in a reasonably usable state. I ended up devoting rather more effort to it than I had originally planned, especially given that there are other perfectly useful such things out there. But once you start tweaking, it’s hard to stop. There are now many wonderful subtleties there that no one but me will ever notice.

What gave me the most grief was handling characters outside the Basic Multilingual Plane, i.e. those with codes above 0xFFFF. That’s hardly surprising. And I suppose it shouldn’t be surprising that browsers handle them so inconsistently. All four major browsers try to display BMP characters using whatever fonts are installed, but not so for the higher planes. In detail:

  • Firefox makes a valiant effort to display them, using whatever installed fonts it can find. It’s fairly inconsistent about which ones it uses, though.
  • IE7 and Opera make no effort to find fonts with the appropriate characters. They do work if you specify an appropriate font.
  • Safari (on Windows) doesn’t display them even if you specify a font. This does not further endear Safari to me.

Oh, and on a couple of XP machines I had to reinstall Cambria Math (really useful for, you know, math) to get the browsers to find it. There must be something odd about how the Office 2007 compatibility pack installed its fonts the first time (I assume that’s how they got there).

On the server side, I knew I would have to do some surrogate-pair processing myself, and that didn’t bother me. Finding character names and the like was more annoying. I was delighted with Python’s unicodedata library until I started trying to get the supplementary planes to work. The library restricts itself to the BMP, presumably because Python unicode strings have 16-bit characters. The reason for the restriction is somewhat obscure to me—the library’s functions could presumably work either with single characters or with surrogate pairs; and I’m pretty sure all the data is actually there (the \N{} escape in string literals works for supplementary-plane characters, for example).
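For reference, the surrogate-pair processing I mean is just a little shifting and masking—a sketch (the function name is mine, not from any library):

```python
def to_surrogate_pair(cp):
    """Split a supplementary-plane code point (above 0xFFFF) into
    a UTF-16 high/low surrogate pair."""
    assert cp > 0xFFFF
    cp -= 0x10000
    high = 0xD800 + (cp >> 10)   # top 10 bits
    low = 0xDC00 + (cp & 0x3FF)  # bottom 10 bits
    return high, low

# U+1D49E MATHEMATICAL SCRIPT CAPITAL C
high, low = to_surrogate_pair(0x1D49E)
print(hex(high), hex(low))  # 0xd835 0xdc9e
```

Going the other direction is the same arithmetic in reverse: `0x10000 + ((high - 0xD800) << 10) + (low - 0xDC00)`.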

The whole unicode range ought to work in wide builds of Python, but I have no idea whether that would work with Django and apache/mod_python and Webfaction, and I’m far too lazy to try. So I processed the raw unicode data into my own half-assed extended unicode library, basically just a ginormous dict with a couple of functions to extract what I want (so far just names and categories, with more to come if I ever get around to it).
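Building that ginormous dict is mostly a matter of splitting the raw UnicodeData.txt on semicolons—the first three fields are code point, name, and general category. A simplified sketch (not my actual code), shown here with a couple of inline sample lines rather than the real file:

```python
def build_char_table(lines):
    """Parse UnicodeData.txt-style lines into {codepoint: (name, category)}.
    Fields are semicolon-separated: code;name;general_category;..."""
    table = {}
    for line in lines:
        fields = line.strip().split(';')
        if len(fields) < 3 or not fields[0]:
            continue
        cp = int(fields[0], 16)  # code points are hex
        table[cp] = (fields[1], fields[2])
    return table

sample = [
    "0041;LATIN CAPITAL LETTER A;Lu;0;L;;;;;N;;;;0061;",
    "1D49E;MATHEMATICAL SCRIPT CAPITAL C;Lu;0;L;;;;;N;;;;;",
]
table = build_char_table(sample)
print(table[0x1D49E])  # ('MATHEMATICAL SCRIPT CAPITAL C', 'Lu')
```

The real file has wrinkles this ignores (ranges marked with “First”/“Last” entries, for one), but the basic shape is just this.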

Safari

April 12, 2008

For no particular reason I downloaded Safari for Windows and OMG MY EYES! THE GOGGLES, THEY DO NOTHING!

So it turns out I don’t much like the blurry eyestrain-inducing font rendering. I’m pretty late to the party here but I’ll pile on anyway.

It seems that Safari ignores Windows’ native font rendering—ClearType, on my machine—and uses Apple’s own, Quartz. I would have expected them to be more or less the same, as they both mostly rely on subpixel rendering. But it turns out Quartz ignores TrueType hinting, relying only on the subpixel anti-aliasing, which is to say, blurrification. It’s especially noticeable in the vertical direction, where subpixels help not at all.

Apparently—and I don’t know how authoritative this is—Apple in its wisdom chose to ignore hinting in order to Preserve Design Intent. TrueType hints force features of glyphs to the pixel grid, meaning that some strokes get thinner and some thicker, that relative weights of bold and regular styles are distorted, that curves flatten out, and that spacing can be a bit wonky. Relying solely on antialiasing does mitigate those problems, at the cost of blurring (and chromatic fringing, with subpixel renderers).

Windows’ ClearType does respect hinting, at least to some extent. Much as it pains me to say it, I’m with Microsoft on this one. [Caveat: ClearType’s effectiveness depends enormously on the monitor. It looks great on the relatively high-res laptop I’m using now. I’ve tried it on other monitors (yes, LCD ones) where it was unbearable. I imagine it depends heavily on the user as well.]

Now preserving design intent does make an enormous amount of sense in some contexts. If you’re actually trying to lay out a page for printing, font distortion is bad, and a bit of blurring is no biggie. Were I a graphic designer I’m sure I’d demand the better approximation of font weights and glyph shapes and positions and not care about the blurring.

But I’m not a graphic designer, I’m just some guy trying to read stuff on the web. I care much more about not getting a headache than about the aesthetic qualities of lower-case g’s descender. I can read slightly malformed but clear fonts much more easily than blurry ones.

Even from a philosophical point of view “design intent” is a tricky concept. TrueType hints are design intent. A lot of effort went into hinting Georgia and Verdana and the other core web fonts. So does the real intent reside entirely in the glyph outlines, or in the hinting as well? Well, really it sort of depends. Print and screen are different contexts (they will converge as screen resolutions get higher, but they haven’t yet), and good designers design for both. Apple, it seems to me, does not respect the latter. For a desktop publishing program that would make sense. But we’re talking about a web browser here.

[Yes, there are lots of crappily hinted fonts out there, including some that are beautiful in print. With those Quartz can look much nicer than ClearType, even to me. But a well-designed website really should not use those fonts.]

It may be that were I a Mac person I would be used to Apple’s rendering and find Windows’ godawful and primitive. But I’m not. Does Apple want to show us benighted Windows drudges The Light? If so, it’s not working. I want Windows apps to act like Windows apps—consistency is important. Safari wants to act like a Mac app, both in rendering and in a few other ways, like how it displays its scrollbars (yes, that’s pretty trivial). The weird part is that Apple understands better than anyone else the importance of strict user interface guidelines. Do they care about them only when the guidelines are their own?