
From: Ilya Zakharevich
Subject: bug#19993: 25.0.50; Unicode fonts defective on Windows
Date: Tue, 10 Mar 2015 13:32:24 -0700
User-agent: Mutt/1.5.21 (2010-09-15)

On Tue, Mar 10, 2015 at 07:41:39PM +0200, Eli Zaretskii wrote:
> > > Choosing the first font which has the subset containing a
> > > character “identified” is not a reasonable thing to do.
> > 
> > See my other messages: I'm not sure we actually do that.  It's
> > possible that the subrange test is used only as a filter, after we
> > already identified the candidate fonts.
> 
> In fact, it's almost certainly a filter: at least my reading of
> w32font.c:font_matches_spec is that if the font spec specifies a
> script, then fonts that do NOT have the corresponding subrange bit set
> are rejected.
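
A quick way to observe this filter from Lisp, assuming I read the
font API correctly (`list-fonts' returns the font entities that
survive matching, which on w32 should include the subrange test
described above):

    ;; List the fonts Emacs considers usable for the `mathematical'
    ;; script; on w32, fonts whose Unicode-subrange bits do not cover
    ;; the script should be filtered out of this list.
    (mapcar #'font-xlfd-name
            (list-fonts (font-spec :script 'mathematical)))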

So back to the drawing board:
   • on your system
   • with Symbola installed
   • with the default configuration
I presume that Math Alphabeticals are not shown (but ARE shown when
Symbola is EXPLICITLY marked as the default font for them).
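
For concreteness, by “EXPLICITLY marked” I mean the usual fontset
incantation; this is my shorthand, not something quoted from the
thread:

    ;; Use Symbola for the whole Mathematical Alphanumeric Symbols
    ;; block (U+1D400..U+1D7FF) in the default fontset.
    (set-fontset-font t '(#x1D400 . #x1D7FF) "Symbola")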

             WHY?

With my conjecture, the explanation would be that a certain other
font on the system has the Math Alphabeticals subrange “identified”
(i.e., has the corresponding subset bit set), so Emacs chooses that
font; but in reality this font does not support the whole subrange,
so the needed glyphs are missing.  (For example, DejaVu covers only
the Monospaced range of that block, and nothing else.  [Well, the
glyphs in the Monospaced range are totally broken, but that is
irrelevant for the current discussion!])
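
One way to test this conjecture on an affected system (assuming a GUI
frame; `my-font-at-point' is just an ad hoc name, not an existing
command): put point on one of the missing characters and ask which
font Emacs actually picked.  `C-u C-x =' (describe-char) reports the
same information.

    (defun my-font-at-point ()
      "Echo the name of the font used for the character at point."
      (interactive)
      (let ((font (font-at (point))))
        (message "%s" (if font (font-xlfd-name font) "no font found"))))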

Without my conjecture, what would be your explanation?

Ilya




