bug-gnu-emacs

bug#25890: `color-values` gives wrong value


From: Eli Zaretskii
Subject: bug#25890: `color-values` gives wrong value
Date: Fri, 03 Mar 2017 16:16:47 +0200

> Date: Tue, 28 Feb 2017 15:54:46 -0800 (PST)
> From: Drew Adams <drew.adams@oracle.com>
> Cc: 25890@debbugs.gnu.org, Rasmus <rasmus@gmx.us>
> 
> > >  (defun color-rgb-to-hex  (red green blue)
> > >    "Return hexadecimal notation for the color RED GREEN BLUE.
> > >  RED, GREEN, and BLUE should be numbers between 0.0 and 1.0, inclusive."
> > > -  (format "#%02x%02x%02x"
> > > -          (* red 255) (* green 255) (* blue 255)))
> > > +  (format "#%04x%04x%04x"
> > > +          (* red 65535) (* green 65535) (* blue 65535)))
> 
> The function should accept an optional arg NB-DIGITS, which
> specifies the number of hex digits for each of R, G, B.  And
> yes, it should default to 4 digits: #RRRRGGGGBBBB.
> 
> (That's what the original function in hexrgb.el does, from
> which color.el was supposedly derived.)

The code in hexrgb.el produces strange results in this regard (e.g.,
it produces "#FFFFFFFFE0E0" instead of "#FFFFFFFFE000" for the color
mentioned by the OP).  I believe that's because it interprets the
conversion between 2 and 4 hex digits incorrectly: the 2 hex digits
are the _most_ significant byte of the 4-digit version, not the least
significant one.
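To make the difference concrete, here is a minimal sketch of the two
readings, using the #xE0 component of the OP's color (the expressions
are only illustrative, not code taken from color.el or hexrgb.el):

  ;; Treat the 2 hex digits as the most significant byte and zero-fill
  ;; the low byte -- this gives the value Emacs reports:
  (format "#%04x" (ash #xE0 8))      ; => "#e000"

  ;; Replicate the byte into the low half instead -- this matches the
  ;; "E0E0" that hexrgb.el produces:
  (format "#%04x" (* #xE0 #x0101))   ; => "#e0e0"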

But I did add such an optional argument to color-rgb-to-hex, with the
difference that its value can only be 4 or 2, as I see no reason for
anyone to want a 12-bit-per-component color notation.
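For reference, a sketch of roughly what the change looks like (the
argument name and docstring below are approximate, not necessarily the
committed text):

(defun color-rgb-to-hex (red green blue &optional digits-per-component)
  "Return hexadecimal #RGB notation for the color RED GREEN BLUE.
RED, GREEN, and BLUE should be numbers between 0.0 and 1.0, inclusive.
Optional argument DIGITS-PER-COMPONENT can be either 4 (the default)
or 2; use the latter if you need a 24-bit specification of a color."
  (or digits-per-component (setq digits-per-component 4))
  (let ((maxval (if (= digits-per-component 2) 255 65535))
        (fmt (if (= digits-per-component 2) "%02x" "%04x")))
    (format (concat "#" fmt fmt fmt)
            (* red maxval) (* green maxval) (* blue maxval))))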

Thanks.