
Re: kbd vs read-key-sequence

From: Richard Stallman
Subject: Re: kbd vs read-key-sequence
Date: Sun, 11 Mar 2007 16:01:26 -0400

    Off the top of my head:
    - saving may change the char thanks to unify-8859-on-encoding
      (e.g. you copy an iso-8859-15 char to your .emacs which you then save in

Is there a procedure we can recommend for the user to avoid this?

    - the coding-system may not be properly detected, especially if you
      configure your auto-detection in your .emacs: the configuration will apply
      to all files you open but not to the loading of .emacs

Can you avoid this by specifying the coding-system explicitly in .emacs?
We could recommend that users do so.
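One established way to do that is a file-local coding cookie on the first
line of .emacs, which pins the coding system for that file regardless of any
auto-detection configured later in the file.  A minimal sketch (iso-8859-15
is just an example; substitute the coding system the file is actually saved
in):

;; -*- coding: iso-8859-15 -*-
;; ... rest of .emacs follows; Emacs decodes this file with the
;; declared coding system instead of guessing.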

    > Anyway, if you want to bind a character with modifiers, you can just use
    > a construct in .emacs that applies the modifier to the desired base
    > character, such as \M- in a string, or (meta CHAR).

    > Is there a case where that doesn't work?
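The two constructs mentioned above might look like this in .emacs
(`my-command' is a placeholder, not a real command):

(global-set-key "\M-q" 'my-command)        ; \M- inside a string, ASCII base char
(global-set-key [(meta ?é)] 'my-command)   ; (meta CHAR) form, usable for non-ASCII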

    Because the way the event is decoded through read-key-sequence is not
    necessarily the same.  E.g. while ?<encoded-é> may get turned into ?é, it
    may be the case that ?\M-<encoded-é> stays unchanged.

How does that happen?  Maybe that is a bug; if so, we should fix it.

We could recommend that people write (meta ?<encoded-é>).
That would eliminate this particular problem, right?

These do not make things perfect, but I think they might be enough
to enable users to get reliable results.  What do you think?
