
Re: [vile] vile-9.6j.patch.gz


From: Thomas Dickey
Subject: Re: [vile] vile-9.6j.patch.gz
Date: Wed, 26 Mar 2008 18:09:32 -0400 (EDT)

On Wed, 26 Mar 2008, Paul van Tilburg wrote:

> On Tue, Mar 25, 2008 at 08:01:17PM -0400, Thomas Dickey wrote:
>>  ------------------------------------------------------------------------------
>>
>>  20080325
>>         + correct bytes versus characters for inserting/replacing text in
>>           utf-8 buffers (reports by Chris Green, Paul Van Tilburg).
>
> Hmm, nothing changed for me.. characters with accents still get inserted
> as \xC3.
> Anything I can test?

I'm puzzled, since it's working for me (I tried nl_NL.utf8 here).
So I'm either building it differently, or running it differently.

Is this in insert-mode?  With some specific characters?  Perhaps the
screen driver is a factor: if vile is configured --with-screen=curses
or --with-screen=ncurses, those drivers are not UTF-8 capable, though
--with-screen=ncursesw should work.  I'm generally using the default
terminfo/termcap driver.
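
For reference, a rebuild along these lines should pick up the wide-curses
driver (only the relevant option is shown; adjust the rest to your own
build):

    ./configure --with-screen=ncursesw
    make

Leaving out --with-screen altogether gives the default terminfo/termcap
driver.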

I assume you're setting file-encoding=utf-8, as mentioned last week.
Without a specific setting, vile reads characters according to the
locale (with or without iconv) and applies them to the buffers
according to _their_ file-encoding.
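
In case it helps, this is the kind of thing I mean, from the colon
prompt (just a sketch):

    :set file-encoding=utf-8

which pins the buffer's encoding explicitly rather than leaving it to
be deduced from the locale or the buffer's contents.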

With xterm, I "just" use a meta-whatever to get upper-128 codes
(which uxterm translates into UTF-8).  A quick check using vile's
^VxNN seems to be working.
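
For instance, the sort of spot-check I mean (the codepoint is only an
example):

    ^Vxe9    should come out as a single e-acute character in a
             utf-8 buffer, not as a bare \xC3 byte (the symptom
             you're describing).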

The only odd thing that I'm seeing at the moment is that, when editing
with file-encoding=auto, I didn't provide a clear way to see what
encoding the current buffer is using in the [Settings] window.  I can
do a ":show-var" and see $encoding in the [Variables] buffer, and
that's consistent.
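
That is, from the colon prompt (the layout of the [Variables] buffer is
only approximate here):

    :show-var
    $encoding utf-8      <- what I'd expect to see for this case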

--
Thomas E. Dickey
http://invisible-island.net
ftp://invisible-island.net
