Re: Improvement proposals for `completing-read'
From: Daniel Mendler
Subject: Re: Improvement proposals for `completing-read'
Date: Thu, 8 Apr 2021 23:30:26 +0200
On 4/8/21 10:44 PM, Dmitry Gutov wrote:
>> I tried to integrate `fzf` with Consult async once: generating a
>> list outside Emacs, pushing it through `fzf` for fuzzy filtering and
>> presenting it to the user via completion. But it turned out that
>> most of the external implementations are not good enough for this
>> use case; for example, they don't offer an option to open a pipe to
>> update the filtering input. I could write my own external fuzzy
>> matcher backend which would work perfectly with async completion.
>> However, then I can also just wait for gccemacs :)
> I was thinking more about interactions over the network, with HTTP
> requests sent and received asynchronously. Mainly the cases where
> one uses the LSP protocol or similar.
Yes, this is all possible with async completion tables in Consult. There
is a consult-spotify package which queries a web API, and there is also
consult-lsp in the works, which accesses the LSP API
(https://github.com/minad/consult/issues/263).
>> You may want to take a look at my Consult package, specifically the
>> async functionality. I believe that this functionality can easily
>> be provided on top of the current infrastructure, and actually in a
>> nice way.
> You can check out Company's asynchronous convention for backends:
> https://github.com/company-mode/company-mode/blob/f3aacd77d0135c09227400fef45c54b717d33f2e/company.el#L456-L467
> It's a very simple lambda-based future-like value. It can be updated
> to use a named type, and with other features too. I think it's a
> clean and simple base to build on, though.
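The convention linked above can be illustrated with a minimal sketch. The backend name and the timer-based data source are hypothetical; the shape (returning a cons of `:async` and a "fetcher" that receives a callback) follows the documented Company convention:

```elisp
(require 'company)

(defun my-async-backend (command &optional arg &rest _)
  "Hypothetical Company backend illustrating the :async convention."
  (pcase command
    (`prefix (company-grab-symbol))
    (`candidates
     ;; Instead of a list of candidates, return (:async . FETCHER).
     ;; Company calls FETCHER with a callback; the backend invokes the
     ;; callback once the candidates are available.
     (cons :async
           (lambda (callback)
             ;; Simulate a slow source with a timer; a real backend
             ;; would query a subprocess or a network service here.
             (run-with-timer
              0.1 nil
              (lambda ()
                (funcall callback (list (concat arg "-foo")
                                        (concat arg "-bar"))))))))))
```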
Yes, this looks very simple. I actually prefer the functional style over
named types as used in Company. So how is this used? When completing,
the fetcher is called, and the results are displayed as soon as it
returns via the callback? But chunking is not possible and probably also
not desired? See below for my response regarding chunking in Consult.
>> In Consult I am using closures which hold the asynchronously
>> acquired data. The closure function must accept a single argument,
>> which is either a string (the new input) or a list of newly
>> obtained candidates.
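A minimal sketch of such a closure (not Consult's actual code; all names are made up): the candidate list lives in the closure's environment, and the function dispatches on the type of its single argument.

```elisp
(defun my-make-async-sink ()
  "Return a hypothetical async sink closure as described above."
  (let ((candidates nil))
    (lambda (action)
      (cond
       ;; A string is new user input: drop the stale candidates and
       ;; (in a real backend) restart the external source with it.
       ((stringp action)
        (setq candidates nil))
       ;; A list means newly obtained candidates: accumulate them.
       ((listp action)
        (setq candidates (append candidates action))))
      candidates)))

;; Usage sketch:
(let ((sink (my-make-async-sink)))
  (funcall sink "query")          ; new input resets the state
  (funcall sink '("cand1"))       ; first chunk arrives
  (funcall sink '("cand2")))      ; returns ("cand1" "cand2")
```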
> I'm not sure where to look, sorry.
Take a look at `consult--async-sink` at
https://github.com/minad/consult/blob/3121b34e207222b2db6ac96a655d68c0edf1a449/consult.el#L1264-L1297.
These `consult--async-*` functions can be chained together to produce an
async pipeline. The goal here was to have reusable functions which I can
glue together to create different async backends. See for example the
pipeline for asynchronous commands:
https://github.com/minad/consult/blob/3121b34e207222b2db6ac96a655d68c0edf1a449/consult.el#L1505-L1513.
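The chaining idea can be sketched as follows (a toy stage, not one of the real `consult--async-*` functions): each stage wraps a downstream sink and returns a new function of the same single-argument shape, so stages compose like ordinary functions.

```elisp
(defun my-async-uppercase (sink)
  "Hypothetical pipeline stage that upcases candidate lists.
Anything that is not a non-empty list is passed through unchanged."
  (lambda (action)
    (funcall sink (if (and (listp action) action)
                      (mapcar #'upcase action)
                    action))))

;; Stages compose inside-out, innermost sink first:
(let* ((sink (lambda (action) (message "sink got %S" action)))
       (pipeline (my-async-uppercase sink)))
  (funcall pipeline '("foo" "bar")) ; sink receives ("FOO" "BAR")
  (funcall pipeline "new input"))   ; strings pass through unchanged
```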
> I'm not 100% clear, but it sounds like chunked iteration. Which is a
> good feature to have. Though perhaps not always applicable to every
> UI (blinking with new results in a completion popup might not be
> user-friendly).
Indeed, the UI receives the candidates as soon as they come in. One has
to ensure, for example with a timer, that this does not freeze the UI.
Then there is also throttling when the user changes the input. It works
very well for the `consult-grep` style commands. You may want to try
those. They are similar to `counsel-grep` but additionally allow
filtering with the Emacs completion style. Take a look at
https://github.com/minad/consult#asynchronous-search in case you are
interested. Note that you don't necessarily have to use chunking. You
can use a single chunk if the whole async result arrives in bulk.
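The timer-based coalescing mentioned above could look roughly like this (a hypothetical sketch, not Consult's implementation): rapid incoming chunks schedule at most one pending refresh, so the UI redraws at a bounded rate instead of on every chunk.

```elisp
(defvar my-refresh-timer nil
  "Pending refresh timer, or nil if no refresh is scheduled.")

(defun my-schedule-refresh (refresh-fn)
  "Coalesce rapid candidate updates into one REFRESH-FN call.
If a refresh is already scheduled, do nothing; otherwise arm a
short timer so bursts of chunks trigger a single UI redraw."
  (unless my-refresh-timer
    (setq my-refresh-timer
          (run-with-timer 0.1 nil
                          (lambda ()
                            (setq my-refresh-timer nil)
                            (funcall refresh-fn))))))
```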
>> Now a single problem remains: if new data is incoming, the async
>> data source must somehow inform completion that new candidates are
>> available. In order to do this the async source must trigger the
>> UI, for example via icomplete-exhibit/selectrum-exhibit and so on.
>> It would be good to have a common "completion-refresh" entry point
>> for that. In Consult I have to write a tiny bit of integration code
>> for each supported completion system.
> See my link, perhaps.
>
> Or in general, a Future/Promise API has a way to subscribe to the
> value's update(s) (and the completion frontend can do that). Having
> to use a global variable seems pretty inelegant in comparison.
It is not a global variable but a function. But for sure, one could also
design the async future such that it receives a callback argument which
should be called when new candidates arrive. The way I wrote it in
Consult is that `consult--async-sink` handles a 'refresh action, which
then informs the completion UI.
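The callback-argument alternative discussed here could be sketched like this (all names hypothetical): the frontend subscribes by handing its refresh callback to the future, instead of the sink dispatching a 'refresh action itself.

```elisp
(defun my-make-future (start-fn)
  "Return a future; START-FN receives a callback for new candidates.
The returned future takes ON-UPDATE, the frontend's refresh hook,
which is invoked on every chunk of new candidates."
  (lambda (on-update)
    (funcall start-fn
             (lambda (new-candidates)
               ;; Forward each chunk to the subscribed frontend.
               (funcall on-update new-candidates)))))

;; Usage sketch: a fake source that produces one chunk immediately.
(let ((future (my-make-future
               (lambda (emit) (funcall emit '("a" "b"))))))
  ;; The frontend subscribes with its own refresh function:
  (funcall future (lambda (cands) (message "refresh with %S" cands))))
```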
> No hurry at all. Sometimes, though, a big feature like that can
> inform the whole design from the outset.
Yes, sure. When planning a big overhaul, you are certainly right. But
currently I am more focused on fixing a few smaller pain points in the
API, like retaining text properties and so on.
Daniel Mendler