Re: [NonGNU ELPA] New package: llm


From: Daniel Fleischer
Subject: Re: [NonGNU ELPA] New package: llm
Date: Sat, 19 Aug 2023 11:15:37 +0300
User-agent: Gnus/5.13 (Gnus v5.13)

Local LLMs are usually run with the Python `transformers' library; to
interact with them over a REST API, some glue code is needed, for
example:

https://github.com/go-skynet/LocalAI

The API is modeled on OpenAI's, which is what other providers are
following as well, and is therefore relevant to the API access the llm
package is going to offer.
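
To make this concrete, a client only has to speak the OpenAI-style
chat-completions protocol to talk to such a local server. A minimal
sketch in Python, assuming a hypothetical LocalAI instance listening on
localhost:8080 with a model configured under the name "falcon-7b" (the
port and model name are assumptions for illustration, not something
stated in this thread):

import requests  # third-party HTTP library

# POST to the OpenAI-compatible chat-completions endpoint exposed by the
# local glue server (LocalAI or similar).
resp = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "model": "falcon-7b",  # hypothetical name configured server-side
        "messages": [{"role": "user", "content": "Hello!"}],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])

The same request shape works against OpenAI.com itself, which is why an
Emacs package can target both with one client implementation.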


Richard Stallman <rms@gnu.org> writes:

> We are slightly miscommunicating.  Yes there are models that could run
> locally on your machine, but all the ones I know of were released
> under a nonfree license.

> Could you confirm that this is a language model itself, not the
> program that runs the language model?

The most popular software framework for running LLMs is called
`transformers' (named after the models' architecture):

https://github.com/huggingface/transformers  (Apache 2)

Huggingface also offers free hosting for models and data sets. There are
several families of free models: 

- XGEN     https://huggingface.co/Salesforce/xgen-7b-8k-base
- MPT      https://huggingface.co/mosaicml/mpt-7b
- Falcon   https://huggingface.co/tiiuae/falcon-7b

These are git projects, e.g. see
https://huggingface.co/tiiuae/falcon-7b/tree/main.

These models are released under Apache 2. Each model contains the
weights (compressed numerical matrices) and possibly some Python code
files it needs, and they explicitly depend on the `transformers' library
and the `pytorch' neural-networks library (BSD-3).
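
As a rough illustration of how such a model is run locally, here is a
minimal sketch using the `transformers' pipeline API; "tiiuae/falcon-7b"
is one of the repositories listed above, and trust_remote_code=True is
what allows the Python code files shipped inside the model repository to
be executed. The exact arguments can differ per model, so treat this as
a sketch rather than a recipe:

from transformers import pipeline

# Downloads the weights and bundled Python files from the Hugging Face
# repository and runs text generation on top of pytorch.
generate = pipeline(
    "text-generation",
    model="tiiuae/falcon-7b",
    trust_remote_code=True,  # execute the model's own Python code files
)
print(generate("Emacs is", max_new_tokens=20)[0]["generated_text"])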

-- 
Daniel Fleischer


