https://lwn.net/Articles/971195/
Portable LLMs with llamafile [LWN.net]
Large language models (LLMs) have been the subject of much discussion and scrutiny recently. O [...]
https://www.theregister.com/2024/04/03/llamafile_performance_gains/
Llamafile LLM driver project boosts performance on CPU cores • The Register
Apr 3, 2024 - Way to whip that LLaMA's ass
https://www.mozilla.ai/open-tools/llamafile
llamafile - Run OS LLMs locally from a single executable file
Bundle a full LLM into a single executable, combining model weights, inference engine, and runtime. Use llamafile if you want the convenience, privacy, and...
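The Mozilla page above describes the core idea: model weights, the inference engine, and the runtime are packed into one executable. A minimal sketch of the typical workflow, assuming a published llamafile release (the model filename and download URL follow the project's README conventions and are illustrative, not verified here):

```shell
# Placeholder model name, patterned on the llamafile project's examples.
MODEL=llava-v1.5-7b-q4.llamafile

# 1. Fetch the single file (weights + engine + runtime in one binary);
#    URL is illustrative:
# curl -LO "https://huggingface.co/.../$MODEL"

# 2. Mark it executable; the same file runs on Linux, macOS, Windows,
#    and the BSDs via the Actually Portable Executable (APE) format:
# chmod +x "$MODEL"

# 3. With no arguments it serves a local chat UI; llama.cpp-style
#    flags do one-shot inference from the command line instead:
# ./"$MODEL" -p 'Why is the sky blue?'
```

The download and run steps are commented out because they need the real release URL and a multi-gigabyte fetch; the point is that there is no separate install step at all.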