Related tags: llm, ai, privacy, machine-learning, chat, inference, chatbot, offline, macos, developer-tools
| Package | Type | Description | Version |
|---|---|---|---|
| llama.cpp | formula | LLM inference in C/C++ | 7540 |
| anythingllm | cask | Private desktop AI chat application | 1.10.0 |
| ava | cask | Run language models locally on your computer | 2024-04-21 |
| backyard-ai | cask | Run AI models locally | 0.37.0 |
| cortexso | formula | Drop-in, local AI alternative to the OpenAI stack | 0.1.1 |
| gpt4all | cask | Run LLMs locally | 3.10.0 |
| jan | cask | Offline AI chat tool | 0.7.7 |
| llamachat | cask | Client for LLaMA models | 1.2.0 |
| lm-studio | cask | Discover, download, and run local LLMs | 0.4.2,2 |
| localai | formula | OpenAI alternative | 3.8.0 |
| msty | cask | Run LLMs locally | 1.9.2 |
| msty-studio | cask | AI platform with local and online models | 2.2.1 |
| mstystudio | cask | AI platform with local and online models | 2.4.1 |
| mstystudio@latest | cask | Next-Generation Privacy-first AI platform with local and online models | |
| ollama | formula | Create, run, and share large language models (LLMs) | 0.13.4 |
| sanctum | cask | Run LLMs locally | 1.9.1 |
| shimmy | formula | Small local inference server with OpenAI-compatible GGUF endpoints | 1.8.2 |