Related: ai, developer-tools, productivity, machine-learning, cli, local-ai, chat, assistant, ollama, privacy
| Package | Type | Description | Version |
| --- | --- | --- | --- |
| notesollama | cask | LLM support for Apple Notes through Ollama | 0.2.6 |
| ollama | cask | Get up and running with large language models locally | 0.3.6 |
| ollama | formula | Create, run, and share large language models (LLMs) | 0.13.4 |
| ollama-app | cask | Get up and running with large language models locally | 0.16.1 |
| ollamac | cask | Interact with Ollama models | 3.0.3 |
| opencat | cask | Native AI chat client | 2.88.1,1909 |
| openclaw | cask | Personal AI assistant | 2026.2.14 |
| osaurus | cask | LLM server built on MLX | 0.10.7 |
| pdl | cask | Declarative language for creating reliable, composable LLM prompts | 0.9.2 |
| poe | cask | AI chat client | 1.1.39 |
| promptfoo | formula | Test your LLM app locally | 0.120.8 |
| qianwen | cask | AI assistant and chatbot powered by Alibaba's Qwen model | 2.0.0,2601311354 |
| qqqa | formula | Fast, stateless LLM for your shell: qq answers; qa runs commands | |
| ramalama | formula | Goal of RamaLama is to make working with AI boring | 0.15.0 |
| rawdog | formula | CLI tool to generate and run code with LLMs | 0.1.6 |
| rivet | cask | Open-source visual AI programming environment | 1.11.3 |
| sanctum | cask | Run LLMs locally | 1.9.1 |
| shimmy | formula | Small local inference server with OpenAI-compatible GGUF endpoints | 1.8.2 |
| tenere | formula | TUI interface for LLMs written in Rust | |
| void | cask | AI code editor | 1.99.30044 |
| yek | formula | Fast Rust-based tool to serialize text-based files for LLM consumption | |
| yuanbao | cask | Tencent AI Assistant with Hunyuan and DeepSeek LLMs | 2.56.0.621,9dee72359c29c952a90675fec81af6c6 |