Related tags: ai, productivity, developer-tools, local-ai, cli, chatbot, machine-learning, automation, privacy, macos
| Package | Type | Installs | Description | Version |
|---|---|---|---|---|
| llama.cpp | formula | 92,086 | LLM inference in C/C++ | 7540 |
| gemini-cli | formula | 88,828 | Interact with Google Gemini AI models from the command-line | 0.22.2 |
| context7-mcp | formula | 40,226 | Up-to-date code documentation for LLMs and AI code editors | 1.0.33 |
| repopack | formula | 21,729 | Pack repository contents into a single AI-friendly file | |
| gitingest | formula | 13,473 | Turn any Git repository into a prompt-friendly text ingest for LLMs | 0.3.1 |
| mlx-lm | formula | 3,153 | Run LLMs with MLX | 0.29.0 |
| oterm | formula | 2,291 | Terminal client for Ollama | 0.14.7 |
| mcphost | formula | 1,501 | CLI host for LLMs to interact with tools via MCP | 0.32.0 |
| macai | cask | 778 | Native chat application for all major LLM APIs | 2.4.2 |
| toktop | formula | 144 | LLM usage monitor in terminal | |
| 5ire | cask | | AI assistant and MCP client | 0.15.1 |
| agentkube | cask | | AI-powered Kubernetes IDE | 0.0.11 |
| aichat | formula | | All-in-one AI-powered CLI chat & copilot | 0.30.0 |
| aider | formula | | AI pair programming in your terminal | 0.86.1 |
| alma | cask | | AI chat application | 0.0.486 |
| anythingllm | cask | | Private desktop AI chat application | 1.10.0 |
| ava | cask | | Run language models locally on your computer | 2024-04-21 |
| backyard-ai | cask | | Run AI models locally | 0.37.0 |
| cai | formula | | CLI tool for prompting LLMs | 0.12.0 |
| chatall | cask | | Concurrently chat with ChatGPT, Bing Chat, Bard, Claude, ChatGLM and more | 1.85.110 |
| chatbox | cask | | Desktop app for GPT-4 / GPT-3.5 (OpenAI API) | 1.19.0 |
| chatglm | cask | | Desktop client for the ChatGLM AI chatbot | 1.1.7 |
| chatgpt | cask | | OpenAI's official ChatGPT desktop app | 1.2026.027,1769832365 |
| chatwise | cask | | AI chatbot for many LLMs | 0.9.80 |
| cherry-studio | cask | | Desktop client that supports multiple LLM providers | 1.7.19 |
| code2prompt | formula | | CLI tool to convert your codebase into a single LLM prompt | 4.2.0 |
| cortexso | formula | | Drop-in, local AI alternative to the OpenAI stack | 0.1.1 |
| doubao | cask | | AI chat assistant | 2.0.31 |
| fastmcp | formula | | Fast, Pythonic way to build MCP servers and clients | 2.14.1 |
| gorilla-cli | formula | | LLMs for your CLI | |
| gpt4all | cask | | Run LLMs locally | 3.10.0 |
| gptme | formula | | AI assistant in your terminal | 0.31.0 |
| gptscript | formula | | Develop LLM apps in natural language | 0.9.8 |
| huggingchat | cask | | Chat client for models on HuggingFace | 0.7.0 |
| itermai | cask | | Enable generative AI features in iTerm2 | 1.1 |
| jan | cask | | Offline AI chat tool | 0.7.7 |
| kagent | formula | | Kubernetes-native framework for building AI agents | 0.7.7 |
| langflow | cask | | Low-code AI-workflow building tool | 1.6.2,1.6.9 |
| langgraph-cli | formula | | Command-line interface for deploying apps to the LangGraph platform | 0.4.11 |
| llamabarn | cask | | Menu bar app for running local LLMs | 0.24.0 |
| llamachat | cask | | Client for LLaMA models | 1.2.0 |
| llm | formula | | Access large language models from the command-line | |
| lm-studio | cask | | Discover, download, and run local LLMs | 0.4.2,2 |
| lobehub | cask | | AI chat framework | 2.1.30 |
| localai | formula | | OpenAI alternative | 3.8.0 |
| msty | cask | | Run LLMs locally | 1.9.2 |
| msty-studio | cask | | AI platform with local and online models | 2.2.1 |
| mstystudio | cask | | AI platform with local and online models | 2.4.1 |
| mstystudio@latest | cask | | Next-generation privacy-first AI platform with local and online models | |
| nanobot | formula | | Build MCP agents | 0.0.46 |
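The Type column determines how a package is installed with Homebrew: formulae (command-line tools such as llama.cpp) use plain `brew install`, while casks (desktop applications such as lm-studio) use `brew install --cask`. A minimal sketch of that mapping; the `install_hint` helper is hypothetical, not part of Homebrew, and only prints the command you would run:

```shell
# Hypothetical helper: map a package name and its Type column to the
# Homebrew command that installs it. Casks need the --cask flag.
install_hint() {
  pkg=$1
  kind=$2
  if [ "$kind" = "cask" ]; then
    echo "brew install --cask $pkg"
  else
    echo "brew install $pkg"
  fi
}

install_hint llama.cpp formula   # prints: brew install llama.cpp
install_hint lm-studio cask      # prints: brew install --cask lm-studio
```

Running the printed command requires Homebrew itself to be installed (see brew.sh).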