sanctum
Description:
Run LLMs locally
Type: Cask  |  Latest Version: 1.9.1@0  |  Tracked Since: Dec 28, 2025
Links: Homepage  |  formulae.brew.sh
Category: AI/ML
Tags: llm, ai, local-ai, privacy, chat
Install: brew install --cask sanctum
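Beyond the one-line install above, the usual Homebrew cask workflow applies. This is a sketch of the standard commands (assumes Homebrew is already installed; only the cask name `sanctum` comes from this page):

```shell
# Install the cask
brew install --cask sanctum

# Show installed vs. latest available version
brew info --cask sanctum

# Upgrade to the latest tracked version
brew upgrade --cask sanctum

# Remove the application
brew uninstall --cask sanctum
```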
About:
Sanctum is a desktop application for running large language models (LLMs) locally on your machine. It provides a user-friendly interface for downloading, managing, and interacting with various open-source models. Because inference happens entirely on-device, your data never leaves your machine and the app remains usable offline.
Key Features:
  • Local-first architecture for data privacy
  • Easy model management and switching
  • User-friendly chat interface
  • Support for popular open-source models
Use Cases:
  • Running AI coding assistants without sending data to the cloud
  • Experimenting with LLMs offline
  • Privacy-sensitive text analysis
Alternatives:
  • Ollama – a command-line tool with a developer focus, whereas Sanctum offers a GUI-first experience.
  • LM Studio – a direct competitor with a similar GUI for local model inference.
Version History
Detected  |  Version  |  Rev  |  Change  |  Commit