cortexso
Description:
Drop-in, local AI alternative to the OpenAI stack
Type: Formula  |  Latest Version: 0.1.1@0  |  Tracked Since: Dec 17, 2025
Links: Homepage  |  formulae.brew.sh
Category: AI/ML
Tags: ai llm local-ai openai inference
Install: brew install cortexso
About:
Cortexso provides a local, open-source AI inference server that is API-compatible with OpenAI, letting developers run LLMs on their own hardware without cloud dependencies. It simplifies deploying and managing AI models and acts as a drop-in replacement for the OpenAI SDK, giving AI-powered applications data privacy, no network round-trips, and cost control.
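Because the server mirrors the OpenAI REST API, an existing client usually only needs its base URL pointed at the local endpoint. A minimal sketch using only the Python standard library; the port and path (`http://localhost:39281/v1`) are assumptions for illustration and may differ from Cortexso's actual defaults:

```python
import json
import urllib.request

# Hypothetical local endpoint; check Cortexso's docs for the real host/port.
BASE_URL = "http://localhost:39281/v1"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style /chat/completions request aimed at the local server.

    The payload shape (model + messages list) is the standard OpenAI
    chat-completions format, which an API-compatible server accepts as-is.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Construct (without sending) a request; no API key is needed locally.
req = build_chat_request("llama3", "Hello!")
print(req.full_url)  # http://localhost:39281/v1/chat/completions
```

Sending the request with `urllib.request.urlopen(req)` (or swapping in the official `openai` client with a custom `base_url`) would then work against a running local server with no other code changes.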
Key Features:
  • OpenAI API compatibility
  • Local model inference
  • Easy model management
  • Drop-in SDK replacement
Use Cases:
  • Running AI applications locally for data privacy
  • Developing against OpenAI APIs without incurring cloud costs
  • Prototyping LLM features offline
Alternatives:
  • Ollama – Ollama focuses on ease of use via a CLI, while Cortexso emphasizes API compatibility with the OpenAI stack.
  • LocalAI – LocalAI offers broad feature support; Cortexso provides a more streamlined, specific drop-in replacement for OpenAI.
Version History:
Detected            | Version | Rev | Change       | Commit
Sep 16, 2025 7:14pm |         | 0   | VERSION_BUMP | 2243d976