Tag: inference (11 packages)
Related tags: machine-learning, ai, llm, local-ai, c++, quantization, openai, model-serving, deep-learning, mlops
| Package | Type | Description | Version |
|---|---|---|---|
| llama.cpp | formula | LLM inference in C/C++ | 7540 |
| cortexso | formula | Drop-in, local AI alternative to the OpenAI stack | 0.1.1 |
| djl-serving | formula | Universal model serving implementation | 0.35.0 |
| libtensorflow | formula | C interface for Google's open-source library for Machine Intelligence | |
| ncnn | formula | High-performance neural network inference framework | 20250916 |
| onnxruntime | formula | Cross-platform, high-performance scoring engine for ML models | 403d652dca4c1046e8145950b1c0997a9f748b57 |
| openvino | formula | Open Visual Inference and Optimization toolkit for AI inference | 2025.4.0 |
| ramalama | formula | Goal of RamaLama is to make working with AI boring | 0.15.0 |
| shimmy | formula | Small local inference server with OpenAI-compatible GGUF endpoints | 1.8.2 |
| swama | cask | Machine-learning runtime | 2.0.1 |
| text-embeddings-inference | formula | Blazing-fast inference solution for text-embedding models | 1.8.3 |
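Several of the packages above (cortexso, shimmy, and llama.cpp's bundled server) advertise OpenAI-compatible HTTP endpoints, which means one client can talk to any of them. A minimal sketch using only the Python standard library; the base URL, port, and model name here are assumptions to adapt to whichever server you actually run:

```python
import json
from urllib import request

# Hypothetical local endpoint -- adjust host/port for your server.
BASE_URL = "http://localhost:8080/v1"

def build_chat_request(prompt, model="local-model"):
    """Build an OpenAI-style chat-completions payload (pure, no I/O)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(prompt, model="local-model"):
    """POST the payload to an OpenAI-compatible server and return the reply text."""
    body = json.dumps(build_chat_request(prompt, model)).encode()
    req = request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Because the wire format is shared, switching between these servers is usually just a matter of changing `BASE_URL` and the model name.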