TL;DR
Osaurus has announced a new software layer for Mac that allows users to run AI models locally or in the cloud, offering greater control and security. The open-source tool supports multiple models and integrates with popular cloud providers, aiming to reduce reliance on data centers.
Osaurus has officially launched a Mac software platform that lets users run AI models locally or connect to cloud-based providers, combining flexibility with a security-focused design.
Developed as an open-source project, Osaurus lets Mac users choose from a variety of AI models: locally hosted options such as MiniMax M2.5 and DeepSeek V4, or cloud-based services such as OpenAI and Anthropic. The platform acts as a control layer, or "harness," integrating different models and tools behind a single interface. It emphasizes security by running AI in hardware-isolated virtual sandboxes, reducing the attack surface common to developer-focused tools.
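The article does not document Osaurus's actual API, but the "harness" idea can be sketched in a few lines: one entry point that inspects the requested model and routes it to an on-device backend or a cloud provider. All model and provider names below are illustrative assumptions, not the real Osaurus configuration.

```python
# Minimal sketch of a model "harness": one interface, many backends.
# Model and provider names are illustrative assumptions only.

LOCAL_MODELS = {"minimax-m2.5", "deepseek-v4", "llama", "gpt-oss"}
CLOUD_PROVIDERS = {"openai", "anthropic"}

def route(model: str) -> str:
    """Return which backend should serve a given model name."""
    name = model.lower()
    if name in LOCAL_MODELS:
        return "local"                  # served on-device
    provider = name.split("/", 1)[0]    # e.g. "openai/gpt-4o" -> "openai"
    if provider in CLOUD_PROVIDERS:
        return provider                 # forwarded to the cloud API
    raise ValueError(f"unknown model: {model}")

print(route("DeepSeek-V4"))    # local
print(route("openai/gpt-4o"))  # openai
```

The point of such a layer is that plugins and the user interface talk only to `route`-style dispatch, so swapping a cloud model for a local one requires no changes elsewhere.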
To run local models effectively, systems should have at least 64 GB of RAM, with larger models like DeepSeek V4 requiring around 128 GB. Despite these resource demands, Osaurus co-founder Terence Pae highlighted significant improvements in local AI capabilities over recent years, noting that models can now perform complex tasks like writing code, browsing, and ordering goods, all on a Mac. The platform also supports over 20 native plugins, including Mail, Calendar, and Filesystem, and recently added voice capabilities.
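The 64–128 GB figures roughly track a simple back-of-envelope calculation: weight memory is parameter count times bytes per weight, plus runtime overhead for caches and activations. The bit-widths and overhead factor below are common assumptions, not numbers from Osaurus.

```python
def model_memory_gb(params_billions: float,
                    bits_per_weight: int,
                    overhead: float = 1.2) -> float:
    """Rough unified-memory estimate for an LLM's weights.

    overhead (assumed ~20%) covers KV cache and runtime buffers.
    """
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# A hypothetical ~70B-parameter model at 4-bit quantization:
print(round(model_memory_gb(70, 4), 1))  # ≈ 42.0 GB
```

By this estimate, a 4-bit-quantized model in the low-hundreds-of-billions of parameters lands in the 64–128 GB range the article cites, which is why only high-memory Macs qualify.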
Why It Matters
This development matters because it offers Mac users a way to run AI models privately on their own hardware, reducing reliance on cloud services and data centers. It addresses privacy concerns, especially relevant for sectors like healthcare and legal, and could lead to decreased energy consumption and infrastructure demands. The platform’s flexibility and security features may accelerate adoption of local AI, impacting the broader AI ecosystem and cloud provider strategies.
As an affiliate, we earn on qualifying purchases.
Background
Osaurus originated from the idea of a personal AI assistant that could operate entirely on a Mac, inspired by user feedback on existing AI apps that still relied heavily on paid tokens and cloud processing. The project, led by Terence Pae, who previously worked at Tesla and Netflix, evolved over the past year into a versatile, open-source solution. The platform supports a wide range of models, both open-source and proprietary, and aims to bridge the gap between resource-intensive local AI and the cloud-based services that currently dominate the industry.
“The potential of local AI is growing rapidly, with models now capable of doing much more than just basic tasks. We believe Osaurus can empower users to run AI privately and efficiently on their Macs.”
— Terence Pae, Osaurus co-founder
“Our goal is to make AI more accessible and secure for individual users and businesses, reducing dependence on cloud infrastructure and promoting privacy.”
— Sam Yoo, Osaurus co-founder
What Remains Unclear
It remains unclear how widely adopted Osaurus will become among mainstream users or enterprises, given the current hardware requirements and technical complexity. The performance of local models on typical consumer hardware is still limited, and the scalability for larger organizations is yet to be demonstrated.

What’s Next
Next steps include expanding support for additional AI models, streamlining the user interface, and exploring enterprise applications, particularly in privacy-sensitive sectors. The team is also participating in the Alliance startup accelerator, which could open further funding and development opportunities.
Key Questions
Can I run Osaurus on any Mac?
Osaurus requires a Mac with at least 64 GB of RAM for most models, and 128 GB for larger models. Compatibility with older Macs may be limited due to hardware demands.
Does Osaurus support all AI models?
It supports a variety of models including MiniMax M2.5, Gemma 4, Qwen3.6, GPT-OSS, Llama, and DeepSeek V4, along with Apple’s on-device models and cloud providers like OpenAI and Anthropic.
Is Osaurus secure for enterprise use?
Osaurus runs AI models in hardware-isolated sandboxes, which enhances security. However, enterprise deployment details and compliance standards are still under development.