macOS with Apple Silicon only

The Ferrari of RAG.
On your Mac.

m9 is a native desktop app that turns your Mac into an enterprise-grade retrieval machine. Hybrid pipeline, parallel embedding, absolute privacy. Your data never leaves your computer.

Request information
100%
Local
3x
Faster
0
Cloud data

Every detail engineered for performance

m9 is not a ChatGPT wrapper. It's a native RAG engine built from scratch to get the most out of Apple Silicon hardware.

Fused Hybrid Retrieval

A BM25 (keyword) and dense (semantic) pipeline, combined via Reciprocal Rank Fusion. It doesn't rely on vectors alone: it also finds exact matches for specific terms, names, acronyms and codes.
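The fusion step can be sketched in a few lines. This is a minimal illustration of Reciprocal Rank Fusion, not m9's actual implementation; the constant k = 60 is the commonly used default, and "optimized BM25+Dense weights" would add per-list weighting on top of it.

```typescript
// Reciprocal Rank Fusion: a document's score is the sum of 1 / (k + rank)
// over every ranked list (BM25, dense, ...) it appears in. The constant k
// damps the advantage of top ranks so both retrievers get a say.
function rrfFuse(rankings: string[][], k = 60): string[] {
  const scores = new Map<string, number>();
  for (const ranking of rankings) {
    ranking.forEach((docId, rank) => {
      scores.set(docId, (scores.get(docId) ?? 0) + 1 / (k + rank + 1));
    });
  }
  // Highest fused score first.
  return [...scores.entries()]
    .sort((a, b) => b[1] - a[1])
    .map(([docId]) => docId);
}
```

A document ranked well by both retrievers outscores one ranked well by only one, which is why exact keyword hits survive alongside semantic matches.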

Parallel Embedding

Leverages Apple Silicon Performance + Efficiency cores with batch-concurrent embedding. Hundreds of chunks per second, directly on the integrated GPU via Ollama.
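The batching side of this can be sketched as follows. Function names are illustrative, not m9's API; the `/api/embed` endpoint and `nomic-embed-text` model name follow Ollama's HTTP API, and the unbounded concurrency here is a simplification (a real pipeline would cap in-flight requests).

```typescript
// Split chunks into fixed-size batches.
function toBatches<T>(items: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Fire all batch requests concurrently; Ollama runs the embedding model
// on the GPU and returns one vector per input string.
async function embedAll(chunks: string[], batchSize = 32): Promise<number[][]> {
  const results = await Promise.all(
    toBatches(chunks, batchSize).map(async (batch) => {
      const res = await fetch("http://localhost:11434/api/embed", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ model: "nomic-embed-text", input: batch }),
      });
      const json = (await res.json()) as { embeddings: number[][] };
      return json.embeddings;
    }),
  );
  return results.flat();
}
```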

Total Privacy

No data leaves your Mac. LLM and embedding run locally with Ollama. No remote servers, no cloud APIs, no tracking. Your documents stay yours.

Multi-Query Expansion

Before searching, m9 reformulates your question into different variants to maximize recall. Synonyms, paraphrases and automatic reformulations expand search coverage.
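A minimal sketch of this expansion step, assuming an Ollama `/api/generate` call with a generic local model (`llama3` here is a placeholder, and the prompt wording is illustrative, not m9's):

```typescript
// Ask the local LLM for paraphrases of the question; search then runs
// with the original plus every variant, and the hits are fused.
async function expandQuery(question: string, n = 3): Promise<string[]> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3",
      stream: false,
      prompt: `Rewrite this question in ${n} different ways, one per line:\n${question}`,
    }),
  });
  const { response } = (await res.json()) as { response: string };
  return [question, ...parseVariants(response, n)];
}

// Keep non-empty lines, strip list markers ("1.", "-", ...), cap at n.
function parseVariants(text: string, n: number): string[] {
  return text
    .split("\n")
    .map((line) => line.trim().replace(/^\s*\d*[.)-]*\s*/, ""))
    .filter((line) => line.length > 0)
    .slice(0, n);
}
```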

All Formats

PDF, DOCX, TXT, Markdown, web pages. Ingestion from files, folders or integrated web crawler with configurable depth and automatic download of PDFs found online.
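The depth-limited, domain-filtered crawl can be sketched as a breadth-first walk. This is illustrative only: regex link extraction stands in for a real HTML parser, and the function names are not m9's.

```typescript
// Resolve every href in a page against its base URL.
function extractLinks(html: string, base: string): string[] {
  const links: string[] = [];
  for (const match of html.matchAll(/href="([^"]+)"/g)) {
    try {
      links.push(new URL(match[1], base).href);
    } catch {
      // skip malformed URLs
    }
  }
  return links;
}

// Domain filter: only follow links on the same host as the root.
function sameDomain(url: string, root: string): boolean {
  return new URL(url).hostname === new URL(root).hostname;
}

// Breadth-first crawl up to maxDepth levels from the root page.
async function crawl(root: string, maxDepth: number): Promise<string[]> {
  const visited = new Set<string>([root]);
  let frontier = [root];
  for (let depth = 0; depth < maxDepth && frontier.length > 0; depth++) {
    const pages = await Promise.all(frontier.map((u) => fetch(u).then((r) => r.text())));
    frontier = pages
      .flatMap((html, i) => extractLinks(html, frontier[i]))
      .filter((u) => sameDomain(u, root) && !visited.has(u));
    frontier.forEach((u) => visited.add(u));
  }
  return [...visited];
}
```

Links ending in `.pdf` would be routed to the PDF ingester instead of the HTML path.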

Precise Citations

Every answer includes citations with reference to the source file and exact chunk. Maximum transparency: you can always verify where every piece of information comes from.
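The shape of such a cited answer might look like this (illustrative types, not m9's actual data model):

```typescript
// Each citation points back to the source file and the exact chunk
// the answer was grounded in, so every claim can be traced.
interface Citation {
  file: string;       // path of the source document on disk
  chunkIndex: number; // position of the chunk within that document
  excerpt: string;    // the retrieved text the answer relies on
}

interface Answer {
  text: string;
  citations: Citation[];
}
```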

RAG pipeline in 6 stages

From ingestion to response, every stage is optimized for quality and speed.

1

Ingest

Parallel extraction from PDF, DOCX, TXT, Web

2

Chunk

Intelligent sentence-aware segmentation with overlap

3

Embed

Batch parallel embedding on Apple Silicon GPU

4

Index

Dual index: LanceDB (vectors) + MiniSearch (BM25)

5

Fuse

Reciprocal Rank Fusion with optimized BM25+Dense weights

6

Answer

Local LLM generates response with verifiable citations
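Stage 2 above, sentence-aware chunking with overlap, can be sketched like this. It is a simplification, not m9's segmenter: the sentence regex ignores abbreviations and quotes, and the size/overlap parameters are illustrative.

```typescript
// Pack whole sentences into chunks of up to maxChars characters, carrying
// the last `overlap` sentences into the next chunk so no thought is cut
// at a chunk boundary.
function chunkSentences(text: string, maxChars = 800, overlap = 1): string[] {
  const sentences =
    text.match(/[^.!?]+[.!?]+(\s|$)/g)?.map((s) => s.trim()) ?? [text];
  const chunks: string[] = [];
  let current: string[] = [];
  let length = 0;
  for (const sentence of sentences) {
    if (length + sentence.length > maxChars && current.length > 0) {
      chunks.push(current.join(" "));
      current = current.slice(-overlap); // overlap carries context forward
      length = current.join(" ").length;
    }
    current.push(sentence);
    length += sentence.length;
  }
  if (current.length > 0) chunks.push(current.join(" "));
  return chunks;
}
```

Overlapping chunks trade a little index size for retrieval quality: a fact straddling two chunks is still retrievable from either.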

Zero compromises on privacy

In an era when data is the new oil, m9 chooses a different path: everything stays on your Mac.

Your data never leaves. Ever.

m9 has no servers, no accounts, no telemetry. LLM and embedding run entirely locally via Ollama. The vector database (LanceDB) and metadata database (SQLite) are files on your disk. You can disconnect from the internet and m9 works exactly as before.

No Cloud · No Account · No Telemetry · Offline Ready · GDPR Compliant

Built with cutting-edge technologies

Modern stack, native macOS, no dependencies on external services.

Platform: Native macOS (Apple Silicon M1/M2/M3/M4)
Framework: Electron + React + TypeScript
Vector Database: LanceDB (Rust, embedded)
Metadata Database: SQLite (better-sqlite3)
LLM Provider: Ollama (local, any model)
Embedding: nomic-embed-text / bge-m3 / mxbai
Retrieval: Hybrid BM25 + Dense + RRF
Supported Formats: PDF, DOCX, TXT, MD, HTML (Web Crawler)
Web Crawler: Integrated, with PDF download and domain filter
License: Proprietary — Marinuzzi & Associati

Want m9 for your organization?

Fill out the form and we'll get back to you for a custom demo and quote.