Local-first by default
Ollama-friendly AI runtime, local storage, and explicit toggles for any optional web access.
Installable, verifiable AI assistance and retrieval/search that stay grounded and privacy-preserving.
No blind scripts: view the source, verify checksums/signatures, and keep compute opt-in and policy-governed.
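The checksum step above can be sketched in a few lines. This is a minimal illustration, not the project's actual verification tooling; the helper name `verify_checksum` and the chunk size are assumptions for the example.

```python
import hashlib

def verify_checksum(path: str, expected_sha256: str) -> bool:
    """Compare a file's SHA-256 digest against a published checksum.

    Hypothetical helper for illustration: reads the file in chunks so
    large artifacts are hashed without loading them fully into memory.
    """
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_sha256.lower()
```

In practice the expected digest would come from a signed release manifest, so a mismatch means the download should be discarded rather than executed.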
Citations and source labels across docs, catalog, grid, and web (when web access is enabled), with prompt-injection defenses.