Audience: New users and system administrators seeking a high-level overview of NeuralDrive.
Introduction
NeuralDrive is a specialized Linux distribution designed to turn any x86_64 computer into a high-performance, headless Large Language Model (LLM) inference server. By booting directly from a LiveUSB or LiveCD, you can deploy a complete AI stack—including GPU drivers, runtimes, and web interfaces—without modifying your existing operating system or performing complex manual installations.
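Creating the bootable LiveUSB typically amounts to verifying the downloaded image and writing it to a USB stick. A minimal sketch is below; the filenames are assumptions (substitute the actual release artifacts), and `/dev/sdX` is a placeholder for your USB device — double-check it with `lsblk` before writing, as `dd` will overwrite the target.

```shell
# Hypothetical filenames -- substitute the actual NeuralDrive release artifacts.
ISO=neuraldrive-amd64.iso

# Verify the image against its published checksum file before writing,
# to guard against a corrupted or incomplete download.
sha256sum -c "${ISO}.sha256"

# Write the image to the USB stick. Replace /dev/sdX with your device
# (check with lsblk); conv=fsync ensures data is flushed before dd exits.
sudo dd if="$ISO" of=/dev/sdX bs=4M status=progress conv=fsync
```

Any standard image-writing tool (e.g. balenaEtcher or Rufus) works equally well; `dd` is shown only because it is available on every Linux system.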
How to Use This Documentation
This guide is structured to take you from initial hardware selection to advanced model management. Throughout the manual, you will encounter audience badges that indicate the technical depth of specific sections:
- [User]: General concepts and web interface usage.
- [Admin]: Network configuration, security settings, and hardware management.
- [Developer]: API integration and custom image building.
If you are ready to begin, proceed directly to the Quick Start guide.
Version Note
This documentation covers NeuralDrive, which is based on Debian 12 (Bookworm). It includes support for the latest stable releases of Ollama and Open WebUI, along with the major GPU compute stacks, including NVIDIA CUDA 12.x and AMD ROCm 6.x.