A NAS is a great addition to any homelab setup. It can back up your photos and other files, run a Jellyfin or Plex server, and replace a few of your subscription services.
It is tempting to think that they’d also be a good way to get into local AI. Unfortunately, their hardware really isn’t up to the job. If you need something compact and energy efficient, there are better, more cost-effective options available.
A NAS isn’t the answer for local AI
They’re convenient but not the best
NAS units are readily available, relatively affordable, and popular, since they can help you cut subscription costs. Increasingly, they’re also being used to self-host some AI applications. However, you shouldn’t buy one specifically for AI.
Most major NAS manufacturers build their units with low-power CPUs meant for serving files without consuming a ton of electricity, not handling AI workloads. Even if you have a high-end unit with 32GB of RAM or more, you’re looking at only a few tokens per second. You’d find yourself waiting on a machine that generates text slower than you can type.
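To see why the token rate is so low, it helps to know that text generation is largely memory-bandwidth bound: every generated token streams roughly the full model weights through memory once. Here's a rough back-of-envelope sketch; the bandwidth and model-size figures are illustrative assumptions, not measurements of any specific NAS.

```python
# Illustrative ceiling for decode speed: tokens/sec is roughly
# (memory bandwidth) / (bytes read per token), and for a dense model
# the bytes per token are approximately the size of its weights.
def est_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Upper-bound decode speed, assuming the full weights stream once per token."""
    return bandwidth_gb_s / model_size_gb

# Assumed numbers: a NAS-class box with slow DDR4 (~17 GB/s effective)
# vs. an M4 Mac Mini (~120 GB/s), each running a 7B model at Q4 (~4 GB).
nas = est_tokens_per_sec(17, 4)
mac = est_tokens_per_sec(120, 4)
print(f"NAS-class box: ~{nas:.0f} tok/s ceiling")
print(f"Mac Mini:      ~{mac:.0f} tok/s ceiling")
```

Real-world numbers come in below these ceilings, but the ratio between platforms is what matters: the NAS starts an order of magnitude behind before compute even enters the picture.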
If you already have a NAS, there is certainly no harm in using it for AI—just don’t spend money on one with that purpose in mind.
What matters most for AI
NAS hardware is optimized for things like power consumption and number of drive bays—neither of which really help with AI workloads. You shouldn’t pay too much attention to NPU marketing at this point either. Most of the important AI tools, like Ollama, llama.cpp, and LM Studio, don’t currently send AI workloads to the NPU anyway.
Note: It may be possible for future software to simultaneously take advantage of an integrated GPU and an NPU.
Memory bandwidth is also a significant limitation, which is why the unified memory approach used by modern Macs gives them an edge over other mini PCs that use RAM sticks.
A Mac Mini is a surprisingly good option
Great performance for all tasks
If you need something compact and power efficient to run small to medium-sized AI models, a Mac Mini is a pretty good option. The Mac Mini can be equipped with up to 64GB of unified memory for $2,000, which allows it to easily run models with 30 billion (30B) parameters. You could probably squeeze in some quantized models with up to 70B parameters, but you’re going to run into some performance bottlenecks.
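Whether a model fits is mostly a question of weights size: parameter count times bytes per weight, plus some headroom for the KV cache and runtime. A quick sketch of that arithmetic; the 20% overhead figure is a rough assumption, and real usage grows with context length.

```python
def model_footprint_gb(params_billion: float, bits_per_weight: float,
                       overhead: float = 1.2) -> float:
    """Approximate memory needed: weights plus ~20% for KV cache and runtime.
    The 20% overhead is an assumption; long contexts need more."""
    weight_gb = params_billion * 1e9 * bits_per_weight / 8 / 1e9
    return weight_gb * overhead

# Check a few sizes against the Mac Mini's 64GB ceiling
for label, params, bits in [("30B @ Q4", 30, 4),
                            ("70B @ Q4", 70, 4),
                            ("70B @ FP16", 70, 16)]:
    gb = model_footprint_gb(params, bits)
    print(f"{label}: ~{gb:.0f} GB")
```

By this estimate a 30B model at 4-bit lands around 18GB and a 4-bit 70B around 42GB, so both squeeze into 64GB, while a 70B model at full 16-bit precision is far out of reach, which matches the quantization caveat above.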
- Storage: 256GB
- CPU: Apple M4 10-Core
- Memory: 16GB
- Graphics: 10-Core M4 GPU

Powered by an impressive M4 chip, the redesigned Mac Mini starts with 16GB RAM, 256GB SSD, a 10-core CPU, and a 10-core GPU.
Since it works out of the box with LM Studio and Ollama, you can easily switch between using it as your day-to-day PC, home server, or dedicated local AI box.
The Mac Mini has one limit
The only real drawback of the Mac Mini is the memory limit. With a 64GB max on the M4 Pro, you won’t be able to run models with more than about 70 billion parameters. If you need to run larger models, you have to jump up to a Mac Studio, which is significantly more expensive.
A mini PC with an AMD AI chip is great too
More memory than you know what to do with
However capable the Mac Mini is, it’s not the only mini PC that can serve as an at-home AI server. If you’re looking for a reasonable option that costs less than the Mac Mini, start with mini PCs that have an AMD Ryzen AI 9 HX 370.
As one example, MINISFORUM produces a mini PC with a Ryzen AI 9 HX 370 and 32GB, 64GB, or 96GB of RAM that starts around $1,100. Multiple manufacturers produce models with up to 96GB of RAM, though those usually cost around $2,000.
If you need even more power, I’d suggest looking at mini PCs with the Ryzen AI Max+ 395 or the AI Max+ 388. They support up to 128GB of unified memory, and you could allocate up to 96GB as VRAM. This allows you to comfortably run 70B+ models that would be impossible to fit on a Mac Mini.
Unfortunately, mini PCs with the AI Max+ 395 processor and 128GB of RAM are pretty pricey, though they tend to be a bit less expensive than the Mac Studio with an equivalent amount of RAM. When the newer AI Max+ 388 becomes widely available, it’ll likely be a bit cheaper and may be a good option if you’re looking to save some money on an AI PC.
ROCm provides an edge
AMD’s ROCm has matured quite a bit, and it now runs llama.cpp, Ollama, and LM Studio without too many issues. It’s a reasonable choice if you prefer Linux for headless servers.
AMD claims that you could see up to 12.2x faster time-to-first-token than Intel Lunar Lake on some models, so the speed is potentially there. As an added perk, if you get a mini PC with OCuLink ports, you can add an external GPU later on if you want the extra performance.
You shouldn’t skip the NAS
Just because a NAS won’t be great for running AI models doesn’t mean it’s worthless. It’s great for what it is: a convenient storage device. If you don’t already have an in-home backup solution of some kind, I’d generally recommend buying one.
If you want a device that is both a NAS and an AI server, the most cost-effective option is to build your own from refurbished or secondhand parts. You can add as much RAM as you want, pick a GPU with enough VRAM to run the models you want, and then continually add new drives as your storage needs grow. A home server like that would also be capable of self-hosting almost any other service you want, giving you the option to cut subscriptions in favor of more privacy-friendly alternatives.

