It’s actually worth running ChatGPT on a NAS – tech enthusiasts put an Nvidia RTX GPU into a 12-bay NAS, powered by an AMD EPYC CPU, and the results are surprising
If you’ve ever wondered what happens when you put a GPU in a NAS and run AI on it, like a private ChatGPT, then StorageReview has the surprising answer.
For the basis of his project, the site’s Jordan Ranous used QNAP’s TS-h1290FX, a 12-bay NVMe NAS powered by an AMD EPYC 7302P CPU and featuring 256GB of DRAM, 25GbE connectivity, and plenty of PCIe slots. He chose that NAS because it supports an internal GPU and can host up to 737TB of raw storage.
By adding an Nvidia RTX A4000 GPU to the TS-h1290FX and configuring it for AI using Virtualization Station (QNAP’s hypervisor for the NAS), Ranous was able to run AI workloads smoothly.
Nvidia’s Chat with RTX
Nvidia’s ChatRTX software suite handled the AI interaction side, providing a customized experience: a GPT-based LLM connected to a local, unique data set. This enabled fast, context-aware responses while keeping data private and secure.
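The core idea behind grounding an LLM in a local data set is a retrieval step: find the local documents most relevant to the query and feed them to the model as context. ChatRTX does this with GPU-accelerated embedding search; as a purely illustrative stand-in (not ChatRTX’s actual implementation), a toy keyword-overlap retriever looks like this:

```python
def retrieve(query: str, documents: dict[str, str], top_k: int = 1) -> list[str]:
    """Rank local documents by word overlap with the query (toy retrieval).

    Real systems like ChatRTX use vector embeddings instead of raw word
    overlap, but the shape is the same: score every local document against
    the query, then pass the best matches to the LLM as context.
    """
    query_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda name: len(query_words & set(documents[name].lower().split())),
        reverse=True,
    )
    return scored[:top_k]


# Hypothetical local files standing in for a user's private data set.
docs = {
    "nas_manual.txt": "Virtualization Station supports GPU passthrough on QNAP NAS models",
    "recipes.txt": "Preheat the oven to 200 degrees and bake for twenty minutes",
}

print(retrieve("how do I enable GPU passthrough", docs))  # → ['nas_manual.txt']
```

Because retrieval and inference both run on the NAS itself, the documents never leave the box, which is where the privacy claim comes from.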
StorageReview detailed the process: verifying hardware compatibility, installing the GPU, updating the QNAP firmware and software, and installing an operating system on the VM, before configuring GPU passthrough, installing the GPU drivers in the VM, and verifying that passthrough was working.
The ease of setting up the GPU for AI on the QNAP NAS suggests it could work as a cost-effective and efficient solution for businesses looking to harness the power of AI. As Ranous says: “We’ve shown that adding a decent GPU to a QNAP NAS is relatively easy and cheap. We put an A4000 to work, and with a street price of about $1050, that’s not bad considering Virtualization Station is free and NVIDIA ChatRTX is available for free.”