Just when we thought we were safe, ChatGPT is coming for our graphics cards


Everyone seems to be talking about ChatGPT these days thanks to Microsoft Bing, but given the nature of large language models (LLMs), a gamer would be forgiven for feeling a certain sense of déjà vu.

See, even though LLMs run on massive cloud servers, they rely on dedicated GPUs for the training they need before they can run at all. That usually means feeding a downright obscene amount of data through neural networks running on arrays of GPUs with advanced tensor cores, which not only consumes a lot of power but also requires a lot of physical GPUs to do at scale.