We are witnessing the death of the graphics card in real time, and I couldn’t be happier about it

A little over two years ago, I wrote about how integrated graphics were the future of gaming. I stand by what I said in that article – if anything, recent developments in the computer hardware industry have proven me right and further convinced me that we are witnessing the slow death of the graphics card.

That’s right: I think the dedicated GPU is going the way of the dodo. That’s almost a heretical statement coming from a seasoned PC gamer and system builder; I have over 500 games on Steam alone and have built a ridiculous number of PCs over the years, both for work and personal use. I’m not afraid of being crucified by Reddit for saying that I both believe and hope GPUs will become extinct, but I would honestly understand if it happened. It’s a dramatic statement to make.

The best graphics card is the number one priority when it comes to building a gaming PC and almost always the most expensive component in your system, and it’s a common desire among PC gamers to own a fully specced, liquid-cooled build with an RTX 4090 at its center. So why am I so convinced that we won’t be needing them for much longer?

The big graphic shake-up

The answer to that question comes in two parts: a look at the CPU industry, and a look at the AI industry. As anyone who knows me well will tell you, I think AI is pretty suspect, so let’s start with the CPU side of the story.

Earlier this year, we saw the triumphant arrival of Qualcomm’s Snapdragon X Elite chips at Computex 2024: a new contender in the laptop processor arena, something to finally challenge Intel’s dominance of the market – something AMD has failed to do for years. It was a strong showing for the new chips, but the part that stuck with me the most was seeing an ultrabook without a graphics card running Baldur’s Gate 3 at 4K.

Now I can see Gale’s beautiful face on a thin and light laptop without a dedicated GPU. Welcome to the future! (Image credit: Larian Studios)

Yes, CPUs with integrated graphics are getting better and better, even though Qualcomm itself claims it has no real plans to take over the gaming market. And it’s not all about Snapdragon: Intel plans to hit back with powerful gaming performance in its upcoming Lunar Lake chips, and AMD has had huge success with its custom chips for PC gaming handhelds like the Asus ROG Ally X, Lenovo Legion Go, and Valve’s Steam Deck. Sure, these chips can’t compete with the best 4K graphics cards when it comes to high-end gaming, but they’re more than able to provide a solid gaming experience.

There’s a big reason why integrated gaming is actually feasible now, and that’s software upscaling. Tools like Nvidia DLSS, AMD FSR, and Intel XeSS make this performance possible; at IFA 2024, my colleague John Loeffler saw an Asus ZenBook with an Intel Lunar Lake chip average 60 fps in Cyberpunk 2077 – a notoriously demanding game – at 1080p on medium settings thanks to XeSS.

It’s all about AI

XeSS and DLSS (but notably not AMD’s competing FSR upscaler) are powered by AI hardware, which leads me nicely into my next point: AI is killing the gaming GPU industry, and if it continues at its current pace, it threatens to swallow the industry whole.

Nvidia has been making waves in the AI space for a while now. While a potential slowdown in AI expansion caused Nvidia shares to drop last week, the company remains committed to its AI vision: CEO Jensen Huang’s Computex keynote was packed with AI-related plans that may or may not destroy the planet, and the company continues to release new AI-powered tools and ship hardware for training AI models around the world.

Jensen Huang keeps talking about AI and I can’t blame him: Nvidia has made a LOT of money off of it. (Image credit: Nvidia)

Jensen isn’t alone, either. Earlier this week, AMD Senior VP Jack Huynh revealed in an interview that AMD is seriously taking aim at the AI market, and one knock-on effect is that Team Red is pulling out of the high-end GPU race, so we likely won’t be getting a Radeon RX 8900 XTX, at least not anytime soon. Instead, AMD’s consumer efforts will focus on the budget to midrange segment, further closing the performance gap between its discrete graphics cards and integrated GPUs (iGPUs).

A shameful end for the humble graphics card?

Simply put, the increasing demand for GPUs for AI projects is incompatible with a future where GPUs remain a necessity for gaming PCs. It’s been clear for a while now that the focus is no longer on consumer hardware (especially for Nvidia), and with iGPUs improving faster than traditional graphics cards, I wouldn’t be surprised if the RTX 5000 series is the last generation of Nvidia GPUs aimed at gamers.

After all, nothing lasts forever. Sound cards and network adapters were integral to custom PC builds for years, but they were eventually phased out as motherboards improved and began to incorporate those features. When it comes to the demands of the average gamer, we’re probably not far off from CPUs that can handle everything you throw at them – even if that means gaming at higher resolutions.

I also won’t cry for the dedicated GPU when it finally dies out. Not only are they terribly expensive, but if I could improve my gaming performance by simply swapping out a single chip, future system upgrades would be faster and easier, and more compact PC builds would be possible. Yeah, I love my big RGB-filled tower, but it takes up too much damn space on my desk.