CPU bottlenecks are holding back modern GPUs more than you think


If you’re the kind of person who spends well over $1,000 on high-end GPUs, you probably expect a massive jump in performance across every game you play after each upgrade. I mean, that’s the whole point of splurging on something like the RTX 4090 or 5090, isn’t it? You’re not just chasing higher frame rates; you’re also expecting your games to feel smoother and more responsive, especially if you’re gaming on a 240Hz or 360Hz monitor. But if you’ve ever upgraded your GPU only to be underwhelmed by the FPS uplift, you’re not alone.

I face this issue myself as someone who has paired the RTX 4090 with a 5800X3D. And before you jump to conclusions, this CPU was one of the best options on the market when Nvidia launched the card in 2022. Sure, at 4K, this isn’t a big deal unless I’m playing competitive titles, but at 1440p, the CPU limitations are impossible to overlook. Likewise, you could pair a 9800X3D with the RTX 5090 and still have your GPU underutilized more often than you’d expect. And that’s what I want to talk about here.

You can’t cut corners if you’re spending over $1,000 on a GPU.

As much as we like to think that flagship GPUs like the RTX 4090 and 5090 are for 4K gaming, many competitive gamers pair them with 1440p ultra-high refresh rate monitors. On paper, that makes sense: you've got more than enough GPU power to push well past 200 FPS and gain an edge in fast-paced titles. But this is exactly where CPU limitations kick in, and your GPU ends up underutilized. Even with the fastest CPU available, you'll often see GPU usage dip well below 90% at these frame rates.

Chasing flagship GPUs for 1440p gaming isn't really worth your money, even if you have a CPU like the 9800X3D. Take the RTX 5090, for example. According to benchmark data, it's only about 12% faster on average than the RTX 4090 at 1440p, nowhere near the 27% jump you get at 4K. That gap comes down to CPU limitations, so all that extra GPU horsepower just doesn't translate into real-world performance gains. You're essentially paying for performance that your current CPU can't unlock.
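
To make the math behind that gap concrete, here's a minimal sketch of the usual bottleneck model: every frame has to pass through both the CPU (game logic, draw-call submission) and the GPU (rendering), so your frame rate is set by whichever stage is slower. The frame times below are made-up illustrative numbers, not measured benchmarks.

```python
# Illustrative bottleneck model: frame rate is capped by the slower of the
# CPU and GPU per-frame times. All frame times here are assumed values for
# the sake of the example, not measurements.

def effective_fps(cpu_frame_ms: float, gpu_frame_ms: float) -> float:
    """The slower stage sets the pace, so the larger frame time wins."""
    return 1000.0 / max(cpu_frame_ms, gpu_frame_ms)

cpu_frame_ms = 5.0  # assume the CPU needs ~5 ms per frame (a 200 FPS ceiling)

scenarios = {
    # resolution: (older GPU render time, newer GPU render time), newer ~25% faster
    "1440p": (4.5, 3.6),
    "4K": (10.0, 8.0),
}

for resolution, (old_gpu_ms, new_gpu_ms) in scenarios.items():
    fps_old = effective_fps(cpu_frame_ms, old_gpu_ms)
    fps_new = effective_fps(cpu_frame_ms, new_gpu_ms)
    gain = (fps_new / fps_old - 1) * 100
    print(f"{resolution}: {fps_old:.0f} -> {fps_new:.0f} FPS ({gain:+.0f}% from the faster GPU)")
```

With those assumed numbers, the 25% faster GPU delivers its full 25% at 4K but exactly 0% at 1440p, because the CPU's 5 ms per frame caps both cards at 200 FPS. Real games land somewhere in between, which is why the measured gap shrinks from 27% to 12% instead of disappearing entirely.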

There’s no doubt that most modern games tax GPUs so hard at 4K that you’ll need a high-end card like the RTX 4090 or 5090 to maintain playable frame rates unless you want to rely on upscaling or frame generation. In those situations, your GPU is doing exactly what you paid for, which is pushing visuals as far as possible while holding steady performance. But that only really applies to AAA titles that are naturally GPU-intensive, so your CPU doesn’t matter as much. You can even get away with using a CPU from a couple of generations ago and still get similar results.

That completely changes in competitive games, even at 4K. The moment you start chasing higher frame rates instead of visual fidelity, your CPU becomes far more important than most people expect. These games are designed to run at 200 FPS or more, which means your GPU often has plenty of headroom while your CPU struggles to keep up. When I play competitive titles like Valorant and Counter-Strike 2, my RTX 4090's usage rarely exceeds 80% at native 4K when paired with the 5800X3D, which tells you everything you need to know about where the bottleneck actually is.
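
If you want to see where your own bottleneck sits, logging GPU utilization while you play is the simplest test: sustained utilization well below roughly 95% in a demanding scene usually means the CPU is the limiter. Here's a minimal sketch using the nvidia-ml-py (pynvml) bindings; overlays like MSI Afterburner or a plain nvidia-smi query will show you the same number without writing any code.

```python
# Minimal GPU utilization logger for NVIDIA cards.
# Requires the nvidia-ml-py package: pip install nvidia-ml-py
import time

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    # Sample once per second while the game runs; press Ctrl+C to stop.
    while True:
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)
        print(f"GPU utilization: {util.gpu:3d}%")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

If that number hovers around 70-80% in a fast-paced shooter while your frame rate is already high, more GPU isn't what you're missing.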

You could argue that most gamers aren't playing at 200+ FPS, and I totally get that. The vast majority of people who buy an RTX 4090 or 5090 mostly play AAA games at 4K, where the GPU does most of the heavy lifting. In those games, you're rarely getting over 100 FPS unless you enable upscaling or frame generation anyway, so even a CPU that's a few years old isn't going to noticeably hold performance back.

However, there are people who enjoy high refresh rate gaming just as much as I do. After all, 240Hz and 360Hz OLEDs are popular these days for a reason. The moment you step into that territory, you’re not limited by how fast your GPU can render frames, but by how quickly your CPU can keep up. Sure, you could get the fastest CPU available today to minimize potential bottlenecks, but even then, you’ll still run into scenarios where your GPU isn’t fully utilized. At that point, you’re just sitting there wondering why you splurged on a 1440p/360Hz monitor and a flagship GPU when your CPU is holding everything back.
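
The refresh rate itself tells you how tight the CPU's budget gets. A quick back-of-the-envelope calculation, assuming the CPU has to finish all of its per-frame work (simulation, draw-call submission) inside one refresh interval to keep the panel fed with new frames:

```python
# Per-frame time budget at common refresh rates: 1000 ms divided by the rate.
for hz in (144, 240, 360):
    print(f"{hz} Hz -> {1000 / hz:.2f} ms per frame")
# 144 Hz -> 6.94 ms per frame
# 240 Hz -> 4.17 ms per frame
# 360 Hz -> 2.78 ms per frame
```

At 360 Hz, the CPU gets under 3 ms per frame, which is why even the fastest chips run out of headroom long before a flagship GPU does at 1440p.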

We’ve reached a point where throwing more GPU horsepower at a game doesn’t automatically translate into a better gaming experience, especially if you’re not playing at 4K. At 1440p and high refresh rates, the CPU decides how much of that performance you actually get to use. That’s why CPUs need to catch up, not just in raw performance, but in how well they can keep flagship GPUs fully utilized across different gaming scenarios. Until that happens, I don’t think even an RTX 6090 would help gamers get the most out of their ultra-high refresh rate monitors.