Why I haven’t upgraded my GPU in almost three years – Smart Fone Video Blog

I have an RTX 3060 Ti in my desktop today, and I’ve had it for nearly three years, which is the longest I’ve ever stuck with a GPU. After getting my very first GPU (the R9 380 4GB) in 2015, I upgraded roughly every two years: to an RX 480 in 2016, an RX Vega 56 in 2018, a GTX 1080 Ti in 2020, and an RTX 3060 Ti in 2021. With the RTX 40 and RX 7000 series now fully launched, I have quite a few options for new cards, too.

But while I still feel like upgrading from time to time just for the fun of it, I’ve never followed through since I bought my 3060 Ti in January 2021. I’m not particularly attached to my 3060 Ti (I really should replace it since it has fire damage, no joke), but I can’t help but balk at today’s graphics cards due to a few pretty big dealbreakers.

Performance gains aren’t good enough, and features don’t make up for them

Source: Nvidia

I know I’m beating a dead horse, but I’m a pretty budget-conscious person, and I can’t stomach the idea of spending at least $500 on a graphics card just to get a significant boost in performance over my 3060 Ti. I spent about $550 on my Vega 56 in 2018, and it was nearly twice as fast as my RX 480, which I had purchased for $300 at launch. But today, I’m looking at a $600 purchase for the RTX 4070 (or maybe $500 for the 7800 XT) to replace my $400 3060 Ti, and I’m only getting about 30-40% better performance in games.
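To put some rough numbers on that comparison, here’s a quick sketch of performance-per-dollar across the two upgrades described above. All the figures are the article’s own estimates (launch prices and the “nearly twice as fast” / “30-40% faster” uplifts), not benchmark data:

```python
# Rough value comparison using the prices and uplift figures from the text.
# These are the article's own estimates, not measured benchmarks.

def perf_per_dollar(relative_perf: float, price: float) -> float:
    """Relative performance (baseline card = 1.0) divided by price in USD."""
    return relative_perf / price

# 2016 -> 2018: RX 480 ($300) to Vega 56 ($550), "nearly twice as fast"
old_jump = perf_per_dollar(1.9, 550) / perf_per_dollar(1.0, 300)

# 2021 -> 2023: 3060 Ti ($400) to RTX 4070 ($600), "about 30-40% faster"
new_jump = perf_per_dollar(1.35, 600) / perf_per_dollar(1.0, 400)

print(f"Vega 56 upgrade:  {old_jump:.2f}x the perf-per-dollar of the RX 480")
print(f"RTX 4070 upgrade: {new_jump:.2f}x the perf-per-dollar of the 3060 Ti")
# The Vega 56 upgrade roughly held perf-per-dollar steady at double the speed;
# the RTX 4070 upgrade actually loses perf-per-dollar versus the card it replaces.
```

Under those assumed numbers, the mid-cycle upgrade used to at least hold the value line while doubling performance; the current one costs more per frame than the card you already own.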

Of course, the great counterpoint is the features that Nvidia (and, to a certain extent, AMD) promise for new GPUs. Nvidia says DLSS 3 can quadruple your framerate, you get better ray tracing performance with the RTX 40 and RX 7000, and you also get AV1 encoding on some GPUs. However, I play quite a few older games like Skyrim, GTA V, and The Witcher 3, so these new features don’t matter to me all that much.

Of course, The Witcher 3 just happens to be a game that recently got an update adding new features like ray tracing and DLSS. So I once again hopped into a game I’ve already put 350 hours into. Needless to say, I wasn’t too impressed. The new DX12 version of The Witcher 3 performs worse than the original DX11 version, so DLSS is practically necessary, and I can’t tell the difference between ray tracing at max settings and disabling it entirely, apart from my framerate being cut by more than half.

To be clear, I’m not saying ray tracing and DLSS are useless. I only tried them in one game. But a faster GPU boosts performance in virtually every game, while these new features only matter in the games that support them. Today, only about 300 games support at least one upscaler, with even fewer supporting ray tracing. It’s just not for me, and my 3060 Ti isn’t nearly slow enough to make the cost of upgrading worth swallowing.

Both RTX 40 and RX 7000 have critical drawbacks

AMD Radeon RX 7800 XT render.

Source: AMD

Both the RTX 40 and RX 7000 series have aspects I like. RTX 40 cards are super power-efficient, fast, and have the latest features, while RX 7000 GPUs offer better value, more VRAM, and arguably better driver software. Everything I dislike about my 3060 Ti has to do with the software. Nvidia Control Panel is horribly outdated, GeForce Experience is annoying, and undervolting (which I prefer over overclocking) is tedious on Nvidia cards. I’m also worried that new RTX 40 cards just don’t have enough VRAM for the long term.

The grass isn’t greener (or redder) on the other side. When AMD announced its RX 7800 XT, I’ll admit I got pretty excited. It’s a $500 GPU with 16GB of VRAM, and AMD claims it’s quite a bit faster than the RTX 4070. Even if it’s only just as fast, its $100 discount is enough to make it one of the best current-gen gaming GPUs. But as I was looking at the spec sheet, I noticed something I didn’t like all that much: the TDP. At over 250W, the 7800 XT really pushes the envelope of what I can tolerate in a gaming GPU, since my PC uses a custom liquid loop with just a 240mm radiator, which is the only size that fits.
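The radiator concern can be sketched as a quick heat-budget check. The ~130W-per-120mm-section figure below is a common enthusiast rule of thumb for comfortable temperatures at moderate fan speeds (not a manufacturer spec), and I’m assuming a GPU-only loop; the board-power numbers are the vendors’ official figures:

```python
# Back-of-the-envelope heat budget for a 240mm radiator in a GPU-only loop.
# The per-section wattage is an assumed community rule of thumb for quiet
# operation, not a measured spec; actual capacity varies with fans and ambient.

RADIATOR_WATTS_PER_120MM = 130          # assumed rule of thumb
radiator_budget = 2 * RADIATOR_WATTS_PER_120MM  # 240mm radiator -> 260W

gpu_board_power = {
    "RTX 3060 Ti": 200,  # Nvidia's official TGP
    "RTX 4070": 200,     # Nvidia's official TGP
    "RX 7800 XT": 263,   # AMD's official TBP
}

for name, watts in gpu_board_power.items():
    headroom = radiator_budget - watts
    status = "fits" if headroom >= 0 else "over budget"
    print(f"{name}: {watts}W, {headroom:+d}W headroom ({status})")
```

Under those assumptions, the 3060 Ti and 4070 leave comfortable headroom, while the 7800 XT lands right at (or just past) what a quiet 240mm loop can shed.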

While these might seem like minor issues, they’re much more important to me now that new graphics cards aren’t bringing as much to the table as they used to. Even if I had all the money in the world, the fact that improvements are slowing down is also a fundamental issue when it comes to the idea of upgrading my GPU.

Desktop graphics cards just aren’t as exciting as they used to be

NVIDIA GeForce RTX 4090 and AMD Radeon RX 7900 XTX.

When Nvidia launched its RTX 20 series nearly five years ago, it was a turning point for GPUs. It was supposed to be the dawn of a new era for gaming graphics, the future of how we’d see and play our games. It promised more innovation than we’d ever seen, something more interesting than simply being able to crank up more graphics settings than before.

But as it turned out, things have gotten much less innovative. It takes longer for new generations to launch, we receive worse returns on performance and efficiency improvements, and all we really have to show for it are some upscaling features in a few hundred games and ray tracing in even fewer. Nvidia ditched GTX for RTX entirely, and it has shown DLSS far more love than ray tracing. AMD and Intel, meanwhile, are just trying to catch up.

Maybe I’m in the minority here. All I know is that I used to get really excited about gaming GPUs, and now I don’t as much. Truthfully, it’s Intel graphics, not Nvidia or AMD, that interests me most right now. Obviously, I’m not swapping out my 3060 Ti for an A750, but maybe the upcoming Battlemage cards will respark my excitement for gaming GPUs.

By puertoblack2003