Don't Buy a GPU Now: Wait for Next-Gen or Grab a Discount?

You are ignoring the elephant in the room: the incoming US tariffs. They will probably shift performance per dollar by a large percentage. Therefore, buying a Black Friday card isn't a bad idea...

Quit reading media. The tariffs are a bargaining chip. I doubt we will see any change in prices. You people need to look at 2016-2020. He's already served 4 years as president and the tariff threats actually kept inflation in check and brought down prices.
 
Once we get this close, I usually tell people to wait unless they absolutely need a new card, such as after a GPU failure. I have been buying lots of used hardware for years, to the point of it becoming a problem. You can snag tons of deals on used hardware in the first few months after a new product launches. Unless you want a warranty, which in many cases is worthless these days, there is very little reason to buy new if the used product is in good condition.
 
I agree with almost everything in the article, other than that I'd make a distinction between when new information will be available (likely in January) and when new cards will actually be available at MSRP for most purchasers (likely not any time soon).

I do not have a Microcenter nearby, so I expect my online shopping options to be full of sold-out listings and/or scalper prices for potentially months to come. I guess if Nvidia keeps lowering the value proposition, another possibility is shelves that do have inventory, at pre-scalped prices.
 
I agree with almost everything in the article, other than that I'd make a distinction between when new information will be available (likely in January) and when new cards will actually be available at MSRP for most purchasers (likely not any time soon).
Subtle but important distinction. This will likely depend on factors such as real availability, and if some of the cards are good value/well priced, then demand will be a factor too, which has not been the case for some releases of the current generation because, well, they sucked.
 
Yep, agree with this article. The question, though, is: should one buy a GPU soon after they're announced at CES, or wait until Black Friday/November 2025? My plan is to buy a new monitor this Black Friday (since, as another of your articles put it, that will be a game changer in terms of visuals, and my monitor today only has fake HDR) and buy a GPU next year, but as others have mentioned, tariffs (or the potential for tariffs) make me wonder if it's better to bite the bullet early. Probably won't really know the answer until after CES.
 
Most people (but not all) do not buy hardware to chase the maximum (theoretical) performance. Instead, they purchase new hardware when their old equipment reaches its limits and begins to hinder everyday tasks. In gaming, this means most people upgrade their GPUs when the older model can no longer sustain 60 fps (the refresh rate of 90+% of monitors) at high-to-ultra settings at 1080p. A 3060 Ti can achieve this performance.

Therefore, new GPUs are primarily focused on AI; de facto, they are no longer gaming cards. In AI applications, you need at most about 20 tokens per second (which is already much faster than you can read). If you need a card with 16 GB of VRAM, the 4060 Ti makes sense, because faster cards still deliver more than 20 tokens per second for any model that fits in the same VRAM, so there is no reason to pay extra. If you prefer 24 GB, choose the 3090, and for 32 GB, opt for the 5090.

Therefore, de facto there are only three options: the 4060 Ti, the 3090, and the 5090. No Intel or AMD options, and no other NVIDIA cards either. Isn't that simple? Just three cards to choose from: the 4060 Ti 16 GB, the 3090 24 GB, and the 5090 32 GB.
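The VRAM side of this argument can be sketched as a back-of-the-envelope check. The figures below are my rough assumptions, not measurements: roughly 1 byte per parameter at 8-bit quantization, plus ~20% overhead for KV cache and activations.

```python
# Back-of-the-envelope check of which cards can hold which local-model sizes.
# Assumptions: ~1 byte/param at 8-bit quantization, ~20% overhead for
# KV cache and activations. Both numbers are rough estimates.

def vram_needed_gb(params_billion: float, bytes_per_param: float = 1.0,
                   overhead: float = 0.2) -> float:
    """Rough VRAM needed: weights plus a fudge factor for KV cache/activations."""
    return params_billion * bytes_per_param * (1 + overhead)

def largest_fit_b(vram_gb: float, bytes_per_param: float = 1.0,
                  overhead: float = 0.2) -> float:
    """Largest model (in billions of params) that fits, under the same assumptions."""
    return vram_gb / (bytes_per_param * (1 + overhead))

for card, vram in [("4060 Ti", 16), ("3090", 24), ("5090", 32)]:
    print(f"{card} ({vram} GB): fits roughly a "
          f"{largest_fit_b(vram):.0f}B-param model at 8-bit")
```

So under these assumptions the three VRAM tiers map to roughly 13B-, 20B-, and 27B-parameter models, which is the kind of step the poster is treating as the only meaningful difference between cards.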
 
Quit reading media. The tariffs are a bargaining chip. I doubt we will see any change in prices. You people need to look at 2016-2020. He's already served 4 years as president and the tariff threats actually kept inflation in check and brought down prices.
You do have access to the Internet and Google, right? Did you know you can actually search for factual information, such as inflation charts from while Trump was in office? Inflation shot up under Trump.

And you are telling us that one of his main campaign promises is a media lie?

What's it like being part of a cult of ignorance?
 
I bought a 4070 Super in June to avoid the tariffs. Guess what? Biden delayed imposition of the tariffs without even saying why. Until they actually impose a NEW tariff on tech, I would take all these tariff threats with a grain of salt. Trump & Biden just want to "look serious" about restoring American jobs. They actually don't care, and the voters were warned about this.
 
Tim is very much biased towards nVidia. For nVidia cards it only needs to be a 25% discount, but for AMD it had better be 30%? Then favoring the 4070 over the faster 7800 XT, even with a 15% price difference, because of ray tracing? How will the 4070, with its gimped 12 GB of VRAM, fare in the coming RT games, when the RTX 5000 series inflates VRAM sizes and RT performance? Surely not well.
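The discount arithmetic in this complaint is easy to make explicit. As a sketch (the prices and relative-performance numbers below are placeholders, not benchmark data), the discount a card needs to match a rival's performance per dollar is:

```python
# How big a discount does card A need so that its price/performance
# matches card B's? Numbers in the example are hypothetical.

def discount_for_parity(price: float, rel_perf: float,
                        rival_price: float, rival_perf: float) -> float:
    """Fractional discount needed so price/perf equals the rival's."""
    # Price at which our perf-per-dollar equals the rival's.
    target_price = rival_price * (rel_perf / rival_perf)
    return max(0.0, 1 - target_price / price)

# Hypothetical: card A is $600 at 100% performance, card B is $550 at 110%.
d = discount_for_parity(600, 1.00, 550, 1.10)
print(f"Card A needs a {d:.0%} discount to match card B's value")
```

The point of working it out is that the "fair" discount falls directly out of the price and performance gaps; there is no principled reason for it to be 25% for one vendor and 30% for the other unless those gaps differ.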
 
Most people (but not all) do not buy hardware to chase the maximum (theoretical) performance. Instead, they purchase new hardware when their old equipment reaches its limits and begins to hinder everyday tasks. In gaming, this means most people upgrade their GPUs when the older model can no longer sustain 60 fps (the refresh rate of 90+% of monitors) at high-to-ultra settings at 1080p. A 3060 Ti can achieve this performance.

Therefore, new GPUs are primarily focused on AI; de facto, they are no longer gaming cards. In AI applications, you need at most about 20 tokens per second (which is already much faster than you can read). If you need a card with 16 GB of VRAM, the 4060 Ti makes sense, because faster cards still deliver more than 20 tokens per second for any model that fits in the same VRAM, so there is no reason to pay extra. If you prefer 24 GB, choose the 3090, and for 32 GB, opt for the 5090.

Therefore, de facto there are only three options: the 4060 Ti, the 3090, and the 5090. No Intel or AMD options, and no other NVIDIA cards either. Isn't that simple? Just three cards to choose from: the 4060 Ti 16 GB, the 3090 24 GB, and the 5090 32 GB.

Are you claiming that you have the market data to back all this up? Because otherwise this sounds like common sense, yet is about as contrary to fact as asserting that, historically, consumers bought new iPhones mostly when their last phone could no longer make calls.

Anyway, even if there is a degree of truth as to mainstream purchase patterns, here at an enthusiast site, I think you'll find a great many readers are interested in frame rates over 60, and resolutions greater than 1080p. (potential 9800X3D buyers excluded of course </wink>).
 
Yep, agree with this article. Question is though: should one buy a GPU soon after they announce them at CES, or wait until Black Friday/November 2025? My plan is to buy a new monitor this black Friday (since, as another of your articles put it, that will be a game changer in terms of visuals and my monitor today only has fake HDR) and buy a GPU next year, but as others have mentioned, tariffs (or potential for tariffs) makes me wonder if it's better to bite the bullet early. Probably won't really know the answer until after CES.

Not sure what you're looking for, but there are great deals to be had in the used 4K OLED TV market (if you're okay with 120Hz). I got a pristine C1 last spring for $280, and there are several C1s and C2s for sale in the low $400s. Awesome upgrade; the speed, viewing angles, color, and true HDR were most noticeable for me. And with LG, burn-in hasn't been a problem. The color shift took some getting used to, but nothing's perfect.

As for GPU, we don't have enough info on price/performance yet. I think a 7900XT under $500 is decent (XTX under $700...maybe). Completely agree with the article that current "discounts" just aren't good enough, especially with better price/performance/efficiency coming.
 
Not sure what you're looking for, but there are great deals to be had in the used 4K OLED TV market (if you're okay with 120Hz). I got a pristine C1 last spring for $280, and there are several C1s and C2s for sale in the low $400s. Awesome upgrade; the speed, viewing angles, color, and true HDR were most noticeable for me. And with LG, burn-in hasn't been a problem. The color shift took some getting used to, but nothing's perfect.

As for GPU, we don't have enough info on price/performance yet. I think a 7900XT under $500 is decent (XTX under $700...maybe). Completely agree with the article that current "discounts" just aren't good enough, especially with better price/performance/efficiency coming.
Thanks for the suggestion. I use my monitor for work and gaming since I work from home, so for me an OLED just isn't worth the burn-in risk until more burn-in data comes out, and the latest 9-month report from this site made me queasy about it, since I want to keep the monitor for 5+ years. So I am getting an LCD monitor, specifically the 32-inch Samsung Odyssey Neo G7 that this site recommended as the best 4K HDR LCD monitor (in their latest 4K monitor buying guide, separate from their best-monitors-at-every-$100-increment guide, which sadly only has OLEDs above $600). Amazon doesn't have much stock (of new monitors) left, so I actually pulled the trigger already, but it was at a decent discounted price ($650). I probably won't get the best discount that happens in a few days, but at least I won't be worried about not finding that model at all.
 
I'm buying a 5090.
At this point, there's no way I'd buy a 4000-series card, regardless of the discount.
As for mobile gaming in a laptop, however, I'd take a discounted mobile 4080 or 4090 if the price was right - but I'd still rather wait for the mobile 5000 series. This is an investment in FUTURE tech.
 
It will be hilarious to see the price of GPUs in Amurica in a few months. Even Australia will look like good value by comparison. Nvidia will already be raising 5090/5080 prices a lot irrespective of tariffs. Third parties have already stated Nvidia was gauging their reaction to $1,200-1,500 for the 5080 and $2-2.5K for the 5090.

AMD will be laughing as the RX 8800 XT/8700 XT sell up a storm early next year while Nv*****s whip themselves into a frenzy over a $1,400 5080 and a $2,200 5090 - and that's assuming scalpers don't bag a huge swathe of stock.
 
Quite a muddle we're all in to be fair.

I could say I don't care... I've got my 7900 XTX to last another gen, until AMD come back swinging... or not. I'm thinking the best RDNA4 card won't be enough of an upgrade over the 7900 XTX, and that Intel have left themselves too big a gap to bridge since Alchemist to catch an Nvidia that is about to take another stride forward. And Nvidia? Watch MSRP and the old standard pricing range for premium models become a bad joke again, and the flagship price record get smashed for the third gen in a row, after they made it so by playing us dirty... with AMD nowhere in sight above the mid range to offer 90% as good for half the price this time. Basically back to pre-Ampere days, only Nvidia cost 3x more, while AMD had kept far closer to Turing's per-tier and premium price ranges, with the gen-on-gen performance uplift you'd expect, up until now.

I'm leaning towards the side that reckons we've likely got a few tougher years ahead even without someone's tariffs or whatever. But I won't say this isn't a situation that at least half the PC gaming community ensured would happen, either.
 
It will be hilarious to see the price of GPUs in Amurica in a few months. Even Australia will look like good value by comparison. Nvidia will already be raising 5090/5080 prices a lot irrespective of tariffs. Third parties have already stated Nvidia was gauging their reaction to $1,200-1,500 for the 5080 and $2-2.5K for the 5090.

AMD will be laughing as the RX 8800 XT/8700 XT sell up a storm early next year while Nv*****s whip themselves into a frenzy over a $1,400 5080 and a $2,200 5090 - and that's assuming scalpers don't bag a huge swathe of stock.
Frankly, I think we're going to see a shortage of their "low end" cards. They don't need to sell them and, as they've shown, they'd rather sell AI chips to the enterprise market.

So they're just gonna sell 5080s and 5090s and let AMD have the midrange market with the 8800 XT and below.
 
You do have access to the Internet and Google, right? Did you know you can actually search for factual information, such as inflation charts from while Trump was in office? Inflation shot up under Trump.

And you are telling us that one of his main campaign promises is a media lie?

What's it like being part of a cult of ignorance?

Interesting take, considering the BLS tells a different story. Please click on the 10-year or max graph option and see how "inflation shot up under Trump": SMH. Keep in mind the Fed target is 2.0%. Now who's the "cult member"?

https://tradingeconomics.com/united-states/core-inflation-rate
 
At this point GPU ability is becoming far less important than GPU durability/quality/longevity.
For 1080p at medium to high settings, even a 2016-18 GPU like the Titan X (Pascal), 1080 Ti, or Titan Xp is still plenty; for 4K, anything from the RTX 2080 Ti onward gets the job done as well.
 
My 7900 XTX (MERC 310, $1,179) is nearly 2 years old now...

I am so glad EVGA dropped out and forced me to look at other options, as the RTX 4080 can't touch the XTX in the games I play, and many in my clan/guild sold their 4090s months later (at a profit) to buy the XTX, because it either matched or bettered the 4090 in raster games.

Even if they gave up a few frames, the trade-off was that the XTX didn't warm their whole house up and/or fry stuff. We are all waiting now for the Pro gaming card coming mid-'25 using RDNA 5 and a chiplet/HBM design.

We all know that RDNA is just better at gaming now (consoles/handhelds/etc.) than CUDA... which is an AI-driven architecture now.
Right now, if I were gaming at 1440p and looking to buy a GPU?

I would not hesitate for a moment to get my hands on a discounted 7900 XT or XTX. I game at 3840x1600 and get 230+ fps with the XTX, but at 1440p I was hitting 300+ frames.
 
I kinda feel AMD is on the right path for the audience they target. Yes, the 4090 is the fastest GPU on the market at the moment, but only a few people are willing to pay that much for a GPU. They are not trying to fight Nvidia on performance but on price.

The 4080 is also a nice choice, but in my currency it costs 6k more than the RX 7900 XTX. I like DLSS but not ray tracing, which is why I am gonna get the smaller 3060.
 