Leaker disputes $2,500 RTX 5090 pricing, says increase over RTX 4090 likely minimal

midian182

Rumor mill: We've heard plenty of rumors about Nvidia's RTX 5000 Blackwell series of graphics cards, but one area that remains light on detail is pricing. A recent claim put the RTX 5090 at close to $2,500, but a more reliable leaker says this is untrue. According to their prediction, the flagship's price increase over the RTX 4090 won't be significant.

It was announced last week that Nvidia boss Jensen Huang will be delivering the company's CES 2025 opening keynote, marking the first time since 2019 that he has taken over presenting duties from Nvidia SVP Jeff Fisher. It suggests that Team Green will have something big on show, likely the RTX 5070, RTX 5080, and RTX 5090.

We've seen quite a few claims about the Blackwell cards' specs, but price details have been light. Last week, YouTube channel Moore's Law is Dead (MLID) said the RTX 5090 would be priced between $1,999 and $2,499, with the MSRP likely closer to the latter figure. If true, that would make it almost $1,000 more than the RTX 4090's $1,599 MSRP when it launched in October 2022.

Thankfully, MLID doesn't have the best reputation when it comes to these sorts of claims. A more reliable source, kopite7kimi, was asked for their take. The prolific leaker said there won't be a "significant" price increase for the RTX 5090.

Nvidia is certainly no stranger to releasing cards with comically expensive price tags. The RTX 3090 Ti carried a massive $1,999 MSRP when it arrived in March 2022. There was also the Titan RTX, which cost $2,499 in 2018 – though it was designed more for AI researchers, content creators, and data scientists.

Nvidia faced plenty of backlash over the way it priced (i.e., too high) the RTX 4000 Lovelace series, so one would imagine it will avoid the same mistakes with Blackwell. However, AMD has confirmed that its RX 8000 series won't be competing with its rival's top-tier next-gen cards, meaning the RTX 5080 and RTX 5090 will likely go unchallenged. That could give Nvidia the confidence to price its new GPUs higher than expected, knowing that those consumers who always demand the best products will be willing to pay. There's also the fact that around 78% of Nvidia's revenue comes from its data center business.

Kopite7kimi recently said that the RTX 5090 will include 32GB of GDDR7 VRAM on a 512-bit memory bus, feature 21,760 CUDA cores, and consume 600W. The RTX 5080, meanwhile, could have 16GB of VRAM, 10,752 CUDA cores, and a 320W TDP.


 
Going with 5nm again should reduce prices, but GDDR7 won't.
I still expect the 5090 to be 1999 dollars minimum: 1999 or 2199 dollars. 2499 would be too crazy, I think.

I will be surprised if the 5080 is less than 1000 dollars. I expect 1199 dollars, like the 4080 on release. Neither of these two cards will have any competition whatsoever, so why should Nvidia price them low? AMD will have nothing in this bracket.

Let's hope AMD can compete with the 5070 this generation. If even the 5070 beats the top Radeon 8000 card, then it is not looking good.
 
I don't care what Nvidia prices their top-tier cards at. I might sound absurd for saying this, but the 4090 might have been the only reasonably priced card in the 40-series lineup. So whatever the 5090 is priced at, let it be priced that high. If the 5090 is as good as the 4090 was, then I don't have an issue with a flagship, dominating card being priced at whatever NV wants.

My issue is with the midrange and mid-high-end. The last flagship card I had was the 1080 Ti. I'm currently on a 6700 XT, and it plays my games perfectly fine at 4K in the 90 FPS range (ESO and EVE).

I'm in a unique position where playing modern games at max settings would be "nice" but not a necessity.

It feels like we are in a bit of a dark time in gaming. AMD had more wiggle room to compete on price with the 7000/40 series, but Nvidia priced its stuff so high that AMD ended up raising the cost of its cards instead of having to compete hard on price. The 4070 was not a $600 card and the 7800 XT was not a $500 card. Both of them were priced about $100 too high for what they offered.

So let's hope that the dollar-per-FPS ratio comes down instead of holding steady like it did across the 30 and 40 series.

As a final note, now that I've migrated away from Windows, Nvidia cards aren't even an option for me anymore. I'll have to see what the 8800 XT looks like, but I think I'm gonna have to cozy up to my 6700 XT for another generation. To be honest, I might finally make the move to OLED rather than buy a new GPU in the next 12-18 months.
 
The 4090 last month was selling as low as $1,699 and sometimes slightly lower.
Currently, with supply at EOL, you can find it at $1,799 to $1,899.

Also, $1,999 would be significantly more expensive than the average price it was hovering at, and than what it is selling for now: $1,799 to $1,899.
 
I don't care what Nvidia prices their top-tier cards at...
You're right. Tech forums always have this disconnect, where many people don't realize how minuscule the market for high-end components is. The overwhelming majority of the market is on the mid-range, and the 60-tier cards outsell the high-end cards by an order of magnitude at least, even when they're awful like they currently are.

Cards like the 4090 and 5090 make zero impact on 99% of the consumer market. What makes or breaks this generation is how good the 5060 (and to a lesser extent the 5070) will be.
 
I can't see it going above $2000.

If it does stick nearer the $1600 mark that gives some hope for rational pricing of the lower cards.

If I were to guess I'd say that would put the 5080 at $1000. It splits the difference between what it should be ($800) and the previously poorly received $1200. At a grand, I think enough people will begrudgingly pay it to maximize total profit (rather than profit/card).

The 5070 could go a lot of ways. Depends on how cut down the card is and if they want a 5070 Ti right out of the gate.
 
Nvidia faced plenty of backlash over the way it priced (i.e., too high) the RTX 4000 Lovelace series, so one would imagine it will avoid the same mistakes with Blackwell.

I don't see why pricing the 5090 at $2.5k would be a mistake @Techspot.

1. AMD is not competing at the 5090 tier, so there is no competition.

2. The 90-series GPUs target whales for whom money isn't a concern.

If Nvidia does NOT price the 5090, a card that targets people with more money than common sense, at $2.5K when there is no competition at all, then they are selling themselves short and making a mistake.

The kind of buyer that usually buys a -90 series GPU will still gladly buy it whether it costs $2.5k, $3k or $4k, therefore NVIDIA should press their advantage and skin the fools.
 
The 5090 is probably built from cut-down dies that were rejected for B200 server GPUs. This is why there will likely never be a full-die 5090 Ti, just like there wasn't a 4090 Ti. The price might be set at $1,799, but they will still sell for over $2,000 after the initial batch is sold through. They will also remain scarce, as Nvidia will not cut down good dies it can make far more money from. The 5090 might be harder to get than the 4090, given the demand and the better yields of current 4nm wafers.
 
You're right. Tech forums always have this disconnect, where many people don't realize how minuscule the market for high-end components is. The overwhelming majority of the market is on the mid-range, and the 60-tier cards outsell the high-end cards by an order of magnitude at least, even when they're awful like they currently are.

Cards like the 4090 and 5090 make zero impact on 99% of the consumer market. What makes or breaks this generation is how good the 5060 (and to a lesser extent the 5070) will be.
My current recommendation for people looking for an upgrade is to go OLED if they haven't already. I think prices have gotten to the point where money is better spent on a display than on a graphics card. You can get a nice OLED monitor for the cost of a midrange card. Burn-in isn't the issue it once was, either.
 
The 5090 is probably built from cut-down dies that were rejected for B200 server GPUs. This is why there will likely never be a full-die 5090 Ti, just like there wasn't a 4090 Ti. The price might be set at $1,799, but they will still sell for over $2,000 after the initial batch is sold through. They will also remain scarce, as Nvidia will not cut down good dies it can make far more money from. The 5090 might be harder to get than the 4090, given the demand and the better yields of current 4nm wafers.
And why do you care? Nvidia can use cut-down dies and still beat the competition easily. Not all of their cores are cut down anyway.
 
2. The 90-series GPUs target whales for whom money isn't a concern.
The 4090, and I'm guessing the 5090, exist because not all 102 chips will make the cut for AI servers with fully functional dies. So, yes, target the crowd that can afford ultra elite gaming machines.
 
My current recommendation for people looking for an upgrade is to go OLED if they haven't already. I think prices have gotten to the point where money is better spent on a display than on a graphics card. You can get a nice OLED monitor for the cost of a midrange card. Burn-in isn't the issue it once was, either.

No, but text clarity is an issue on early-gen panels, especially at 1440p, and full-screen brightness plus HDR peak brightness is still gimped compared to OLED TVs. Even 4K OLED monitors lack DP 2.1 UHBR80, except for one Gigabyte monitor.

If you go 4K, you really need a high-end GPU or have to use upscaling, and here DLSS wins. DLSS is better than FSR in so many ways and has much wider support as well. Most new games have DLSS, especially demanding ones.
 
No, but text clarity is an issue on early-gen panels, especially at 1440p, and SDR peak plus HDR peak brightness is still gimped compared to OLED TVs. Even 4K OLED monitors lack DP 2.1 UHBR80, except for one Gigabyte monitor.
FFS, most people don't even use all those features, and we aren't talking about early-gen OLED panels. I'm telling people that a $500 OLED will give them a better gaming experience than a $1,000 GPU.

This "the best or nothing" mindset is exhausting. Too many people don't remember the dark times when people transitioned from CRTs to LCDs.

Life is made of compromises, and if you have enough money not to compromise, then you have no business being in a conversation about compromises.
 
FFS, most people don't even use all those features, and we aren't talking about early-gen OLED panels. I'm telling people that a $500 OLED will give them a better gaming experience than a $1,000 GPU.

This "the best or nothing" mindset is exhausting. Too many people don't remember the dark times when people transitioned from CRTs to LCDs.

Life is made of compromises, and if you have enough money not to compromise, then you have no business being in a conversation about compromises.
Most people don't use text on their monitor?

Why buy a new monitor if you don't look at HDR performance? HDR is already a game changer.

You don't get a quality OLED monitor for 500 dollars. You get a low-nit 1440p OLED monitor.
 
FFS, most people don't even use all those features, and we aren't talking about early-gen OLED panels. I'm telling people that a $500 OLED will give them a better gaming experience than a $1,000 GPU.

This "the best or nothing" mindset is exhausting. Too many people don't remember the dark times when people transitioned from CRTs to LCDs.

Life is made of compromises, and if you have enough money not to compromise, then you have no business being in a conversation about compromises.
OLED is a nice upgrade if you have an older monitor, but I'm not sure the difference is enough for people with a good, relatively recent, non-OLED panel. But I only use ultrawide so I could be missing something in 16:9 land.
 
OLED is a nice upgrade if you have an older monitor, but I'm not sure the difference is enough for people with a good, relatively recent, non-OLED panel. But I only use ultrawide so I could be missing something in 16:9 land.
I've been anti-OLED for a long time, but it's at the point where they're cheap enough that you can just replace them after 2-3 years if burn-in becomes an issue. And due to how OLEDs work, I've used 60Hz OLED panels that feel like 120Hz+ LCDs. There is essentially no switching time on an OLED, so "gray to gray" response times are irrelevant.

If you just bought a nice LCD then I'd say keep it, but I can sit down in front of a display and instantly tell if it is an OLED or not.

I'm in a unique situation as I use a 65" 4K TV as a monitor. I bought a top-of-the-line Samsung 5 years ago, and now when I go into a store, it isn't even a contest. The OLEDs in that category are now cheaper than what I paid for my QLED (hate that term) 5 years ago.
 
I've been anti-OLED for a long time, but it's at the point where they're cheap enough that you can just replace them after 2-3 years if burn-in becomes an issue. And due to how OLEDs work, I've used 60Hz OLED panels that feel like 120Hz+ LCDs. There is essentially no switching time on an OLED, so "gray to gray" response times are irrelevant.

If you just bought a nice LCD then I'd say keep it, but I can sit down in front of a display and instantly tell if it is an OLED or not.

I'm in a unique situation as I use a 65" 4K TV as a monitor. I bought a top-of-the-line Samsung 5 years ago, and now when I go into a store, it isn't even a contest. The OLEDs in that category are now cheaper than what I paid for my QLED (hate that term) 5 years ago.

So you have been anti-OLED because you can't afford OLED? I have been using OLED since 2018 and have yet to see burn-in of any kind, even at peak brightness.

My last 10 phones had OLED, if not more. Zero problems.

If burn-in was a huge problem for OLED, you would not see every high-end product go OLED these days. Samsung Display and LG Display are both 100% focused on OLED for now, and are building new OLED factories to meet demand. They have both abandoned LCD production, and Samsung Electronics and LG Electronics now buy third-party LCD panels.

OLED is the future till micro LED is ready for consumers, in 5-10 years.

LCD is used in mid-end stuff at most today. There are some gimmicky high-end LCD offerings, but they are a waste of money as OLED beats them. LCD needs thousands of dimming zones to even come close to OLED. You can't beat per-pixel control, infinite contrast, and perfect viewing angles.

However, text clarity is an issue on gen 1-2 OLED panels for 1440p monitors. 3rd-gen panels with QD-OLED or WOLED with MLA are great options, but far from 500 dollars; more like 1,000 dollars.

All 1440p OLED Ultrawides use gen 1 panels. Even the brand new ones.

Most 1440p OLEDs use the old panels as well. I still have not seen any 1440p OLEDs use gen 3 panels.

Gen 3 has a much better pixel layout for text, and is brighter and more durable. So don't cheap out on an OLED monitor or you will probably end up with a gen 1 panel, which will still beat LCD in most cases but has several cons compared to gen 3+.
 
You're right. Tech forums always have this disconnect, where many people don't realize how minuscule the market for high-end components is. The overwhelming majority of the market is on the mid-range, and the 60-tier cards outsell the high-end cards by an order of magnitude at least, even when they're awful like they currently are.

Cards like the 4090 and 5090 make zero impact on 99% of the consumer market. What makes or breaks this generation is how good the 5060 (and to a lesser extent the 5070) will be.

I know like 10 gamers IRL and none of them are using GPUs less than 500 dollars right now, so I don't really know where you got that from. Personal experience? Most of them buy in the 600-1,200 dollar range, and several are looking to pick up a 5080 once it hits.

What you are saying is true, though, for AMD GPU buyers. They rarely spend more than 200-400 dollars. Just look at the Steam HW Survey and you will see that AMD's most popular cards are the cheap ones. Meanwhile, tons of high-end Nvidia SKUs are listed in the top 25, and Nvidia dominates this list completely. I don't think AMD has a single dedicated GPU listed there, only iGPUs, like Intel.
 
So you have been anti-OLED because you can't afford OLED? I have been using OLED since 2018 and have yet to see burn-in of any kind, even at peak brightness.

My last 10 phones had OLED, if not more. Zero problems.

If burn-in was a huge problem for OLED, you would not see every high-end product go OLED these days. Samsung Display and LG Display are both 100% focused on OLED for now, and are building new OLED factories to meet demand. They have both abandoned LCD production, and Samsung Electronics and LG Electronics now buy third-party LCD panels.

OLED is the future till micro LED is ready for consumers, in 5-10 years.

LCD is used in mid-end stuff at most today. There are some gimmicky high-end LCD offerings, but they are a waste of money as OLED beats them. LCD needs thousands of dimming zones to even come close to OLED. You can't beat per-pixel control, infinite contrast, and perfect viewing angles.

However, text clarity is an issue on gen 1-2 OLED panels for 1440p monitors. 3rd-gen panels with QD-OLED or WOLED with MLA are great options, but far from 500 dollars; more like 1,000 dollars.

All 1440p OLED Ultrawides use gen 1 panels. Even the brand new ones.

Most 1440p OLEDs use the old panels as well. I still have not seen any 1440p OLEDs use gen 3 panels.

Gen 3 has a much better pixel layout for text, and is brighter and more durable. So don't cheap out on an OLED monitor or you will probably end up with a gen 1 panel, which will still beat LCD in most cases but has several cons compared to gen 3+.
I have no idea what is up with all the "you're just poor" accounts showing up lately, but I'm just going to not reply to them. All I will say is that I'm an adult and the tools in my work truck cost more than a new work truck.
 
PC gaming is dead. Just buy a PS5 Pro. Games are, at least, optimized for consoles.

Buying a 4090 to play at 1080p 30 FPS with RT is ludicrous...
 
YouTube channel Moore's Law is Dead (MLID) said the RTX 5090 will be between $1,999 and $2,499, with the MSRP likely closer to the latter figure.
If it's the same video I saw, his analysis was that Nvidia reps had been testing that message with various industry participants, which he took as a trial balloon, and he was skeptical of it flying (especially the "closer to $2,499" part). That's different from saying final pricing had been decided.
 
I know like 10 gamers IRL and none of them are using GPUs less than 500 dollars
You see, this is what we refer to as a "worthless anecdote". Reality does not revolve around the 10 people that some random nobody on the internet knows.

What you are saying is true, though, for AMD GPU buyers. They rarely spend more than 200-400 dollars. Just look at the Steam HW Survey
If you know about the Steam survey, why are you asking me where I got my idea from?

Look at the Steam survey, genius. The marketshare for xx60 models completely dwarfs the marketshare for more expensive cards. Hell, there are more 8-year-old GTX 1060s alone still in use today (2.75%) than all $1000+ cards added together (3090 + 4080 + 4080 Super + 4090 = 2.63%).

Meanwhile, tons of high-end Nvidia SKUs are listed in the top 25
First of all, no, there aren't "tons" of high-end Nvidia SKUs in the top 25. Depending on how you define high-end, only the RTX 3080 (launched at $700 four years ago but available for much cheaper since), the 4070 (launched at $600 but around $500 now), the 4070 Super, and the 4070 Ti could be considered high-end, and that's 4 out of 25. Meaning 21 of the top 25 (84%) are NOT high-end cards.

Second, why "top 25"? Did you notice that the top 10 contained zero high-end cards, and so you had to expand it to some arbitrarily large number so your argument wouldn't immediately fall apart, hoping nobody would call you out on it?

I have no idea what is up with all the "you're just poor" accounts showing up lately, but I'm just going to not reply to them. All I will say is that I'm an adult and the tools in my work truck cost more than a new work truck.
If someone writes that a product is "mid-end" without realizing how ridiculous that is, you can make assumptions about their level of intelligence and safely disregard everything they say.
 
Tom from MLID never said that Nvidia would likely charge close to $2,500, and contrary to what is claimed, he and his channel are actually a very reliable source for leaks.

Tom of MLID provides lots of context and caveats for the information he leaks, and this article blatantly misrepresents and attacks his channel without any reasonable justification.
 
So, yes, target the crowd that can afford ultra elite gaming machines.
I'd like to believe that a lot of those sales go to people who have non-gaming needs for, say, the extra VRAM, or where the performance makes a real difference to their workflow. But I admit I have no data one way or the other.

I also agree with the earlier comment that in some ways it feels like the 4090 was the only well-priced 40-series card. For me that meant I didn't buy any of them.
 
I can't see it going above $2000.

If it does stick nearer the $1600 mark that gives some hope for rational pricing of the lower cards.

If I were to guess I'd say that would put the 5080 at $1000. It splits the difference between what it should be ($800) and the previously poorly received $1200. At a grand, I think enough people will begrudgingly pay it to maximize total profit (rather than profit/card).

The 5070 could go a lot of ways. Depends on how cut down the card is and if they want a 5070 Ti right out of the gate.

I agree regarding it not going over 2,000, at least for the MSRP (some models definitely will be more than 2,000, as it's common for many models to be around 200-600 higher than MSRP). Tom from MLID never stated that an MSRP closer to 2,500 was likely, not even in his opinion, and he definitely didn't claim that any source was telling him the MSRP was likely to be close to 2,500, only that Nvidia was considering the possibility of pricing it close to that figure. He also said he thinks Nvidia might be deliberately leaking that info just to see how people react to that kind of price.

Tom of MLID is a very reliable source for leaks and has been misrepresented by the author of this article (Rob Thubron). Many of his leak reports turn out to be extremely accurate. The context and caveats Tom provides with leaked information are very carefully worded, and he frequently reminds people that even if leaked information accurately represents what the source believes to be true, plans can change at the last minute, especially for pricing. He often states when his source is only making an educated guess. If that guess turns out to be wrong, that doesn't make MLID an unreliable source, because he was only reporting what one person at a company was guessing, and he DIDN'T make the mistake of presenting that guess as what the company is definitely going to do.
 