Leaker disputes $2,500 RTX 5090 pricing, says increase over RTX 4090 likely minimal

I'd like to believe that a lot of those sales are going to people who have non-gaming needs for, say, the extra VRAM, or where the performance makes a real difference to their workflow. But I admit I have no data one way or the other.

I also agree with the earlier comment that in some ways it feels like the 4090 was the only well-priced 40-series card. For me that meant I didn't buy any of them.
The Steam hardware surveys say otherwise. They're not as popular, but a lot of the people buying those cards buy them to play games.
PC gaming is dead. Just buy a PS5 Pro. Games are, at least, optimized for consoles.

Buying a 4090 to play at 1080p 30FPS with RT is ludicrous...
So instead you buy a PS5 Pro for $1,100 to play the same games at a dithered 1080p 20 FPS, lose out on decades of past software, and rely entirely on Soyny not turning the store off?

LOL yeah sure bud.
You're right. Tech forums always have this disconnect, where many people don't realize how minuscule the market for high-end components is. The vast overwhelming majority of the market is on the mid-range, and the 60-tier cards outsell the high-end cards by an order of magnitude at least, even when they're awful like they currently are.

Cards like the 4090 and 5090 make zero impact on 99% of the consumer market. What makes or breaks this generation is how good the 5060 (and to a lesser extent the 5070) will be.
This was AMD's attitude back in 2016 with Polaris. The result? The 1070, 1070 Ti, 1080, and 1080 Ti each individually outsold AMD's ENTIRE Polaris lineup, and AMD reversed course 18 months later with Vega.

There are tens of millions of high-end gamers buying 70, 80, and 90 tier cards. It's not a minuscule market. That's why Nvidia keeps making them.
 
The Steam hardware surveys say otherwise. They're not as popular, but a lot of the people buying those cards buy them to play games.
The Steam survey doesn't tell us anything about the primary motivation for purchasing the card. Plenty of people use their PCs for both work and games (and hobbies, which is a third potential reason).
 
This was AMD's attitude back in 2016 with Polaris. The result? The 1070, 1070 Ti, 1080, and 1080 Ti each individually outsold AMD's ENTIRE Polaris lineup, and AMD reversed course 18 months later with Vega.
Pretending AMD lost market share because of Polaris is disingenuous. They have been losing market share steadily since Kepler launched in 2012, and whether they had a high-end card or not has never made a difference in that trend.

There are tens of millions of high-end gamers buying 70, 80, and 90 tier cards. It's not a minuscule market. That's why Nvidia keeps making them.
70 tier, sure.

80 and 90 tiers, no there aren't.

If we estimate Steam has about 170 million users (132 million at the end of 2021 was the last official number we got, scaled up by the same proportion the daily concurrent peak has grown since then: 29.2 million in December 2021 vs. 37.8 million today), and then look at the survey, the 4080, 4080 Super, and 4090 together have a market share of 2.16%. That means only about 3.7 million Steam users bought a 4080+ card.
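For anyone who wants to check it, here's that estimate spelled out, using only the figures above:

```python
# Back-of-the-envelope estimate using the figures quoted above.
users_end_2021 = 132_000_000   # last official Steam user count (end of 2021)
peak_dec_2021 = 29_200_000     # daily concurrent peak, December 2021
peak_today = 37_800_000        # daily concurrent peak, today

# Scale the old user count by the growth in concurrent peaks.
estimated_users = users_end_2021 * (peak_today / peak_dec_2021)

share_4080_plus = 0.0216       # 4080 + 4080 Super + 4090 combined survey share
owners_4080_plus = estimated_users * share_4080_plus

print(f"Estimated Steam users: {estimated_users / 1e6:.0f} million")    # ~171 million
print(f"Estimated 4080+ owners: {owners_4080_plus / 1e6:.1f} million")  # ~3.7 million
```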

It's not nothing, but it's a fraction of the 13.6 million that bought a 4060 tier card. Another 9.7 million are using an RTX 3060, which is still being sold. And those are mid-range cards that did not sell well to begin with (compared to the 1060, for example, which alone accounted for 16% of Steam at one point).

If you include past mid-range and entry-level cards, they completely dominate the top of the chart. While if you include past $1000+ cards, you still only get to around 3% total.
 
I'd like to believe that a lot of those sales are going to people who have non-gaming needs for, say, the extra VRAM, or where the performance makes a real difference to their workflow. But I admit I have no data one way or the other.

I also agree with the earlier comment that in some ways it feels like the 4090 was the only well-priced 40-series card. For me that meant I didn't buy any of them.
So, in my instance of gaming (ESO and EVE at 4K), I use significantly more VRAM at 4K than I do at other resolutions. If people remember back in the Voodoo2 days, resolution was heavily VRAM dependent. You couldn't run 800x600 with less than 12 MB of VRAM.

But I'm bringing this up because both EVE and ESO use about 9-10 GB of VRAM at 4K. So even though my 6700 XT is overkill for those games at 4K, I'd be getting heavily bottlenecked with only an 8 GB card. You don't need a powerful card to run older games at 4K, but you do need lots of VRAM.

This is a large reason why I feel that even mid-range cards should come with more than 8 gigs these days. 1440p has a few more years running comfortably on 8 GB of VRAM, but we really need a bump in VRAM. Not so much for monitors, but there are lots of people opting for PC gaming on TVs these days, and nearly everything you see in the TV space is 4K. I think we're probably 6-7 years away from 8K120, so I really want my next TV to last until 8K120 becomes viable. IIRC, there isn't even a display standard that can handle 8K120 yet.
 
The point has nothing to do with performance; it has to do with supply and demand.
You're right, anecdotal claims by TechSpot commenters are far more accurate :joy:
Pretending AMD lost market share because of Polaris is disingenuous. They have been losing market share steadily since Kepler launched in 2012, and whether they had a high-end card or not has never made a difference in that trend.
This is patently false. Polaris marked a significant drop in AMD's GPU market share, dropping by nearly half and dipping into the single digits for the first time. You can deny the existence of the high end all you want, it's there. They buy. Money speaks louder than your beliefs.
70 tier, sure.

80 and 90 tiers, no there aren't.

If we estimate Steam has about 170 million users (132 million at the end of 2021 was the last official number we got, scaled up by the same proportion the daily concurrent peak has grown since then: 29.2 million in December 2021 vs. 37.8 million today), and then look at the survey, the 4080, 4080 Super, and 4090 together have a market share of 2.16%. That means only about 3.7 million Steam users bought a 4080+ card.
So my statement was correct: tens of millions do buy 70, 80, and 90 tier cards. I never said each tier individually sold tens of millions.

Also, that's only one generation. Most people don't upgrade every generation. How many bought the 3000s? 2000s? Those are still in the survey.
It's not nothing, but it's a fraction of the 13.6 million that bought a 4060 tier card. Another 9.7 million are using an RTX 3060, which is still being sold. And those are mid-range cards that did not sell well to begin with (compared to the 1060, for example, which alone accounted for 16% of Steam at one point).

If you include past mid-range and entry-level cards, they completely dominate the top of the chart. While if you include past $1000+ cards, you still only get to around 3% total.
OK, so the mid-range market is big. Guess what? That doesn't mean nobody buys high-end stuff.

This idea of "niche markets can't sell much" is totally false.
 
You're right. Tech forums always have this disconnect, where many people don't realize how minuscule the market for high-end components is. The vast overwhelming majority of the market is on the mid-range, and the 60-tier cards outsell the high-end cards by an order of magnitude at least, even when they're awful like they currently are.

Cards like the 4090 and 5090 make zero impact on 99% of the consumer market. What makes or breaks this generation is how good the 5060 (and to a lesser extent the 5070) will be.

I do agree with you, but this is also a standard megacorp marketing gimmick. Apple does it a lot, say with iPads.
It makes a still-overpriced lower option seem like great value.

I've stated I don't even want a honking 5090 monstrosity, nor a 6-litre engine, or a 120" TV. Those people always want more, and enough is never enough.

I really hope AMD can hit them hard at the 5060 level.
Plus, given the reduced need for PCs, high prices are just a boost for consoles for the average Joe/Jane: they can't afford $2,000 on a GPU, but they can handle $70 on a new game. That's the nature of people with okay income and no savings.

What is annoying with Nvidia is not how they price the 5090 etc., it's how they purposely trim down, say, a 5060 to what they can GET AWAY with. I.e., for $10 to $20 extra in parts they could give you a much better card.
 
This is patently false. Polaris marked a significant drop in AMD's GPU market share, dropping by nearly half and dipping into the single digits for the first time. You can deny the existence of the high end all you want, it's there. They buy. Money speaks louder than your beliefs.
Between July 2016 (right before Pascal and Polaris launched) and July 2018 (two years later, a couple of months before Turing launched), AMD's user share fell from 25% to 16%. Neither by half, nor to single digits.

It's also a mistake to attribute this to the existence or non-existence of the high end. High-end Pascal cards were only a small fraction of the sales of the 1000 series cards, and Vega was even less impactful. Pascal sold so much due to how popular the 1050 Ti, 1060, and 1070 were, not simply because the GTX 1080 Ti (which the vast majority of the market had zero intention to buy) existed.

Similarly, AMD also didn't have high-end cards during the HD 4000 and HD 5000 days, and that's when they had their largest market share. They did have high-end cards during the HD 6000 and HD 7000 days, but by then their market share was declining. They did go back to having a high-end in the RX 6000 days, and it did not help.

There is no observable correlation between AMD having high-end cards and their market share, and saying Polaris had low sales only because there were no high-end cards is assuming way too much.

So my statement was correct: tens of millions do buy 70, 80, and 90 tier cards. I never said each tier individually sold tens of millions.
It's a pretty arbitrary decision to lump the $600 cards with the $1,600 cards as if they were all part of the same segment. I could just as easily do the same and say 50, 60, and 70 tier cards added together completely dwarf the 80/90 tiers.

The fact is, the further up you go in the product stack, the smaller the user share becomes.

Also, that's only one generation. Most people don't upgrade every generation. How many bought the 3000s? 2000s? Those are still in the survey.
I mentioned this in my comment. "If you include past mid-range and entry-level cards, they completely dominate the top of the chart. While if you include past $1000+ cards, you still only get to around 3% total."

You can check it out yourself on the survey. Add together all the 60 tier cards from current and past generations. Then do the same for 70 tier cards. Then 80 tier cards. And then 90/80 Ti flagship cards. It makes a steep downward line.
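If you'd rather tally it programmatically, a sketch like the one below works. The card names and share values are placeholders, not current survey figures; fill them in from the survey's GPU table.

```python
# Sketch for summing Steam Hardware Survey shares by GeForce tier.
# The shares below are PLACEHOLDER values, not real survey numbers --
# copy the current percentages from the survey's GPU table.
import re
from collections import defaultdict

survey_shares = {
    "NVIDIA GeForce RTX 4060": 0.0,  # placeholder
    "NVIDIA GeForce RTX 3060": 0.0,  # placeholder
    "NVIDIA GeForce RTX 4070": 0.0,  # placeholder
    "NVIDIA GeForce RTX 3080": 0.0,  # placeholder
    "NVIDIA GeForce RTX 4090": 0.0,  # placeholder
}

def tier(card_name: str) -> str:
    """Pull the tier (60/70/80/90) out of a model number like 4060 or 3080."""
    match = re.search(r"\b\d0(\d0)\b", card_name)
    return match.group(1) if match else "other"

totals = defaultdict(float)
for card, share in survey_shares.items():
    totals[tier(card)] += share

for t in ("60", "70", "80", "90"):
    print(f"{t}-tier combined share: {totals[t]:.2%}")
```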
 
OK, so the mid-range market is big. Guess what? That doesn't mean nobody buys high-end stuff.

This idea of "niche markets can't sell much" is totally false.
So I think what people are mad about is that the bottom 90% of gamers feel ignored and taken advantage of. Then there are the arrogant people who tell the bottom 90% to essentially "stop being poor".

While I'm far from poor, buying cards with this level of price-performance doesn't make financial sense. I can spend a thousand dollars on a graphics card; it just doesn't make sense for me to spend $1,000 on THAT graphics card.

There are plenty of people who can't afford GPUs, but there are also plenty of people like me who look at a product and say, "that makes no sense."

Dollar per FPS used to be a popular metric but faded in the late 2000s; I think it needs to start being used again. It used to follow a bell curve: the 60 and 70 series offered the best dollar per FPS. Now the 90 series offers the best dollar per FPS, and frames get more expensive the further down the stack you go.
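To make the metric concrete, here's a tiny sketch of how dollar-per-FPS would be tallied across a product stack. The tier names, prices, and FPS figures are made-up placeholders, not benchmark results; plug in real launch prices and averaged benchmark FPS to see the actual curve.

```python
# Dollar-per-FPS sketch. The cards, prices, and FPS figures below are
# hypothetical placeholders for illustration only -- swap in real MSRPs
# and averaged benchmark results to build the actual chart.
stack = {
    # card: (price_usd, avg_fps_at_chosen_settings)
    "xx60": (300, 60),
    "xx70": (550, 90),
    "xx80": (1000, 120),
    "xx90": (1600, 150),
}

for card, (price, fps) in stack.items():
    cost_per_frame = price / fps
    print(f"{card}: ${cost_per_frame:.2f} per FPS")
```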
 
Ah yes, 99% of forum commenters hyping 90 class will continue to use their 1650 or 2060/70 while saying AMD are trash.
 
I agree regarding it not going over $2,000, at least for the MSRP (some models definitely will be more than $2,000, but it's common for many models to be around $200-600 higher than MSRP). Tom from MLID never stated that an MSRP closer to $2,500 was likely, not even in his opinion, and he definitely didn't claim that any source was telling him the MSRP was likely to be close to $2,500, only that Nvidia was considering the possibility of pricing it close to that figure. He also stated that he thinks Nvidia might be deliberately leaking that info just to see how people react to that kind of price.

Tom of MLID is a very reliable source for leaks and has been misrepresented by the author of this article (Rob Thubron). Many of his leak reports turn out to be extremely accurate. The context and caveats Tom provides with leaked information are very carefully worded, and he very frequently reminds people that even if leaked information accurately represents what the source believes to be true, plans can change at the last minute, especially for pricing. He often states when his source is only making an educated guess. If that guess turns out to be wrong, that doesn't make MLID an unreliable source of information, because he was only reporting on what one person at a company was guessing anyway, and he DIDN'T make the mistake of misrepresenting that guess as what the company is definitely going to do.
Tom is a charlatan. He uses the same caveats and vagaries as those reading fortunes. The zodiac signs don't impact your life, and neither do MLID's predictions.

But props to him for making money off his youtube side hustle.
 
Nvidia is pushing things so far that I'm beginning to feel sorry for those who need or want a new GPU.
 
This "the best or nothing" mindset is exhausting. Too many people don't remember the dark times when people transitioned from CRTs to LCDs.
Man, were those days ever a mixed blessing for me. My eyes are very sensitive to CRTs under 90 Hz; they make the veins in my eyes pop. So when LCDs became an option I immediately got one. It cost a small fortune, had a crap resolution, and the contrast was awful. But hey, I could use a computer without triggering a massive headache or my eyes bleeding.

If you're already on a somewhat decent monitor, investing in a screen is a good idea IMO.
 
Ah yes, 99% of forum commenters hyping 90 class will continue to use their 1650 or 2060/70 while saying AMD are trash.
I think the 90 class deserves the hype; what I think many people are arguing is that the 90 class's hype doesn't justify the 60 and 70 class cards' prices.
Man, were those days ever a mixed blessing for me. My eyes are very sensitive to CRTs under 90 Hz; they make the veins in my eyes pop. So when LCDs became an option I immediately got one. It cost a small fortune, had a crap resolution, and the contrast was awful. But hey, I could use a computer without triggering a massive headache or my eyes bleeding.

If you're already on a somewhat decent monitor, investing in a screen is a good idea IMO.
My first LCD was a 19" 1440x900 at, I think, 72 Hz or something weird over dual-link DVI. It was just a basic TFT. It had tons of backlight bleed and the ghosting was horrible, but I loved it coming from my 15"(?) 1024x768 CRT. Then I got a 24" 1920x1200 in 2008. It was a VA panel, but it was a MASSIVE upgrade from the 19" that I got in, I think, 2006.

Kids these days don't know how good they have it.
 
Well, as the RTX 4090 is already past that $2,500 price here, I can certainly see the RTX 5090 going well above the $3,000 mark. Even though I can afford one, I don't see the need for it: I don't use DLSS, DLAA, or RTX, and I don't play at 4K.
 
I am buying a 5090 on release; I could not care less if it's $1,800, $2,000, or $2,500 really. It will last me 5 years easily.
 
So I think what people are mad about is that the bottom 90% of gamers feel ignored and taken advantage of. Then there are the arrogant people who tell the bottom 90% to essentially "stop being poor".

While I'm far from poor, buying cards with this level of price-performance doesn't make financial sense. I can spend a thousand dollars on a graphics card; it just doesn't make sense for me to spend $1,000 on THAT graphics card.

There are plenty of people who can't afford GPUs, but there are also plenty of people like me who look at a product and say, "that makes no sense."

Dollar per FPS used to be a popular metric but faded in the late 2000s; I think it needs to start being used again. It used to follow a bell curve: the 60 and 70 series offered the best dollar per FPS. Now the 90 series offers the best dollar per FPS, and frames get more expensive the further down the stack you go.

Dollar per FPS was NEVER a metric. Ever. Not ever. Inflation and supply lines make a metric like that a complete and utter impossibility, especially these days when 10%+ inflation per year is the norm. Then you include the fact that software is constantly being improved and upgraded? A new game coming out right now will make that FPS plummet, while the price remains the same.

Your comparison to older cards in previous years is completely and utterly useless information. Things cost what they cost and nothing you do will change it. So, either accept the way things are or move on.

As to your comments on OLED...they're still about 20 years from being viable, if not more. Burn-in is STILL absolutely an issue, even on high-end TVs and monitors. Had a buddy get a top-of-the-line OLED monitor for Christmas 2023; burn-in in less than six months.

Also, while OLED monitors provide a superior picture, the gap is not THAT big. It's maybe a 10% improvement at best...and they cost at minimum twice as much. No, a new GPU will absolutely be a larger gaming improvement than an OLED.
 
Dollar per FPS was NEVER a metric. Ever. Not ever. Inflation and supply lines make a metric like that a complete and utter impossibility, especially these days when 10%+ inflation per year is the norm. Then you include the fact that software is constantly being improved and upgraded? A new game coming out right now will make that FPS plummet, while the price remains the same.

Your comparison to older cards in previous years is completely and utterly useless information. Things cost what they cost and nothing you do will change it. So, either accept the way things are or move on.

As to your comments on OLED...they're still about 20 years from being viable, if not more. Burn-in is STILL absolutely an issue, even on high-end TVs and monitors. Had a buddy get a top-of-the-line OLED monitor for Christmas 2023; burn-in in less than six months.

Also, while OLED monitors provide a superior picture, the gap is not THAT big. It's maybe a 10% improvement at best...and they cost at minimum twice as much. No, a new GPU will absolutely be a larger gaming improvement than an OLED.
Dollar per FPS was used and heavily criticized in the 2000s. However, I think it is important to bring back NOW because it demonstrates just how hard Nvidia is ****ing consumers and how bad a value their cards are the further you go down the stack.

And I don't care about inflation; I get an automatic 2% raise every year, but when prices double over 2-3 years my raise doesn't matter. Dollar per FPS would show everyone just how stupid the GPU economy is right now.

And OLED tech blows LCD tech out of the water. Now that they are getting priced close to a disposable commodity, dealing with burn-in isn't really an issue. The 3-year-old OLED in my laptop and the 5-year-old one in my phone are fine, so I don't think it's really a problem.
 
Here we all are again, still talking about Nvidia and MSRP in the same sentence.
They couldn't be further apart, being at opposite ends of the same book.
 
Dollar per FPS was NEVER a metric. Ever. Not ever. Inflation and supply lines make a metric like that a complete and utter impossibility, especially these days when 10%+ inflation per year is the norm. Then you include the fact that software is constantly being improved and upgraded? A new game coming out right now will make that FPS plummet, while the price remains the same.

Your comparison to older cards in previous years is completely and utterly useless information. Things cost what they cost and nothing you do will change it. So, either accept the way things are or move on.

As to your comments on OLED...they're still about 20 years from being viable, if not more. Burn-in is STILL absolutely an issue, even on high-end TVs and monitors. Had a buddy get a top-of-the-line OLED monitor for Christmas 2023; burn-in in less than six months.

Also, while OLED monitors provide a superior picture, the gap is not THAT big. It's maybe a 10% improvement at best...and they cost at minimum twice as much. No, a new GPU will absolutely be a larger gaming improvement than an OLED.

High-end OLED monitors are night and day better than LCD monitors. You have not seen one for sure. The step up is next level.

How do I know? Because I made that jump. Immersion went up 10 times due to perfect blacks, insane contrast, and motion clarity that comes close to a 150 Hz CRT back in the day.

LCD is a smeary mess compared to OLED; only TN panels with 500+ Hz and BFI even come close. Also, LCD has like 1000:1-2000:1 contrast and poor black levels, bad viewing angles, bad uniformity, haloing and backlight bleed, corner glow (IPS and VA), terrible backlighting issues, and I could keep going. LCD is a mess and always has been. A huge compromise.

Burn-in on OLED is not a problem at all these days, unless you try to cause it on purpose. That is why all companies go OLED in high-end products now. I have had OLED TVs since 2016 and never seen burn-in. I even still have the first one in my kids' room, and it works flawlessly even with hours upon hours of the same cartoons and Disney Channel at full brightness.

OLED TVs won best TV for the last 5+ years pretty much. LCD is not even close.

You are simply spreading misinformation about OLED.
LCD is dying as we speak. It should only be used in low- to mid-tier stuff.
Name one high-end phone that has used LCD lately? None. Because OLED is expected at the high end.

Looking at the high-end TV market, OLED also dominates. LCD is mostly used in cheap TVs.

And this is why Samsung Display and LG Display both left LCD and focus 100% on OLED and micro LED now. They buy cheap LCD panels from third parties, and have done so since 2022 or so.
 
PC gaming has been getting stupider by the day. We haven't gotten more than 1-2 blockbuster games a year anyway these last 5 years, so it beats me who they are selling these to. No thank you, I'll just get the PS5 Pro and get 90% of the graphics, since the engines are all made for consoles anyway.
 
PC gaming has been getting stupider by the day. We haven't gotten more than 1-2 blockbuster games a year anyway these last 5 years, so it beats me who they are selling these to. No thank you, I'll just get the PS5 Pro and get 90% of the graphics, since the engines are all made for consoles anyway.
Engines are not made for consoles at all; actually, AMD hardware underperforms in many engines right now, like UE5 and Snowdrop. Many games have wonky performance on console due to weak hardware and rushed development.

Also, the PS5 Pro does not deliver 90% of the graphics at all. You might get 50% of the graphics and performance if you're lucky, vs. a high-end PC.

The GPU inside the PS5 Pro is at roughly 7700 XT level, then upscaled.

The PS5 Pro uses about 200 watts for the entire system, so obviously you are nowhere near a high-end PC. Not even a mid-range PC, really.

Some console generations ago, they would actually optimize games well; not today. Most are just rushed out the gate and fixed over xx patches, just like on PC. Zero difference. A console today is just a weaker PC: x86, same tech.

Also, a console is a money sink without mod support or much control over how games look and feel. Paid multiplayer, paid cloud saves; always remember you will get milked. This is how they make money off it. Expensive games, paid everything.

PS5 Pro = $1,000 with a disc drive. The CPU is no better than the base PS5, meaning you will get the same FPS in CPU-bound scenarios, where upscaling won't help.
 
My current recommendation for people looking for an upgrade is to go OLED if they haven't already. I think prices have gotten to the point where money is better spent on a display than a graphics card. You can get a nice OLED monitor for the cost of a mid-range card. Burn-in isn't the issue it once was, either.
Idk why people keep saying it's not an issue. Is it for people that game 2-5 hours? What about others that use the monitor for 10? 20? There are many official tests that show every brand-new MSI monitor ALREADY having burn-in issues after a few months of normal PC usage and work. You know, going to get water, eating food for like 15 minutes and letting the screen work, maybe falling asleep once a month. Nothing insane for a human being. Watching your second monitor while your OLED just sits there AFK, burning... even if it's not for long, it's enough to cause an issue. Reviews already showed this around 40 days ago. Monitors from July already have this issue. It will get worse next year too. You can't heal or fix this. Then what? Having to buy another $800-1,200 monitor? I know I can't live with tons of visual artifacts...

For the record, I'm writing this on my IPS screen now, and it took me like 10 minutes. For those 10 minutes, I didn't touch my TV or my 2 monitors. If they were OLED, they would surely have issues within 1 or 2 years. I can't just consume movies and games. Typing and reading news take time. Responding takes time. Working on one screen while the other one sits there... takes time. Even worse, if I had 3 OLED screens... my god. I'll game on one screen while the other 2 sit there and do nothing. Turn them off, you say? No way, lol. I use them for YouTube, not full screen because it's too big. The other display I use for some stats and monitoring (or another browser for news).

All in all, OLED is not perfect at all. Not my words either; check Hardware Unboxed and many other YouTubers. They show pure facts, so you can't say they lie. I have an OLED phone, fridge, AC, and a mini display in my kitchen. ALL have burn-in issues. It's just not good. I have zero issues on my 10-20 year old displays. I still use them, in fact.
 
Idk why people keep saying it's not an issue. Is it for people that game 2-5 hours? What about others that use the monitor for 10? 20? There are many official tests that show every brand-new MSI monitor ALREADY having burn-in issues after a few months of normal PC usage and work. You know, going to get water, eating food for like 15 minutes and letting the screen work, maybe falling asleep once a month. Nothing insane for a human being. Watching your second monitor while your OLED just sits there AFK, burning... even if it's not for long, it's enough to cause an issue. Reviews already showed this around 40 days ago. Monitors from July already have this issue. It will get worse next year too. You can't heal or fix this. Then what? Having to buy another $800-1,200 monitor? I know I can't live with tons of visual artifacts...

For the record, I'm writing this on my IPS screen now, and it took me like 10 minutes. For those 10 minutes, I didn't touch my TV or my 2 monitors. If they were OLED, they would surely have issues within 1 or 2 years. I can't just consume movies and games. Typing and reading news take time. Responding takes time. Working on one screen while the other one sits there... takes time. Even worse, if I had 3 OLED screens... my god. I'll game on one screen while the other 2 sit there and do nothing. Turn them off, you say? No way, lol. I use them for YouTube, not full screen because it's too big. The other display I use for some stats and monitoring (or another browser for news).

All in all, OLED is not perfect at all. Not my words either; check Hardware Unboxed and many other YouTubers. They show pure facts, so you can't say they lie. I have an OLED phone, fridge, AC, and a mini display in my kitchen. ALL have burn-in issues. It's just not good. I have zero issues on my 10-20 year old displays. I still use them, in fact.
Let me be more specific: OLED prices have gotten to the point where replacing one after 2 years is an acceptable option, and I make that disclosure when I make the recommendation.

That said, what you're saying about OLED burn-in is outright false; OLEDs don't start to burn in after 10 minutes. Those tests that you see with burn-in? They're designed as a worst-case scenario, and they run them like that 24/7 for months at a time. So unless you watch CNN all day every day, which you sound like the type of person who does, you aren't going to experience burn-in. I have had my S21+ for over 3 years at this point and have experienced zero OLED burn-in. OLED burn-in hasn't been a real-world issue for a few years now.
 