> And why do you care? Nvidia can use cut-down dies and still beat the competition easily. Not all their cores are cut down anyway.

The point has nothing to do with performance; it has to do with supply and demand.
> I'd like to believe that a lot of those sales are going to people who have non-gaming needs, say for the extra VRAM, or where the performance makes a real difference to their workflow. But I admit I have no data one way or the other.
> I also agree with the earlier comment that in some ways it feels like the 4090 was the only well-priced 40-series card. For me that meant I didn't buy any of them.

The Steam hardware surveys say otherwise. They're not as popular, but a lot of the people buying those cards buy them to play games.
> PC gaming is dead. Just buy a PS5 Pro. Games are, at least, optimized for consoles.

So instead you buy a PS5 Pro for $1,100 to play the same games at a dithered 1080p 20 FPS, lose out on decades of past software, and rely entirely on Sony not turning the store off?
Buying a 4090 to play at 1080p 30FPS with RT is ludicrous...
> You're right. Tech forums always have this disconnect, where many people don't realize how minuscule the market for high-end components is. The overwhelming majority of the market is in the mid-range, and the 60-tier cards outsell the high-end cards by an order of magnitude at least, even when they're awful like they currently are.
> Cards like the 4090 and 5090 make zero impact on 99% of the consumer market. What makes or breaks this generation is how good the 5060 (and to a lesser extent the 5070) will be.

This was AMD's attitude back in 2016 with Polaris. The result? The 1070, 1070 Ti, 1080, and 1080 Ti each individually outsold AMD's ENTIRE Polaris lineup, and AMD reversed course 18 months later with Vega.
> The Steam hardware surveys say otherwise. They're not as popular, but a lot of the people buying those cards buy them to play games.

The Steam survey doesn't tell us anything about the primary motivation for purchasing the card. Plenty of people use their PCs for both work and games (and hobbies, which is a third potential reason).
> This was AMD's attitude back in 2016 with Polaris. The result? The 1070, 1070 Ti, 1080, and 1080 Ti each individually outsold AMD's ENTIRE Polaris lineup, and AMD reversed course 18 months later with Vega.

Pretending AMD lost market share because of Polaris is disingenuous. They have been losing market share steadily since Kepler launched in 2012, and whether they had a high-end card or not has never made a difference in that trend.
> There are tens of millions of high-end gamers buying 70, 80, and 90 tier cards. It's not a minuscule market. That's why Nvidia keeps making them.

70 tier, sure.
> I'd like to believe that a lot of those sales are going to people who have non-gaming needs, say for the extra VRAM, or where the performance makes a real difference to their workflow. But I admit I have no data one way or the other.
> I also agree with the earlier comment that in some ways it feels like the 4090 was the only well-priced 40-series card. For me that meant I didn't buy any of them.

So, in my case (ESO and EVE at 4K), I use significantly more VRAM at 4K than I do at other resolutions. If people remember back in the Voodoo2 days, resolution was heavily VRAM-dependent; you couldn't run 800x600 with less than 12 MB of VRAM.
> The point has nothing to do with performance; it has to do with supply and demand.

You're right, anecdotal claims in TechSpot comments are far more accurate.
> Pretending AMD lost market share because of Polaris is disingenuous. They have been losing market share steadily since Kepler launched in 2012, and whether they had a high-end card or not has never made a difference in that trend.

This is patently false. Polaris marked a significant drop in AMD's GPU market share, which fell by nearly half and dipped into the single digits for the first time. You can deny the existence of the high end all you want; it's there. They buy. Money speaks louder than your beliefs.
> 70 tier, sure.
> 80 and 90 tiers, no there aren't.
> If we estimate Steam has about 170 million users (132 million at the end of 2021 was the last official number we got, then increase it by the same proportion that the daily concurrent peak has grown since: 29.2 million in December 2021, 37.8 million today) and then look at the survey, the 4080, 4080 Super, and 4090 together have a market share of 2.16%. That means only about 3.7 million Steam users bought a 4080+ card.

So my statement was correct: tens of millions do buy 70, 80, and 90 tier cards. I never said each tier individually sold tens of millions.

> It's not nothing, but it's a fraction of the 13.6 million that bought a 4060-tier card. Another 9.7 million are using an RTX 3060, which is still being sold. And those are mid-range cards that did not sell well to begin with (compared to the 1060, for example, which alone accounted for 16% of Steam at one point).
> If you include past mid-range and entry-level cards, they completely dominate the top of the chart, while if you include past $1000+ cards, you still only get to around 3% total.

OK, so the mid-range market is big. Guess what? That doesn't mean nobody buys high-end stuff.
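For anyone who wants to sanity-check the estimate quoted above, here is a minimal Python sketch of the arithmetic. The user counts and the 2.16% survey share are simply the figures cited in the comment, not independently verified numbers:

```python
# Rough back-of-the-envelope check of the Steam estimate quoted above.
# All inputs are the figures from the comment, not official data.

users_end_2021 = 132_000_000   # last official Steam user count cited
peak_dec_2021 = 29_200_000     # daily concurrent peak, December 2021
peak_today = 37_800_000        # daily concurrent peak cited as "today"

# Scale the user base by the growth in concurrent peaks.
est_users_today = users_end_2021 * peak_today / peak_dec_2021

# Combined survey share of the 4080, 4080 Super and 4090, per the comment.
share_4080_plus = 0.0216
est_4080_plus_owners = est_users_today * share_4080_plus

print(f"Estimated Steam users today: {est_users_today / 1e6:.0f} million")
print(f"Estimated 4080+ owners: {est_4080_plus_owners / 1e6:.1f} million")
# -> roughly 171 million users and about 3.7 million 4080+ owners,
#    versus the ~13.6 million 4060-tier owners cited in the same comment.
```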
You're right. Tech forums always have this disconnect, where many people don't realize how minuscule the market for high-end components is. The overwhelming majority of the market is in the mid-range, and the 60-tier cards outsell the high-end cards by an order of magnitude at least, even when they're awful like they currently are.
Cards like the 4090 and 5090 make zero impact on 99% of the consumer market. What makes or breaks this generation is how good the 5060 (and to a lesser extent the 5070) will be.
> This is patently false. Polaris marked a significant drop in AMD's GPU market share, which fell by nearly half and dipped into the single digits for the first time. You can deny the existence of the high end all you want; it's there. They buy. Money speaks louder than your beliefs.

Between July 2016 (right before Pascal and Polaris launched) and July 2018 (two years later, a couple of months before Turing launched), AMD's user share fell from 25% to 16%. Neither by half nor to single digits.
> So my statement was correct: tens of millions do buy 70, 80, and 90 tier cards. I never said each tier individually sold tens of millions.

It's a pretty arbitrary decision to lump the $600 cards in with the $1,600 cards as if they were all part of the same segment. I could just as easily do the same and say 50, 60, and 70 tier cards added together completely dwarf the 80/90 tiers.
> Also, that's only one generation. Most people don't upgrade every generation. How many bought the 3000s? 2000s? Those are still in the survey.

I mentioned this in my comment: "If you include past mid-range and entry-level cards, they completely dominate the top of the chart, while if you include past $1000+ cards, you still only get to around 3% total."
> OK, so the mid-range market is big. Guess what? That doesn't mean nobody buys high-end stuff.
> This idea of "niche markets can't sell much" is totally false.

So I think what people are mad about is that the bottom 90% of gamers feel ignored and taken advantage of. Then there are the arrogant people who tell the bottom 90% to essentially "stop being poor".
> I agree regarding it not going over 2000, at least for the MSRP (some models will definitely be more than 2000, but it's common for many models to be around 200-600 higher than MSRP). Tom from MLID never stated that an MSRP closer to 2500 was likely, not even in his opinion, and he definitely didn't claim that any source was telling him the MSRP was likely to be close to 2500, only that Nvidia was considering the possibility of pricing it close to that figure, and he stated that he thinks Nvidia might be deliberately leaking that info just to see how people react to that kind of price.

Tom is a charlatan. He uses the same caveats and vagaries as those reading fortunes. The zodiac signs don't impact your life, and neither do MLID's predictions.
Tom of MLID is a very reliable source for leaks and has been misrepresented by the author of this article (Rob Thubron). Many of his leak reports turn out to be extremely accurate. The context and caveats Tom provides with leaked information are very carefully worded, and he frequently reminds people that even if leaked information accurately represents what the source believes to be true, plans can change at the last minute, especially for pricing. He often states when his source is only making an educated guess. If that guess turns out to be wrong, that doesn't make MLID an unreliable source of information, because he was only reporting what one person at a company was guessing anyway, and he DIDN'T make the mistake of misrepresenting that guess as what the company is definitely going to do.
> This "the best or nothing" mindset is exhausting. Too many people don't remember the dark times when people transitioned from CRTs to LCDs.

Man, were those days ever a mixed blessing for me. My eyes are very sensitive to CRTs under 90 Hz; they make the veins in my eyes pop. So as soon as LCDs became an option I got one. It cost a small fortune, had a crap resolution, and the contrast was awful, but hey, I could use a computer without triggering a massive headache or my eyes bleeding.
> Ah yes, 99% of forum commenters hyping the 90 class will continue to use their 1650 or 2060/70 while saying AMD are trash.

I think the 90 class deserves the hype. What many people are arguing is that the 90 class's hype doesn't justify the 60 and 70 class cards' prices.
> Man, were those days ever a mixed blessing for me. My eyes are very sensitive to CRTs under 90 Hz; they make the veins in my eyes pop. So as soon as LCDs became an option I got one. It cost a small fortune, had a crap resolution, and the contrast was awful, but hey, I could use a computer without triggering a massive headache or my eyes bleeding.

My first LCD was a 19" 1440x900 at, I think, 72 Hz or something weird over dual-link DVI. It was just a basic TFT. It had tons of backlight bleed and the ghosting was horrible, but I loved it coming from my 15"(?) 1024x768 CRT. Then I got a 24" 1920x1200 in 2008. It was a VA panel, but it was a MASSIVE upgrade from the 19" I'd bought in, I think, 2006.
If you're already on a somewhat decent monitor, investing in a screen is a good idea IMO.
Ah yes, 99% of forum commenters hyping the 90 class will continue to use their 1650 or 2060/70 while saying AMD are trash.
So I think what people are mad about is that the bottom 90% of gamers feel ignored and taken advantage of. Then there are the arrogant people who tell the bottom 90% to essentially "stop being poor".
While I'm far from poor, buying cards with this level of price-performance doesn't make financial sense. I can spend a thousand dollars on a graphics card; it just doesn't make sense for me to spend $1,000 on THAT graphics card.

There are plenty of people who can't afford GPUs, but there are also plenty of people like me who look at a product and say, "that makes no sense."

Dollar per FPS used to be a popular metric but faded in the late 2000s, and I think it needs to come back. It used to follow a bell curve, with the 60 and 70 series offering the best dollar per FPS. Now the 90 series offers the best dollar per FPS, and frames get more expensive the further down the stack you go.
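To make the metric concrete, here is a minimal sketch of how a dollar-per-FPS table would be computed. The prices and average frame rates below are hypothetical placeholders, not benchmark results:

```python
# Illustration of the dollar-per-FPS metric with hypothetical numbers.
# Prices and average FPS are placeholders, not real benchmark data.

cards = {
    "60-tier": {"price_usd": 330,  "avg_fps": 70},
    "70-tier": {"price_usd": 600,  "avg_fps": 100},
    "80-tier": {"price_usd": 1000, "avg_fps": 130},
    "90-tier": {"price_usd": 1600, "avg_fps": 160},
}

for name, card in cards.items():
    dollars_per_fps = card["price_usd"] / card["avg_fps"]
    print(f"{name}: ${dollars_per_fps:.2f} per average FPS")

# Plug in real launch prices and measured benchmark averages to see
# which way the value curve actually bends for a given generation.
```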
> Dollar per FPS was NEVER a metric. Ever. Not ever. Inflation and supply lines make a metric like that a complete and utter impossibility, especially these days when 10%+ inflation per year is the norm. Then you include the fact that software is constantly being improved and upgraded? A new game coming out right now will make that FPS plummet while the price remains the same.
> Your comparison to older cards in previous years is completely and utterly useless information. Things cost what they cost and nothing you do will change it. So either accept the way things are or move on.
> As for your comments on OLED... they're still about 20 years from being viable, if not more. Burn-in is STILL absolutely an issue, even on high-end TVs and monitors. A buddy of mine got a top-of-the-line OLED monitor for Christmas 2023 and had burn-in in less than six months.
> Also, the OLED monitors, while providing a superior picture, are not THAT much of a gap. It's maybe a 10% improvement at best, and it costs at minimum twice as much. No, a new GPU will absolutely be a larger gaming improvement than an OLED.

Dollar per FPS was used, and heavily criticized, in the 2000s. However, I think it is important to bring it back NOW, because it demonstrates just how hard Nvidia is ****ing consumers and how bad a value their cards are the further you go down the stack.
> PC gaming has been getting stupider by the day. We haven't gotten more than 1-2 blockbuster games a year these last 5 years anyway, so it beats me who they are selling these to. No thank you, I'll just get the PS5 Pro and get 90% of the graphics, since the engines are all made for consoles anyway.

Engines are not made for consoles at all; in fact, AMD hardware underperforms in many engines right now, like UE5 and Snowdrop. Many games have wonky performance on console due to weak hardware and rushed games.
> My current recommendation for people looking for an upgrade is to go OLED if they haven't already. I think prices have gotten to the point where money is better spent on a display than a graphics card. You can get a nice OLED monitor for the cost of a mid-range card. Burn-in isn't the issue it once was, either.

I don't know why people keep saying it's not an issue. Is that only true for people who game 2-5 hours? What about those who use the monitor for 10? 20? There are many official tests showing that every brand-new MSI monitor ALREADY has burn-in issues after a few months of normal PC use and work. You know: going to get water, eating for 15 minutes while the screen keeps working, maybe falling asleep once a month. Nothing insane for a human being. Watching your second monitor while your OLED just sits there idle, burning... even if it's not for long, it's enough to cause an issue. Reviews showed this about 40 days ago; monitors from July already have the problem, and it will get worse next year too. You can't heal or fix this. Then what? Buy another $800-1,200 monitor? I know I can't live with tons of visual artifacts...
> I don't know why people keep saying it's not an issue.

Let me be more specific: OLED prices have gotten to the point where replacing them after 2 years is an acceptable option, and I make that disclosure when I make the recommendation.
For the record, I'm writing this on my IPS screen now, and it took me about 10 minutes. For those 10 minutes, I didn't touch my TV or my two monitors. If they were OLED, they would surely have issues within 1 or 2 years. I can't just consume movies and games; typing and reading the news take time, responding takes time, and working on one screen while the other sits there takes time. Even worse if I had three OLED screens... my god. I'd game on one screen while the other two sit there and do nothing. Turn them off, you say? No way. I use one for YouTube (not full screen, because it's too big) and the other for stats and monitoring (or another browser for news).

All in all, OLED is not perfect at all. Those aren't just my words either; check Hardware Unboxed and many other YouTubers. They show pure facts, so you can't say they lie. I have an OLED phone, fridge, AC, and a mini display in my kitchen. ALL of them have burn-in issues. It's just not good. I have zero issues on my 10-20 year old displays; in fact, I still use them.