Nvidia's GPU Classes Through the Years: What to Expect from the RTX 5080

All the 5080 needs to do is deliver 4090 performance, or close to it, for less.

AMD is not competing in this segment.

Expect 1200 dollars on release.
Eventually the price might drop to 1000 dollars.

AMD's best SKU in the Radeon 8000 series will probably barely compete with the RTX 5060.

Rumours claim the top RDNA4 SKU will deliver 7900 GRE/4070-class performance with improved RT (compared to RDNA3), so let's hope the price comes in under 500 dollars.
 
There is a tiny chance of the 5080 being 999 dollars, considering Nvidia uses TSMC 5nm again. However, I don't see why Nvidia should settle, as it will sell easily at 1199 dollars.

4090 performance for 500 dollars less? Many people will take that deal.
 
There is a tiny chance of the 5080 being 999 dollars, considering Nvidia uses TSMC 5nm again. However, I don't see why Nvidia should settle, as it will sell easily at 1199 dollars.

4090 performance for 500 dollars less? Many people will take that deal.
Hmm, perhaps, but it would be foolish to pay 1200 USD for a 16 GB GPU at this point. 16 GB isn't enough for 4K RT gaming in quite a few games.

I'll wait for the 24 GB version (if there is one).
 
The only reason the 4090 was able to push so far ahead was the jump from Samsung's node to TSMC's.

Expect another good old 40% uplift this time...

reality.jpg
 
IMO, NVIDIA is basically forced into their terrible pricing scheme. At this point, if they were to actually engage in a competitive price war with AMD and Intel, NVIDIA would get to 97+ percent market share within a generation or two, if not completely forcing AMD and Intel out of the market. This is strongly evidenced by the fact that they have successfully demonstrated with the 40 series that they can legitimately compete in terms of sales and product placement despite shifting their product stack down a die size.

If they won the GPU market though, they would almost certainly face the wrath of antitrust lawsuits, regulators and the like, and would probably lose tens or even hundreds of billions fighting those anyway.

So NVIDIA instead wisely (unfortunately) chooses to continue increasing their margins. Their overpriced lineup is their way of inviting the illusion of competition to keep regulators off their backs.

This is an unintended consequence of antitrust laws, by the way, but it's clearly easier for most to stick their heads in the sand and continue demonizing large corporations…

 
So NVIDIA instead wisely (unfortunately) chooses to continue increasing their margins. Their overpriced lineup is their way of inviting the illusion of competition to keep regulators off their backs.
That is the hardship of being the best GPU company.
 
I would price the 5090 at $2.5k MSRP. Whales have proven time and again that they would gladly pay any amount of money to own a 90 tier card so I would hit them hard and without mercy. It's not like 2-3 grand means anything to a whale anyway.

The 5080 is another good opportunity to skin the fools since AMD is not competing at this tier. You want an 80 series card? You can have one starting at $1399 MSRP.

The 5070 is another story. It's harder to skin -70 tier customers, as the number of buyers with common sense increases at this tier, so it really depends on how well or poorly AMD does relative to NVIDIA this generation.
 
AMD's best SKU in the Radeon 8000 series will probably barely compete with the RTX 5060.
I wouldn't go that far. It seems from the rumors that RDNA4 will go up to the $500-ish range, while history tells us the RTX 5060 will be another $300-ish 8 GB card that is barely faster than the RTX 4060.

This is strongly evidenced by the fact that they have successfully demonstrated with the 40 series that they can legitimately compete in terms of sales and product placement despite shifting their product stack down a die size.
That isn't quite true. If you look at the Steam survey, you'll see adoption of the RTX 4000 cards is awful in the consumer market. We're 2 years in and RTX 4000 cards are just a bit over 15% of the Steam userbase. That is RTX 2000 levels of bad. To put it in perspective, the GTX 1060 alone peaked at almost 17% of Steam's userbase back in 2018. One single Pascal GPU was as popular as the entire RTX 4000 series added together. If you compare the 4060 to the 1060, the 4060 reached only about half the share at its peak (9% vs. 17%).

Now, Nvidia doesn't care about this because the consumer market means little to them today; they're making all their money on datacenter/AI. But the notion that Nvidia can jack up prices and people will buy anyway is not correct; the RTX 4000 series definitely did not sell as well as it could have.
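
Quick back-of-the-envelope in Python on the figures quoted in this post (the percentages are the cited Steam-survey numbers, rounded):

```python
# Peak Steam-survey shares quoted above, in percent (rounded).
peak_share = {"GTX 1060": 17.0, "RTX 4060": 9.0}
ratio = peak_share["RTX 4060"] / peak_share["GTX 1060"]
print(f"4060 peak vs 1060 peak: {ratio:.0%}")  # about half

# Series-level view: the whole RTX 4000 lineup two years in (~15%)
# versus the single GTX 1060 at its peak (~17%).
print(f"entire Ada lineup vs one Pascal card: {15.0 / 17.0:.0%}")
```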
 
I wouldn't go that far. It seems from the rumors that RDNA4 will go up to the $500-ish range, while history tells us the RTX 5060 will be another $300-ish 8 GB card that is barely faster than the RTX 4060.


That isn't quite true. If you look at the Steam survey, you'll see adoption of the RTX 4000 cards is awful in the consumer market. We're 2 years in and RTX 4000 cards are just a bit over 15% of the Steam userbase. That is RTX 2000 levels of bad. To put it in perspective, the GTX 1060 alone peaked at almost 17% of Steam's userbase back in 2018. One single Pascal GPU was as popular as the entire RTX 4000 series added together. If you compare the 4060 to the 1060, the 4060 reached only about half the share at its peak (9% vs. 17%).

Now, Nvidia doesn't care about this because the consumer market means little to them today; they're making all their money on datacenter/AI. But the notion that Nvidia can jack up prices and people will buy anyway is not correct; the RTX 4000 series definitely did not sell as well as it could have.

4000 series adoption is just fine:


Don't forget that "awful" in Nvidia terms is AMD's dream come true when it comes to sales.

The 4000 series delivered TSMC 4N, which was massively more efficient than the Samsung 8/10nm the 3000 series used.

The 4000 series still sold in massive numbers. Nvidia's problem, though, is that many people didn't have the money to buy them because of inflation, and you needed to go up to at least a 4070 for the cards to make any sense.

The 4060 series was meh, just like AMD's 7600 series.
Performance per watt, though, was GOOD.
 
I've always suspected Nvidia has better tech than what they release to the general public, doling it out IN SMALL STEPS.
And the last thing they want is for people to buy a card so good that it will be their last upgrade. THAT card will NEVER be released.

As for myself, I don't need to be able to run all the new games at ultra settings in super high resolutions, and don't get me started on nonsense like raytracing.
 
Hmm, perhaps, but it would be foolish to pay 1200 USD for a 16 GB GPU at this point. 16 GB isn't enough for 4K RT gaming in quite a few games.

I'll wait for the 24 GB version (if there is one).
And why do you think most PC gamers care about 4K/UHD with RT enabled? 99% of PC gamers use 1440p or lower, and barely anyone enables RT. Those who do enable upscaling as well in most cases.

Most people praising high amounts of VRAM are AMD users, and AMD users can't use RT to begin with.

See the irony here?

NOT A SINGLE GAME today needs more than 12GB at 4K/UHD maxed out, unless you enable RAY TRACING or PATH TRACING on top, AT NATIVE 4K that is. The only GPU capable of that is the 4090. Most RASTER ONLY games barely use 8GB at 4K ULTRA.

Link me a SINGLE GAME using RASTER ONLY that struggles on a 12GB GPU at native 4K. I am waiting. By struggle I mean LOW MINIMUM FPS, not high VRAM USAGE, because many game engines today simply allocate most of the available VRAM; that does not mean it's needed. This is why a 4090 can show 16-20GB in some games that run fine on a 4070 12GB anyway. SIMPLE ALLOCATION.

99% of PC gamers use 1440p or lower.
99% of PC gamers are not enabling RT at native res, if at all.

That is reality.
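
If anyone wants to see the allocation-versus-need distinction for themselves, here's a minimal sketch using the nvidia-ml-py bindings (pynvml). Keep in mind NVML reports VRAM currently allocated on the device, not what a game strictly needs; the device index and the 5-second poll interval are arbitrary choices.

```python
# Minimal VRAM watcher using nvidia-ml-py (pip install nvidia-ml-py).
# NVML reports *allocated* VRAM, i.e. exactly the "usage" figure people
# quote, which is not the same as what a game actually requires.
import time

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU (assumption)

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"allocated: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB")
        time.sleep(5)  # poll every 5 seconds (arbitrary)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```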
 
If Nvidia can rip off the consumer, they will.

Of course it's enormously helpful to Nvidia's cause that PC gamers are such a bunch of FOMO suckers when it comes to GPUs.

If AMD could rip off consumers, they would too; sadly, they can't.

Business 101. AMD is not competing.
 
Core count, memory bandwidth, etc... not very informative (but nice work collecting it all).

What this article SHOULD have done, if they actually wanted to be informative, was compare the relative PERFORMANCE of the various cards over the years... They've already done the legwork for this with their in-depth reviews of each card.

I remember getting 3 Titan X cards a few months before the 980 Ti came out - and bemoaning the fact that I'd basically thrown almost $1500 away for no real performance increase... (They did have a higher resale value, though, so I was able to switch them out down the line for a couple of 2080 Tis - minus the awesome liquid cooling system I'd made for the 3 Titans.)

I don't care (and I suspect most users won't either) how many cores or how much memory bandwidth the 5080 has... I want to know how well it will PERFORM. Will it be 80% of the 5090? Will it be 90%? That's what this article should have focused on.
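
For what it's worth, the comparison being asked for is only a few lines once the review numbers are in hand; a rough sketch, with placeholder FPS values rather than real benchmark data:

```python
# Normalize each card's average-FPS figure against the generation's
# flagship. All numbers here are illustrative placeholders only.
avg_fps = {
    "RTX 5090": 100.0,  # placeholder flagship baseline
    "RTX 5080": 80.0,   # placeholder
    "RTX 5070": 62.0,   # placeholder
}
flagship = max(avg_fps.values())
for card, fps in sorted(avg_fps.items(), key=lambda kv: -kv[1]):
    print(f"{card}: {fps / flagship:.0%} of flagship performance")
```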
 
There is a tiny chance of the 5080 being 999 dollars, considering Nvidia uses TSMC 5nm again. However, I don't see why Nvidia should settle, as it will sell easily at 1199 dollars.

4090 performance for 500 dollars less? Many people will take that deal.
Jensen laughs in dollar signs at this post
 
You can look at specs all you want; what it all comes down to is performance and features for the end user!

If Nvidia releases a GPU with a 128 bit bus that beats current 256 bit bus cards, who cares what bus width it uses? Like, seriously! Barely anyone cares as long as the performance is better.

Also you can't compare specs directly across brands. Different architectures.

Performance per watt is the best GPU metric, and this is where the RTX 4000 series outshines all other cards.
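
As a formula it is simply average FPS divided by board power. A minimal sketch with made-up numbers shows why the faster card is not automatically the more efficient one:

```python
# Performance per watt = average FPS / board power (watts).
# Both cards are hypothetical; the point is that the faster card
# is not automatically the more efficient one.
cards = {
    "hypothetical card A": (120.0, 200.0),  # (avg FPS, board power W)
    "hypothetical card B": (150.0, 320.0),
}
for name, (fps, watts) in cards.items():
    print(f"{name}: {fps / watts:.2f} FPS per watt")
```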
 
The 4080's biggest fault was its price: it's about 50% faster than the 3080, which is a phenomenal upgrade considering how big an upgrade the 3080 already was over the 2080. Jumping from $699 to $1199, though, was ridiculous! It priced longtime 80 series buyers, like myself, out of the 80 series. My biggest hope for the 5080 is that Nvidia gets back to pricing that allows me to continue to be an 80 series customer. The 4080 Super brought it back down to $999, but that is still too expensive.
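
Putting the numbers above together, a ~50% performance uplift against a ~72% price hike is a net loss in performance per dollar; a rough sketch, taking the 50% figure at face value:

```python
# Performance per dollar vs the $699 RTX 3080, using the ~50% uplift
# figure above. Prices are the 4080 launch and 4080 Super MSRPs.
perf_gain = 1.50
for price in (1199, 999):
    value_vs_3080 = perf_gain / (price / 699) - 1
    print(f"4080 at ${price}: {value_vs_3080:+.0%} perf per dollar vs 3080")
```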
 
Core count, memory bandwidth, etc... not very informative (but nice work collecting it all).

What this article SHOULD have done, if they actually wanted to be informative, was compare the relative PERFORMANCE of the various cards over the years... They've already done the legwork for this with their in-depth reviews of each card.

I remember getting 3 Titan X cards a few months before the 980 Ti came out - and bemoaning the fact that I'd basically thrown almost $1500 away for no real performance increase... (They did have a higher resale value, though, so I was able to switch them out down the line for a couple of 2080 Tis - minus the awesome liquid cooling system I'd made for the 3 Titans.)

I don't care (and I suspect most users won't either) how many cores or how much memory bandwidth the 5080 has... I want to know how well it will PERFORM. Will it be 80% of the 5090? Will it be 90%? That's what this article should have focused on.
Check my post, it is all there...
 
The 4080's biggest fault was its price: it's about 50% faster than the 3080, which is a phenomenal upgrade considering how big an upgrade the 3080 already was over the 2080. Jumping from $699 to $1199, though, was ridiculous! It priced longtime 80 series buyers, like myself, out of the 80 series. My biggest hope for the 5080 is that Nvidia gets back to pricing that allows me to continue to be an 80 series customer. The 4080 Super brought it back down to $999, but that is still too expensive.

Who cares whether you use a 60, 70, 80 or 90 series card, as long as you are satisfied with the performance?

Do you feel better as a human when you own an 80 series GPU? Then why not buy a 90!
 
4000 series adoption is just fine:

Did you even read the comment you were replying to? I mentioned the Steam survey myself. The link you just posted shows exactly what I said, 4000 series adoption is NOT fine. It was adopted at the same speed as the 2000 series (also a commercial disappointment), and at a fraction of the pace of the 1000 series.

I literally gave you figures comparing the 1060 (peaked at just under 17%) to the 4060 (peaked at just under 9%); where do you think those figures came from?

Don't forget that "awful" in Nvidia terms is AMD's dream come true when it comes to sales.
That's not the point. AMD has sold much less than Nvidia for a decade now. AMD has nothing to do with this; the problem is that Nvidia has consistently failed to reproduce the commercial success it had with Maxwell and Pascal in the consumer market (the 2000 series flopped, the 3000 series was hindered by the pandemic, and now the 4000 series has flopped again). There are tons of customers sitting on old Nvidia cards, and Nvidia is incapable of enticing them into upgrading. The only difference is that, unlike with the 2000 series, this time they don't care, because they're making bank in the datacenter market instead.

The 4000 series delivered TSMC 4N, which was massively more efficient than the Samsung 8/10nm the 3000 series used.
And yet usage on Steam is no better than it was for the 3000 series 2 years after launch, despite the fact that the 3000 series was limited by pandemic shortages and the crypto boom, with miners snatching everything they could.

That is the perfect illustration of how poorly the 4000 series sold. The scraps of the 3000 series that were in use on Steam at the end of 2022 match the unimpeded, full availability sales of the 4000 series at the end of 2024, both being 2 years away from their respective launches.

In contrast, at the end of 2018, Pascal cards were 37% of Steam, more than double the usage share that Ampere (pandemic/crypto) and Ada Lovelace (terrible value) achieved two years after launch.
 
Did you even read the comment you were replying to? I mentioned the Steam survey myself. The link you just posted shows exactly what I said, 4000 series adoption is NOT fine. It was adopted at the same speed as the 2000 series (also a commercial disappointment), and at a fraction of the pace of the 1000 series.

I literally gave you figures comparing the 1060 (peaked at just under 17%) to the 4060 (peaked at just under 9%); where do you think those figures came from?


That's not the point. AMD has sold much less than Nvidia for a decade now. AMD has nothing to do with this; the problem is that Nvidia has consistently failed to reproduce the commercial success it had with Maxwell and Pascal in the consumer market (the 2000 series flopped, the 3000 series was hindered by the pandemic, and now the 4000 series has flopped again). There are tons of customers sitting on old Nvidia cards, and Nvidia is incapable of enticing them into upgrading. The only difference is that, unlike with the 2000 series, this time they don't care, because they're making bank in the datacenter market instead.


And yet usage on Steam is no better than it was for the 3000 series 2 years after launch, despite the fact that the 3000 series was limited by pandemic shortages and the crypto boom, with miners snatching everything they could.

That is the perfect illustration of how poorly the 4000 series sold. The scraps of the 3000 series that were in use on Steam at the end of 2022 match the unimpeded, full availability sales of the 4000 series at the end of 2024, both being 2 years away from their respective launches.

In contrast, at the end of 2018, Pascal cards were 37% of Steam, more than double the usage share that Ampere (pandemic/crypto) and Ada Lovelace (terrible value) achieved two years after launch.

Are you blind? 4000 series cards are all over the top 50 GPU list. Fun fact: the 7900 XTX did not even make the top 50.

The only reason the 3000 series ranks higher is that it's a generation older and sold for cheaper, plus the second-hand market. No one cares, as the 4000 series is superior anyway.

The 4000 series is better than the 3000 series by far.
The 3000 series is made on the crappy Samsung 10nm-class process, hence the terrible performance per watt.

Tons of 3000 series GPUs were sold second hand after GPU mining died, and the 3060 series is still being made and sold today, and you wonder why it's number one on the list? Haha! People are cheapskates and will settle.

To me, you sound like a silly 3000 series owner unwilling to accept that the 4000 series is better.

The 4000 series is made on TSMC's excellent 4N process, hence the vastly better performance per watt and cool temps.

The 3000 series was made on the very cheap Samsung 10nm-class process, which in performance per watt was arguably worse than even the older TSMC 12nm and Intel 14nm nodes.

By the way, Samsung 8nm is essentially their 10nm node refined and renamed; it is not a full node shrink, and the density is nearly unchanged, which is why I say 10nm rather than 8nm.

Samsung 10nm was enough to beat AMD. Nvidia was smart to leave TSMC, which was massively overbooked, and paid about a third of the price for Samsung to make the 3000 series. Still, it was cheap and mediocre silicon, which is why most people undervolt their 3080 and 3090 cards. The 3060 and 3070 fared much better in terms of efficiency, yet still nowhere near the 4000 series.
 
$3000 for the 5090, $1500 for the 5080, $1000 for the 5070?

I really wouldn't be surprised at this point.
 