A 12GB Nvidia RTX 5070 could debut next January alongside the RTX 5080 and RTX 5090

AMD has not been competitive at all outside of rasterization performance, and even there they lose in most newer games.

Most people buy a GPU looking at the entire feature set, not just rasterization performance.

Also, AMD drivers are worse than Nvidia's. Games simply run worse overall, whereas Nvidia provides a solid experience across the board.

True. Also, focusing on raster performance will be the death of AMD GPUs. However, AMD is not doing that; they said RT has been a big focus for the Radeon 8000 series.

Lacking RT performance is the reason why AMD GPUs fall behind Nvidia in many new games. UE5 uses Lumen for lighting in most big games; it is essentially software RT, meaning RT is used even if RT is "turned off" in the game. The only thing that switch does is change to hardware RT.

Silent Hill 2 is the latest game to use "forced RT" - many more are coming, though.


AMD performance is not looking good. They are competing with last gen Nvidia products.
 
It doesn't matter what AMD does, nVidia fanbois like yourself will still buy their product.
It doesn't matter what Nvidia does, AMD fanbois like yourself will still ***** and moan about it. See, it works both ways ;)

I don't care about the brand name on the box, I just buy the hardware that I consider best for me at the moment. So, I used to buy Intel processors, now I buy AMD. I used to buy AMD's GPUs, now I buy Nvidia's. The whole mix of features vs raw performance vs price is simply better on their side and that's all there is to that.

The moment AMD can match or surpass it, I will consider switching, but it doesn't seem likely in the foreseeable future.
 
It doesn't matter what Nvidia does, AMD fanbois like yourself will still ***** and moan about it. See, it works both ways ;)

I don't care about the brand name on the box, I just buy the hardware that I consider best for me at the moment. So, I used to buy Intel processors, now I buy AMD. I used to buy AMD's GPUs, now I buy Nvidia's. The whole mix of features vs raw performance vs price is simply better on their side and that's all there is to that.

The moment AMD can match or surpass it, I will consider switching, but it doesn't seem likely in the foreseeable future.
True. I am often called a fanboy left and right, yet I use Intel, AMD and Nvidia in my house, and I have several chips from all of them :p
 
Seems a bit later than usual. Way over a 2-year period.
Yeah, Nvidia had no reason to rush it, there's no real competition, and AMD will target the same performance levels we have now (hopefully with improved RT performance and maybe FSR 4).

Both Nvidia and AMD are chasing AI, and both are holding back next-gen products because they are not really needed. Neither Nvidia nor AMD will deliver anything truly great for next gen, just more of the same with slightly better performance per dollar, outside of the 5090, which will be crazy fast and expensive.

Expect:
5090 = New top card, replacing 4090 as the top consumer card.
5080 = 4090 or close
5070 = 4080
5060 = 4070

Personally I hope for some Ti versions in between, not that I am going to buy anything; my 4090 still goes strong and will last till the 6080/6090 or an AMD equivalent in a few years. However, AMD will only be considered if they manage to get FSR to DLSS/DLAA level and if they improve RT performance, because more and more games use forced RT these days.
 
I've been rocking a 3080 Ti since 2021. It performs great in the games I play. My eventual upgrade will switch to AMD, though, simply due to their openness and better Linux support.
 
It doesn't matter what Nvidia does, AMD fanbois like yourself will still ***** and moan about it. See, it works both ways ;)

I don't care about the brand name on the box, I just buy the hardware that I consider best for me at the moment. So, I used to buy Intel processors, now I buy AMD. I used to buy AMD's GPUs, now I buy Nvidia's. The whole mix of features vs raw performance vs price is simply better on their side and that's all there is to that.

The moment AMD can match or surpass it, I will consider switching, but it doesn't seem likely in the foreseeable future.
That's not how it works. The person I was talking about has been replying to my posts, where I haven't even said anything about AMD, trying to bait me into arguments. Maybe "nVidia troll" is more accurate? There seem to be a lot of those these days.
 
VRAM usage is not over 90% textures.
Wow, great argument dude.

Yes, it literally is. In raster only games, it's basically 95% textures and most of the rest is the framebuffers, everything else that goes on VRAM is a negligible portion of it. In games with ray tracing, or when you use frame generation, more VRAM is used by those features, but textures are still the vast majority of VRAM usage even in those cases.

3060 Ti beats 3060 in pretty much all games regardless of settings, even in brand new games at much higher resolution than these GPUs can handle.
Completely irrelevant to what I said. I didn't say the 3060 beat the 3060 Ti, I said the 3060 Ti (and 3070, and 3080 10 GB) can't use the same texture quality settings the 3060 can, which is an objective, undeniable fact.

Those extra 4GB VRAM don't help when GPU is like 40% weaker.
JFC. Do I need to repeat the same comment again?

Yes, the extra 4 GB of VRAM absolutely does help any GPU, even one that is 40% weaker, because it allows you to use higher quality textures, and textures don't affect framerate. Literally any GPU benefits from more VRAM. It doesn't matter how much slower the 3060 is compared to the 3060 Ti, the 3060 can use higher quality textures because it has 12 GB and the 3060 Ti doesn't. The 3060 Ti will produce higher framerates because of the faster GPU core, but will still have to use lower quality textures than the 3060 can.
 
AMD has been EXTREMELY competitive in the low end and mid range. Dollar per dollar, AMD beats Nvidia at nearly every price point in both raster and Ray tracing.
AMD has only been competitive in raster, VRAM size, and DaVinci Resolve (video editing). They are still losing in a lot of areas, especially 3D and efficiency.

NVIDIA's $300 card going against AMD's $500 card? Sure, be my guest here.

Also, AMD drivers are worse than Nvidia's. Games simply run worse overall, whereas Nvidia provides a solid experience across the board.

Yeah, "worse". I'm willing to sacrifice a lot of stuff for stability buying AMD, since their MESA driver is baked into the kernel. Definitely not looking forward to fight against my own PC by buying NVIDIA.
They are on the right path right now, but still not fully since their feature sets are still locked behind their proprietary driver.

The 3060 12GB was beaten by the 3060 Ti 8GB in tons of games and still is. VRAM doesn't really matter when the GPU is weak. The 3060 had a weak GPU; the 5070 won't.
VRAM matters when it matters. The 3060 12GB is a better card for high-res 3D workloads than the 3060 Ti.

UE5 uses Lumen for lighting in most big games; it is essentially software RT, meaning RT is used even if RT is "turned off" in the game. The only thing that switch does is change to hardware RT.

Silent Hill 2 is the latest game to use "forced RT" - many more are coming, though.


AMD performance is not looking good. They are competing with last gen Nvidia products.

Most UE5 games have been unoptimized shites. You should check the comments on the link you provided before commenting about it.
 
Wow, great argument dude.

Yes, it literally is. In raster only games, it's basically 95% textures and most of the rest is the framebuffers, everything else that goes on VRAM is a negligible portion of it. In games with ray tracing, or when you use frame generation, more VRAM is used by those features, but textures are still the vast majority of VRAM usage even in those cases.


Completely irrelevant to what I said. I didn't say the 3060 beat the 3060 Ti, I said the 3060 Ti (and 3070, and 3080 10 GB) can't use the same texture quality settings the 3060 can, which is an objective, undeniable fact.


JFC. Do I need to repeat the same comment again?

Yes, the extra 4 GB of VRAM absolutely does help any GPU, even one that is 40% weaker, because it allows you to use higher quality textures, and textures don't affect framerate. Literally any GPU benefits from more VRAM. It doesn't matter how much slower the 3060 is compared to the 3060 Ti, the 3060 can use higher quality textures because it has 12 GB and the 3060 Ti doesn't. The 3060 Ti will produce higher framerates because of the faster GPU core, but will still have to use lower quality textures than the 3060 can.
You don't know how game engines work, at all.

The fact that you claim textures are 90% of VRAM usage tells me enough. You have zeroooo experience with game engines, and you probably don't even know that most VRAM usage is just allocation, not a requirement.

Most PC games are made for the majority of PC gamers - sales numbers are what matter - and 75% of PC gamers have 8GB VRAM tops. Barely any new games use more than 8GB, unless you look at the ALLOCATION on cards like the 4090 with 24GB. VRAM allocation on a high-end GPU doesn't translate to a VRAM requirement for a lower-end GPU with less VRAM. This is basic knowledge, and most people don't seem to understand it.
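
To illustrate the allocation-vs-requirement point, here is a purely hypothetical sketch (the function name and every number in it are invented for illustration, not taken from any real engine): many streaming systems size their texture pool from whatever VRAM happens to be free, so the "usage" you read on a 24GB card mostly reflects the pool it was handed, not the minimum the game needs.

```python
# Hypothetical illustration only: a texture streaming pool sized from free VRAM.
# Function name and numbers are invented for the example, not from a real engine.

def texture_pool_budget_mb(free_vram_mb: int, reserve_mb: int = 2048) -> int:
    """Size the streaming pool from whatever VRAM is left after a fixed reserve."""
    return max(1024, free_vram_mb - reserve_mb)

# The same hypothetical game on two different cards:
print(texture_pool_budget_mb(free_vram_mb=22_000))  # 24GB card -> ~20GB pool shows as "used"
print(texture_pool_budget_mb(free_vram_mb=7_000))   # 8GB card  -> ~5GB pool, game still runs
```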

Looking at VRAM usage for the most demanding games running at 4K/UHD with RT or even path tracing + frame gen makes no sense for most PC gamers. Not a single AMD GPU is capable of it, and 99% of people will use upscaling at settings like this anyway.

People choose the most extreme settings running at 4K in the most demanding games when they claim that PC gamers need 16GB minimum. In reality, 99% of PC gamers use 1440p or less, don't enable RT and don't necessarily max out games.

All of this is the reason why Nvidia's lower VRAM is a non-issue in reality. They beat AMD in pretty much all new games anyway, at the resolutions people actually use, which usually means 1080p and 1440p.



AMD GPUs struggle in most new games mostly due to lacking the capability to run advanced features like RT. RT is used in many new games even if you don't enable it; the RT setting just switches between software and hardware RT.

This is why AMD has been focusing a lot on RT performance with RDNA 4. Current AMD GPUs are going to age like milk going forward, as more and more games use RT regardless of settings. VRAM won't save you; GPU power and capability are what you need.

Game developers don't want to spend time on fake lighting, aka baked lighting, which is why RT will be used more and more in new games.
 
You don't know how game engines work, at all.

The fact that you claim textures are 90% of VRAM usage tells me enough. You have zeroooo experience with game engines, and you probably don't even know that most VRAM usage is just allocation, not a requirement.
All I see here is some "hurr you don't actually know how it works" nonsense with zero substance to back it up. Nobody will believe you if you claim to be experienced with something ("you clearly don't have experience with game engines, I do") but then don't actually bother explaining the things you're talking about.

Yes, that statement is absolutely correct. VRAM usage is 90% textures. It has nothing whatsoever to do with allocation; 90% of the VRAM that is used (not allocated) is textures. Textures are the only significantly large assets that are put in VRAM. 3D models (which are just vectors) are much smaller. Framebuffers (which are bitmaps) can be numerous, but are also smaller, on the order of a few MBs each (a 32-bit 1080p framebuffer is just over 8 MB, a 1440p one is just under 15 MB); you'll maybe have a couple hundred MBs worth of framebuffers per game. Shader code is tiny (on the order of KBs) and easily fits in small GPU caches. The only other item that can use a lot of space in VRAM is BVH structures for ray tracing. But if you're not using ray tracing, the overwhelming majority of your VRAM usage will be just textures.
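
To put rough numbers on the framebuffer figures above, here is a quick back-of-the-envelope check (a sketch only, assuming 4 bytes per pixel and decimal megabytes):

```python
# Back-of-the-envelope framebuffer sizes at 32-bit color (4 bytes per pixel).
def framebuffer_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    return width * height * bytes_per_pixel / 1_000_000  # decimal MB

print(framebuffer_mb(1920, 1080))  # ~8.3 MB
print(framebuffer_mb(2560, 1440))  # ~14.7 MB
print(framebuffer_mb(3840, 2160))  # ~33.2 MB at 4K
```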

Most PC games are made for the majority of PC gamers - sales numbers are what matter - and 75% of PC gamers have 8GB VRAM tops.
Wrong. Most games are made for consoles. And both the PS5 and the Series X have around 10 GB of memory to use as VRAM (on the Series X it's 10 GB due to the asymmetric memory setup, where the last 6 GB are slower; on the PS5 it's more flexible and can go over 10 GB depending on how little CPU RAM the game needs).

Smart devs will give the PC version an option to degrade texture quality so those games will run well on 8 GB cards. But not every dev is a smart dev.

Barely any new games use more than 8GB, unless you look at the ALLOCATION on cards like the 4090 with 24GB. VRAM allocation on a high-end GPU doesn't translate to a VRAM requirement for a lower-end GPU with less VRAM. This is basic knowledge, and most people don't seem to understand it.

People choose the most extreme settings running at 4K in the most demanding games when they claim that PC gamers need 16GB minimum. In reality, 99% of PC gamers use 1440p or less, don't enable RT and don't necessarily max out games.
Absolutely incorrect. Steve himself, right here on TechSpot and on Hardware Unboxed, has done multiple videos addressing exactly this nonsense claim that "games don't need more than 8 GB".

Here is Steve comparing the RTX 3070 and RX 6800 three years after their launch, and showing how the RTX 3070 struggles in those games, even at 1080p, specifically because of its 8 GB of VRAM, while the RX 6800 with 16 GB doesn't.

Here is a follow-up to that video where he compares the RTX 3070 to a Quadro-class RTX A4000 based on the same Ampere chip but with 16 GB instead, again showing the 3070 struggles specifically because it has only 8 GB.

Here is Daniel Owen comparing the 8 GB and 16 GB versions of the RTX 4060 Ti and arriving at the same conclusion, the 8 GB card struggles in modern games while the 16 GB version doesn't.

Again, those are games running at 1080p in the videos, not "4K with the most extreme settings". And you can see that, depending on the game, that VRAM limitation can take either the form of degraded performance (lower averages and much lower 1% lows), degraded visual quality (games like Hogwarts Legacy, Forspoken and Halo Infinite simply unload textures when you run out of VRAM and you get a blurry mess on the screen), or both.

Turns out the "basic knowledge that most people don't understand" is actually something you didn't understand.

AMD GPUs struggle in most new games mostly due to lacking the capability to run advanced features like RT. RT is used in many new games even if you don't enable it; the RT setting just switches between software and hardware RT.

This is why AMD has been focusing a lot on RT performance with RDNA 4. Current AMD GPUs are going to age like milk going forward, as more and more games use RT regardless of settings. VRAM won't save you; GPU power and capability are what you need.

Game developers don't want to spend time on fake lighting, aka baked lighting, which is why RT will be used more and more in new games.
You then proceeded to go on an AMD tirade that has nothing whatsoever to do with the VRAM issue at hand.
 
All I see here is some "hurr you don't actually know how it works" nonsense with zero substance to back it up. Nobody will believe you if you claim to be experienced with something ("you clearly don't have experience with game engines, I do") but then don't actually bother explaining the things you're talking about.

Yes, that statement is absolutely correct. VRAM usage is 90% textures. It has nothing whatsoever to do with allocation; 90% of the VRAM that is used (not allocated) is textures. Textures are the only significantly large assets that are put in VRAM. 3D models (which are just vectors) are much smaller. Framebuffers (which are bitmaps) can be numerous, but are also smaller, on the order of a few MBs each (a 32-bit 1080p framebuffer is just over 8 MB, a 1440p one is just under 15 MB); you'll maybe have a couple hundred MBs worth of framebuffers per game. Shader code is tiny (on the order of KBs) and easily fits in small GPU caches. The only other item that can use a lot of space in VRAM is BVH structures for ray tracing. But if you're not using ray tracing, the overwhelming majority of your VRAM usage will be just textures.


Wrong. Most games are made for consoles. And both the PS5 and the Series X have around 10 GB of memory to use as VRAM (on the Series X it's 10 GB due to the asymmetric memory setup, where the last 6 GB are slower; on the PS5 it's more flexible and can go over 10 GB depending on how little CPU RAM the game needs).

Smart devs will give the PC version an option to degrade texture quality so those games will run well on 8 GB cards. But not every dev is a smart dev.


Absolutely incorrect. Steve himself, right here on TechSpot and on Hardware Unboxed, has done multiple videos addressing exactly this nonsense claim that "games don't need more than 8 GB".

Here is Steve comparing the RTX 3070 and RX 6800 three years after their launch, and showing how the RTX 3070 struggles in those games, even at 1080p, specifically because of its 8 GB of VRAM, while the RX 6800 with 16 GB doesn't.

Here is a follow-up to that video where he compares the RTX 3070 to a Quadro-class RTX A4000 based on the same Ampere chip but with 16 GB instead, again showing the 3070 struggles specifically because it has only 8 GB.

Here is Daniel Owen comparing the 8 GB and 16 GB versions of the RTX 4060 Ti and arriving at the same conclusion, the 8 GB card struggles in modern games while the 16 GB version doesn't.

Again, those are games running at 1080p in the videos, not "4K with the most extreme settings". And you can see that, depending on the game, that VRAM limitation can take either the form of degraded performance (lower averages and much lower 1% lows), degraded visual quality (games like Hogwarts Legacy, Forspoken and Halo Infinite simply unload textures when you run out of VRAM and you get a blurry mess on the screen), or both.

Turns out the "basic knowledge that most people don't understand" is actually something you didn't understand.


You then proceeded to go on an AMD tirade that has nothing whatsoever to do with the VRAM issue at hand.
There is no issue really; you mentioned a few games that still run well on 8GB after patches. Just like some people claimed that The Last of Us (an AMD-sponsored console port, to be exact) doesn't run on 8GB GPUs, yet it ran just fine after the first patches came out.

Also, PC games are optimized for PC obviously.

Even games developed for console / AMD in the first place tend to run better on Nvidia anyway:


The 7900 XTX and its 24GB are closer to the 4070 series than the 4090/4080 series.

You can ramble all day long about VRAM and how important it is; Nvidia has 90% market share and is still gaining. Obviously AMD's approach of offering more VRAM while focusing on pure raster performance did not work. Nvidia wins with ease in 90% of new games coming out.


The 16GB last-gen Radeon 6800/6900 series with the same performance as last-gen Nvidia 8GB offerings doesn't impress anybody.

And yeah, the 3060 Ti 8GB destroys the 6700 XT and 3060, both with 12GB. VRAM doesn't help when the GPU is weak, like I said.

The 3070 8GB even beats the 6900 XT 16GB at 1440p maxed out, INCLUDING MINIMUM FPS. That is just sad and the reason why AMD GPUs are not selling well. Not to mention that DLSS/DLAA beats FSR every single time.

The 4060 Ti 8GB is generally a crappy card, yet it performs the same as the 7800 XT and 6900 XT here. Enable DLSS/FSR and the 4060 Ti would be the superior option in terms of visuals.

AMD makes good CPUs, but their GPUs need work. Not just GPUs, software as well. Feature-wise, AMD is years behind Nvidia. AMD's copy/paste methods did not work well.

AMD left the high-end GPU market for a reason. Let's see if Radeon 8000 will change anything. It will only if AMD improves FSR and brings vastly superior performance per dollar, oh, and improves RT performance a lot, because AMD GPUs take a big performance hit in too many new games due to lacking RT perf.

Most popular game of the year:


AMD's performance is lacking once again.

You look at VRAM; I look at performance in new games, and Nvidia is the clear winner.

Not that I care much; I have a 4090, but I did not buy this card because of the 24GB VRAM. I bought it for the GPU power and feature set.

The VRAM present on a card like the 7900 XTX is a waste, since the GPU can't do RT or path tracing anyway, and that is where you actually need a lot of VRAM.

Not a single game uses a lot of VRAM in rasterization only, which is what AMD GPU owners seem to praise.

VRAM is only good if you actually need it. If you don't need to run games at the highest settings + RT in native 4K/UHD, you simply don't need more than 12GB really, maybe 16GB in rare edge cases at 3440x1440 or so. At regular 1440p and max settings, 12GB is more than plenty, as you can see in the 3 links I provided, which are some of the most popular, brand-new AAA PC games right now.
 
There is no issue really; you mentioned a few games that still run well on 8GB after patches.
Did you even watch the videos? They literally show multiple games absolutely not running well on 8 GB cards. Go back and watch them.

Also, PC games are optimized for PC obviously.
You can't "optimize" how much VRAM textures consume. You can only lower texture resolution. That not "optimization", that's just a lower setting with lower visual quality.

Even games developed for console / AMD in the first place tend to run better on Nvidia anyway:
Oh wow, one single game that you cherry picked?

Per-game variations in GPU performance have always been a thing. You can see in this image from a TechSpot review that Ratchet & Clank runs better on Nvidia, while Forza Horizon and Starfield run better on AMD. Spiderman runs equally well on either. All of those are first-party console games.

[Image: TechSpot 4K benchmark chart]


You can ramble all day long about VRAM and how important it is; Nvidia has 90% market share and is still gaining. Obviously AMD's approach of offering more VRAM while focusing on pure raster performance did not work. Nvidia wins with ease in 90% of new games coming out.
Why the hell do you keep bringing AMD up? Who the hell cares? This isn't an Nvidia vs. AMD thing; the undeniable fact is that 8 GB cards (from both Nvidia and AMD) are no longer enough to run modern games at high visual quality. The video I linked clearly shows that if you take an 8 GB card like the RTX 3070 and give it 16 GB instead (comparing it to the RTX A4000), all those VRAM issues in modern games disappear.

The RTX 3060 doesn't have these issues. The RTX 4070 doesn't have these issues. This isn't about Nvidia, it's about 8 GB not being enough.

And yeah, the 3060 Ti 8GB destroys the 6700 XT and 3060, both with 12GB. VRAM doesn't help when the GPU is weak, like I said.
Wrong. VRAM helps by allowing GPUs to use better textures, regardless of how fast they are. Texture settings don't affect framerate; all you need for them is enough VRAM.

And again, the videos I linked show multiple modern games having issues on 8 GB cards like the 3060 Ti, while the 3060 and 6700 XT don't have those issues. Watch the videos.

The 3070 8GB even beats the 6900 XT 16GB at 1440p maxed out, INCLUDING MINIMUM FPS.
The video I linked shows the 3070 losing badly to the RX 6800 in multiple modern games. Again, watch the goddamn videos. It's right there; you're posting this literally right under the video that disproves everything you're saying.

The 4060 Ti 8GB is generally a crappy card.
Funny how the 16 GB version of it doesn't have issues, right? I wonder what the difference is.

AMD makes good CPUs, but their GPUs need work. Not just GPUs, software as well. Feature-wise, AMD is years behind Nvidia. AMD's copy/paste methods did not work well.

AMD left the high-end GPU market for a reason. Let's see if Radeon 8000 will change anything. It will only if AMD improves FSR and brings vastly superior performance per dollar, oh, and improves RT performance a lot, because AMD GPUs take a big performance hit in too many new games due to lacking RT perf.
Again, what does any of this have to do with the VRAM subject? Why are you making this fanboy hour over here? This has nothing whatsoever to do with AMD; this is about 8 GB cards not being good enough for modern games. AMD also has 8 GB cards, and they have the same issues that Nvidia's 8 GB cards have.

VRAM is only good if you actually need it. If you don't need to run games at the highest settings + RT in native 4K/UHD, you simply don't need more than 12GB really, maybe 16GB in rare edge cases at 3440x1440 or so.
Nobody is saying you need more than 12 or 16. I'm saying you need more than 8.
 
Yes it is.


Irrelevant. The point is that the 3060 Ti cannot match the texture quality the 3060 can use, because it doesn't have enough VRAM for that.


Yes, it does. It helps them by allowing them to use higher quality textures.


And yet, it cannot match the 3060 in texture quality.

I'll happily repeat all of this as many times as needed until it finally gets through your thick skull.

The 3060 Ti 8GB still beats the 3060 12GB in 99% of new games, and in the edge cases where 8GB is not enough, the 3060 won't even run those settings because the GPU is too weak anyway. Nah, textures are not 90% of VRAM. These are 1080p GPUs today, or 1440p with lower settings. GPU-wise they can't max out games at 1440p, so the extra VRAM is pointless.

Besides, the difference between High and Ultra textures in new games is pretty much nothing. Sometimes they are just slightly less compressed at Ultra, or uncompressed. Visually they are close to identical.

You can repeat everything; I could not care less, because I know for a fact that 99% of new games don't need more than 8GB at 1440p and below, which is what 99% of PC gamers use.

AMD's 16GB GPUs are beaten by Nvidia's 8GB GPUs. Did you want those 3 links for the 3 most popular games of 2024? Instead you link dated numbers from old games. I look at brand-new titles that people ACTUALLY PLAY right now, and AMD's performance is pretty terrible.


An AMD-sponsored game. AMD has done this many times before. Texture pack = higher VRAM requirement, yet no visual improvement. The only thing they want is to fill up VRAM. Sadly, a lot of AMD's GPUs lacked the VRAM too. Luckily, there was no improvement anyway. Pointless texture pack.

Most of the games you speak about are AMD-sponsored, strange, yeah? Most of them even came out RIGHT AFTER AMD spoke about how important VRAM was. RE4 and The Last of Us came out right after that marketing speech. I look at reality and actual gaming performance in popular games TODAY, not old console ports.

You can repeat your BS all day long; I could not care less. You have nothing to back it up, while I have tons of proof that current games run better on cards with less VRAM. Accept reality.

Or did you buy into AMD's VRAM marketing? I know it's hard to see Nvidia's last-gen 8GB offerings beat AMD's 16GB offerings.

If you truly think textures are 90% of VRAM, what do you think about Nvidia's NTC, Neural Texture Compression, which promises better visuals at half the VRAM usage?

Also, https://www.tomshardware.com/pc-com...n-rivals-nvidias-texture-compression-research

VRAM usage in the future could easily go down from here.
Better Compression > More VRAM.
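
To put that claim in perspective, here is a rough sketch of what roughly-half-of-block-compressed would mean for a single large texture (the "neural" figure below simply restates the claim from those articles; it is not a measured number):

```python
# Rough VRAM comparison for a single 4096x4096 texture (decimal MB, no mip chain).
# The neural-compression entry restates the "roughly half of BC" claim; it is not measured.
TEXELS = 4096 * 4096

bits_per_texel = {
    "Uncompressed RGBA8": 32,
    "BC7 block compression": 8,
    "Neural compression (claimed ~half of BC)": 4,
}

for name, bits in bits_per_texel.items():
    print(f"{name}: ~{TEXELS * bits / 8 / 1_000_000:.0f} MB")
# ~67 MB, ~17 MB, ~8 MB respectively
```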
 
We can thank AMD for not competing.
Yeah, blame AMD for what Nvidia is doing, lol. When AMD dies off and Intel lays off everyone, who will people blame when the RTX 8080 is actually an 8050 in disguise, sold for $1900?
Yeah, competition is a thing, but come on. AMD did have some amazing cards, and nobody bought them. The 4060 is like the worst card in history, and it's in the top 10 cards on Steam. My god, people are blind.

AMD cards that were faster, cheaper, or both in some cases. Actually, in 2 cases. Tons of VRAM too. Ray tracing? Really? That's the excuse? I got an RTX card from Nvidia and I never used that lagfest of a setting. It makes the GPU work extra hard for almost nothing. Path tracing is cool, but only a few games have it. It wasn't a good reason 1-2 years ago, and it still isn't. Ray tracing is super meh. So if that's the reason, even though most people don't even know how to change their in-game settings... okay then. AMD is the one to be blamed. Bad RTX performanzeee. SUX cards. All I know is, Nvidia will be more ballsy, just wait. Things won't look good for gamers.
 
With DLSS on? That I can believe.
Given both cards support DLSS, it would be extremely misleading if they compared the 4090 with DLSS turned off to a 5080 with DLSS on, but with nVidia you never know! :)

I expect it is under very specific circumstances - probably with ray tracing enabled, as the new 50 series is supposedly an RT monster - and in very specific games, so a best case.

I'll be very surprised if it's really as fast as the claim, but we will see.
 