Adding bus width essentially just adds tracing on the PCB. That's an oversimplification, but increasing bus width doesn't add much cost. The problem is that some of these features need 32 or 64 bits of bus width to work properly. So now you're waiting for enough bus to open up for the feature to function. You're delaying data that could be used for rendering to make room for upscaling or ray tracing. It gets delayed by a clock cycle or, in some instances, needs to be used every clock cycle. So that 192-bit bus on the 5070? It's realistically a 160-bit bus. When you get down to cards with a 128-bit bus, because of how many things that card needs to do just to function, it really only has 96 bits available. So the faster bandwidth helps, but the smaller the bus width gets, the bigger impact these "necessary features" have on the overall performance of the card. This is why you see massive gains when running DLSS on things like the 80- and 90-class cards, but you sometimes see single-digit gains on lower spec cards.
I would LOVE to see the technical documentation and testing that proves these features need 32 or 64 bits of bus width dedicated to JUST that tech to work properly.
The issue, of course, is that if that were true, then cards like the 4060 would be totally disproportionate in their performance to higher end cards. Except we don't see that. Those cards are right where you would expect them to be, given their core counts.
Operations on GPUs have been split into multiple clock cycles since the days of the Voodoos. It's not like that stuff just stops working.
I seem to remember similar claims back in the day, that the smaller buses could NEVER fill the needs of these newer GPUs and you NEEDED 384/512-bit for high end resolutions. And yet, they were not only fine, but they kicked the *** out of older cards. Cards like the GTX 660, with a 192-bit bus, beating the 384-bit GTX 480 in benchmarks. Funny how that works.
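To make the bus-width point concrete, here's a quick back-of-envelope bandwidth calculation (a rough sketch; the memory clocks are the commonly published reference specs for each card). Note the older 480 actually had *more* raw bandwidth, and the 660 beat it anyway, which is exactly why bus width alone tells you very little:

```python
# Peak theoretical memory bandwidth:
# bandwidth (GB/s) = (bus width in bits / 8 bytes) * effective clock (MT/s) / 1000

def bandwidth_gbs(bus_width_bits: int, effective_clock_mts: float) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * effective_clock_mts / 1000

# Reference specs (published effective GDDR5 data rates):
gtx_480 = bandwidth_gbs(384, 3696)  # 384-bit bus, 3,696 MT/s
gtx_660 = bandwidth_gbs(192, 6008)  # 192-bit bus, 6,008 MT/s

print(f"GTX 480: {gtx_480:.1f} GB/s")  # ~177.4 GB/s
print(f"GTX 660: {gtx_660:.1f} GB/s")  # ~144.2 GB/s
```

The 660 had roughly 19% less bandwidth on half... sorry, on a bus half as wide clocked much faster, yet it still came out ahead in games thanks to the newer architecture.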
Their Linux support is trash, and that's really why an nVidia product will never be an option for me. They said they're working on improving it, but I'll believe it when I see it. For the time being, Arc cards have better Linux driver support than nvidia cards. My next card is likely going to be an 8800 XT or a used 7900 XT/XTX if the price is right.
Their drivers work fine on Linux. I assume you're talking about the open source drivers. That's a totally lost cause and frankly makes no sense. "Oh I can't use a closed source driver, that's PROPRIETARY! I need OPEN drivers to run my proprietary gaming software!"
They admitted they skimped on development and marketing, and they did issue plenty of incentives, which scalpers used to buy cards at rock-bottom prices. Then the supply chain issues hit shortly after, and neither party could get new cards to market.
Oh, so again, it wasn't the fault of the consumer, but of the market itself. Interesting.
Frame gen is now considered an essential feature and if you still have a 30 series card, you're gonna need to use FSR to get it. I can't wait to see what new features DLSS 4 brings with the 50 series that 40 series users will need to start using FSR to get.
That's your opinion. FSR is also considered by many to be far inferior to DLSS, especially with text. I've tried FSR in games, and holy hell it looks awful.
Just remember: EVERYTHING happens for a reason.....
And the reason we are in this convoluted clustercrap sh*tfest is that stupid people continue to do stupid sh*t, like paying nGreediya's outrageous prices for GPUs over & over & over again, which only condones their actions of raising prices & only producing minuscule/pitiful performance improvements with every new generation of cards while constantly charging more for them!
They're just following Intel's lead, who has done the exact same thing for many, many years now....
So.... if you purchased an NV card in the past 5-8 years, this entire situation is YOUR OWN DAMNED FAULT! Just deal with it.
Yeah, consumers should have bought inferior hardware with worse drivers because......good cards bad?
Stop stanning for AMD. They don't care about you. AMD's GPU side has been ALL over the place, dropping out of markets or delaying products or screwing things up. Consumers like consistency, hence why those same consumers that LOVE "ngreedia" can't get enough Ryzen processors. Oh wait, those are AMD. Hmmmm..... (BTW, Intel held a monopoly on consumer CPUs for a decade and kept their price increases in line with inflation. AMD gets ahead, and the MOMENT they had the advantage, they jacked up the price of an 8-core CPU from $200 to $400+. AMD, the way it's meant to be gouged.)
If you have AMD GPUs, like me, and you're on the high end, AMD is leaving you high and dry. Again. This is the second time in a decade they've done this. You know what they say, fool me once shame on you, fool me twice.....
When AMD makes a good product, it sells. The RX 6000s sold as soon as they came in stock; their low numbers were AMD's fault, as they (correctly) prioritized Epyc production at the expense of consumer GPUs. The 7000s, when priced right, sold great. Of course, AMD played those ngreedia games and priced their cards wrong to drive consumers to higher end cards, and instead drove them to nvidia. Oops.