Misconceptions in tech are endless, but myths about low-res CPU benchmarks top our list. Let's revisit this hot topic and test the new 9800X3D at 4K to clarify what really matters.
"A proper 4K test should be included."
There is no question here: Steve and TECHSPOT are in the right.
In the opposite corner, we have a bunch of Intel shills.
During the heyday of Intel CPUs (circa 2014, the Haswell era) these very same people would passionately argue for testing CPUs at 720p to show off the prowess of gaming beasts such as the 4790K. Today the very same people argue for the opposite: testing at 4K.
Money talks and uh well you know what walks.
*This comes from someone who still owns a Haswell CPU and a 1080.
There were plenty of people who said that testing at 1080p doesn't really show realistic results when Zen and Zen+ were released. Just check the reviews of those CPUs.
A proper 4K test should be included.
If I look at the TechSpot test, it makes it look like upgrading my 5800X3D will give me roughly 30 fps more in games.
If I look at TechPowerUp, which includes 4K testing in their reviews, I see that in most games at 4K I will only gain roughly 3 fps. I think the difference between the 9800X3D and the 5800X3D was only 8 fps on average across all the games they use for testing.
So if I were one of those people who relies on just one site, it would seem worthwhile to spend $500 on a system upgrade.
Meanwhile I would only gain a few fps for my outlay.
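To put rough back-of-the-envelope numbers on that (using the approximate figures above, not exact review results):

# Rough value-for-money check using the approximate figures above: a ~$500
# platform upgrade, ~30 fps gained in CPU-limited (1080p) testing, and ~3 fps
# gained at 4K with this particular GPU. Ballpark numbers only.
upgrade_cost = 500          # USD, approximate cost of the upgrade

gain_at_1080p = 30          # fps gained when CPU-limited
gain_at_4k = 3              # fps gained at 4K with this GPU

print(f"1080p: ~${upgrade_cost / gain_at_1080p:.0f} per extra fps")   # ~$17
print(f"4K:    ~${upgrade_cost / gain_at_4k:.0f} per extra fps")      # ~$167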
There's still a "myth" here.
The other elephant in the room is all the "professional reviewers" using RTX 4090s, mostly at 1080p, in those reviews. Your article brushed upon some of the misconceptions, but the most popular GPU on the Steam hardware survey today is an RTX 3060/4060. Point being, it's rather impractical for consumers to purchase a $1,500-$2,400 GPU to showcase CPU gaming "performance". But sure, the bar graphs might look better.
So run these same tests with the 9800X3D and a far more common RTX 3060/4060 at 1440p or 4K. Suddenly the "AMD X3D is the best CPU for gaming" claim becomes margin of error. But again, the former sells clicks.
Add one more thing: the multi-CCD X3D CPUs, like the x900X3D and x950X3D parts. Now you need specific BIOS settings, thread-directing chipset drivers, the Microsoft Game Bar software, and a service to make "core parking" happen correctly. That's a fragile way to get the "best" performance in gaming. Multi-CCD X3D CPUs used mostly for gaming above 1080p are an expensive, niche proposition.
PC Magazine - Best CPUs for gaming.
Average FPS is not what matters when considering a gaming CPU. Exactly zero people experience the average FPS of many games when they play a single game.
It's the 1% lows in the specific game you're playing which most strongly affect gameplay. Look at those and see if those numbers are good enough for you. If not, then you need a better CPU. Averages are just a general indicator to get you looking at the right range of CPUs.
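For anyone unsure what "1% lows" means in practice, here is a minimal sketch of one common way it is computed from a frame-time log (capture tools differ in the exact definition, so treat this as illustrative):

# Illustrative 1% low calculation from a list of frame times in milliseconds.
# Tools differ in the exact definition; this version averages the slowest 1%
# of frames, which is one common approach.
def one_percent_low_fps(frame_times_ms):
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    count = max(1, len(worst) // 100)              # the worst 1% of frames
    avg_worst_ms = sum(worst[:count]) / count
    return 1000.0 / avg_worst_ms                   # convert ms/frame to fps

# Example: mostly ~8 ms frames (~125 fps) with occasional 25 ms stutters.
frames = [8.0] * 990 + [25.0] * 10
print(f"Average fps: {1000.0 / (sum(frames) / len(frames)):.0f}")   # ~122
print(f"1% low fps:  {one_percent_low_fps(frames):.0f}")            # ~40

The point being that the stutters barely move the average but dominate the 1% low.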
Take a look in the forums of popular PC sites and you will notice people recommending or buying CPUs like the 7800X3D and 9800X3D while using a 4070 Ti or lower at 1440p or 4K. There is no myth.
When testing the best CPUs available for gaming, you don't gimp them with a 3060/4060 or you will not get any relevant data on how fast they are. You use the best GPU available at low resolutions instead. Steve already addressed this at length in this review.
But the real issue here is:
How effing stupid do people think gamers are? Does anyone making the argument that a 9800X3D is useless for 3060/4060 gaming think that someone who spent $200-300 on a GPU would consider a $500-600 CPU? Do you actually think a 3060 gamer is swayed to buying a 9800X3D by a glowing review? Anyone who makes this argument is basically saying that 3060 buyers are a pack of effing *****s.
I give the average gamer one hell of a lot more credit than that. They can very likely figure out that the 9800X3D is the very best gaming CPU out there, and then buy a 7600 or 12600, which is a better match for their system.
"Is the Ryzen 9800X3D Truly Faster for Real-World 4K Gaming?"
You need to change your title. You say 4K gaming, but you enable DLSS, so the game is being rendered at less than 1440p, not 4K.
That means there aren't any 4K results in your review.
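For reference, the commonly published DLSS per-axis scale factors put the internal render resolution well below 4K; a quick calculation (these factors are the usual defaults and can vary by game and preset):

# Internal render resolution at a 4K output for the commonly cited DLSS
# per-axis scale factors. These are the usual published defaults and may
# vary by game and preset, so treat the results as approximate.
output_w, output_h = 3840, 2160

dlss_modes = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
}

for mode, scale in dlss_modes.items():
    w, h = round(output_w * scale), round(output_h * scale)
    print(f"{mode:<12} ~{w}x{h}")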
The problem with this is that we don't know what games you're playing right now or the games you'll be playing in the future.
We also don't know what settings you'll be using, as this depends largely on the minimum frame rate you prefer for gaming.
As mentioned earlier, we recognize that most of you understand this, though a vocal minority may continue to resist low-resolution testing.
Low-resolution testing reveals the CPU's true performance, helping you make informed "cost per frame" comparisons.
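As a concrete sketch of what such a "cost per frame" comparison looks like (the prices and frame rates below are made-up placeholders, not figures from this review):

# Hypothetical cost-per-frame comparison using CPU-limited (low-resolution)
# averages. Prices and fps values are placeholders for illustration only.
cpus = {
    "CPU A": {"price_usd": 479, "avg_fps_1080p": 180},
    "CPU B": {"price_usd": 359, "avg_fps_1080p": 165},
    "CPU C": {"price_usd": 229, "avg_fps_1080p": 130},
}

# Rank by dollars per CPU-limited frame, cheapest per frame first.
for name, d in sorted(cpus.items(), key=lambda kv: kv[1]["price_usd"] / kv[1]["avg_fps_1080p"]):
    print(f"{name}: ${d['price_usd'] / d['avg_fps_1080p']:.2f} per frame")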
For those who still disagree, have you considered why every credible source and trusted industry expert relies heavily on low-resolution gaming benchmarks when evaluating CPU gaming performance? There isn't a single respected source that doesn't.
We stopped including those tests after initially caving to demand, realizing they can be misleading and offer little value.
"In the opposite corner, we have a bunch of Intel shills."
I don't see a bunch of people complaining that the wrong CPU or brand was handed the crown. To me this discussion is mostly about the lack of additional information around "and how much of that potential difference could I expect to realize for my actual use case?"
"How effing stupid do people think gamers are?"
I must be pretty effing stupid, because I have no idea how to extrapolate from 1080p/4090/low results to 1440p/3080/high going just on paper. Is there a formula for that? I don't believe it's as simple as "none, because you're entirely GPU limited", nor as dramatic as the single-case charts might suggest. Where exactly it lands in between is not an equation I can solve for myself.
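For what it's worth, the usual back-of-the-envelope heuristic (an approximation, not something from this review) is that your frame rate lands near whichever limit is lower: the CPU-limited fps from low-resolution testing, or the GPU-limited fps for your card at your resolution and settings. A minimal sketch, with made-up numbers:

# Rough bottleneck heuristic: actual fps is approximately capped by whichever
# of the two limits is lower. Inputs here are made-up illustrative numbers;
# real games don't follow this perfectly, and it says nothing about 1% lows.
def estimated_fps(cpu_limited_fps, gpu_limited_fps):
    return min(cpu_limited_fps, gpu_limited_fps)

# Hypothetical: a CPU good for ~190 fps in a 1080p/4090 test, paired with a
# GPU that manages ~95 fps at 1440p/high in the same game.
print(estimated_fps(cpu_limited_fps=190, gpu_limited_fps=95))   # -> 95

It is crude, but it shows roughly how a CPU-limited number plus a GPU-limited number for your own card gets you most of the way to an answer.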