AMD's Ryzen 7 9800X3D is here, bringing powerful upgrades with 3D V-Cache tech and enhanced thermals. Without question, this is the best CPU released since the 7800X3D, making it a game-changer.
Great, now I hope they will use this knowledge to improve their GPU family. I will buy a new AMD GPU next year, and I know it won't be top of the line, but I do hope it gets much better in the next iteration.

Since I've gone all in on Linux, AMD GPUs are my only option. Nvidia said they're going to work on improving their Linux drivers, but I'll believe it when I see it. So, for the foreseeable future, AMD GPUs it is. Raytracing I see as a disappointment outside of Cyberpunk, and I do play a lot of CP. Aside from that, I get ~90 fps in most games at 4K with my 6700 XT. I'm hoping the 8800 XT will give 4080 levels of performance for ~$500. AMD doesn't need a flagship, they need a midrange monster.
I'm in the same position, and with a 6900 XT I'm still in a good place for gaming, but it's not always easy to drive my 1600p ultrawide. So I do hope for some nice gains with the 8000 series; I know it will be a bit limited, but I just don't want to invest in a 7900 XTX now. I'm also looking for some gains in DaVinci Resolve, even if it already works really nicely in a distrobox (to better utilize hardware acceleration). And yes, RT on AMD is not the best; that's one area where the 8000 series promises some improvements.
So then the 7900 XTX is an interesting beast, because it's a popular card for people who want to play with AI but don't want to pay for a 4090 or a workstation card, so there is actually a lot of demand for it. The 7900 XT, though, is a bargain relative to it: you lose about 10% of the performance, but it can be found for about 30% less than the XTX on sale fairly often.
Makes me wonder if AMD would just be better off going X3D down the whole stack.
I understand why CPUs are tested this way, but can anyone help me understand the likely impact on a non-competitive gamer more oriented to visuals than frame rate? For example, say a 4070-class card running high-to-ultra settings getting 60-90 fps. In this type of scenario, where a lot of the bottleneck is going to be on the GPU, does this or any CPU continue to make a noticeable difference, or do the charts all compress to the point where all the CPUs get similar results?

If your GPU is at 99%, no CPU, not even a 14950X3D from the future, can make it go faster. So: zero impact.
If the CPU can get 200 fps in a game but you play at higher resolutions and your GPU can only get 90 fps, the CPU makes minimal difference. It can sometimes help with better minimum frames and less stuttering, but it won't get you past your current bottleneck, which is the GPU.
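A rough way to picture that argument, as a minimal sketch (the 200/90 numbers are just the ones from the comment above, and the helper name is made up for illustration): the frame rate you actually see is capped by whichever side is slower.

# Minimal sketch: observed fps is capped by whichever side (CPU or GPU) is slower.
def effective_fps(cpu_limited_fps, gpu_limited_fps):
    # The slower limit wins; the faster side just ends up waiting.
    return min(cpu_limited_fps, gpu_limited_fps)

print(effective_fps(200, 90))  # 90 -> GPU-bound, a faster CPU adds nothing
print(effective_fps(80, 90))   # 80 -> now the CPU is the limit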
Interesting, thanks for the response. So I'd take that advice to mean: check your GPU usage, and if it's mostly pegged at 99% you've got enough CPU, and if it's not, then you don't. I should probably start paying more detailed attention, but my general sense is that I'm usually not that high. On the other hand, my CPU is almost never near 99% either, certainly not as an all-core total, but I think usually not on any individual core either. Maybe platform bandwidth (not sure if that's the right word, but I mean the general ability to move data around the system) is a limiting factor?
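For what it's worth, on Linux with the amdgpu driver you can check that "pegged at 99%" figure directly. A rough sketch, assuming your card exposes the gpu_busy_percent sysfs file (the card0 path and the file's availability depend on your kernel and driver version):

import time
from pathlib import Path

# amdgpu exposes an instantaneous GPU load figure here on most recent kernels;
# adjust card0 if you have more than one GPU (e.g. integrated + discrete).
BUSY = Path("/sys/class/drm/card0/device/gpu_busy_percent")

def average_gpu_load(seconds=30, interval=0.5):
    # Poll GPU utilisation while a game is running and return the average.
    readings = []
    end = time.time() + seconds
    while time.time() < end:
        readings.append(int(BUSY.read_text().strip()))
        time.sleep(interval)
    return sum(readings) / len(readings)

if __name__ == "__main__":
    # Roughly: consistently ~99% means the GPU is the bottleneck and a faster
    # CPU won't add frames; well below that, something else (often the CPU) is.
    print(f"average GPU busy: {average_gpu_load():.0f}%")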