Stalker 2: Heart of Chornobyl GPU Benchmark

Thanks a million for also doing the benchmarks at High and Medium, because in many other lazy publications it often seems like only maximum quality matters, wtf!
Playing at max often means paying a huge performance cost for little visual improvement, and the vast majority of players don't have top-tier hardware anyway, so this is actually helpful. 👏🏻👍🏻
 
As someone who started coding in the early days of 8-bit computers, using various languages all through the '90s and right up until a few years ago, the performance of these newer games tells me that either they are terribly unoptimised (poor coding) OR they are deliberately made to run poorly, so as to drive higher-end GPU sales.
I couldn't for the life of me think of who could be "encouraging" this, though. I mean, who would want to push the public to buy more powerful GPUs with some form of upscaling?
I wonder if there are a lot of brown envelopes being passed around these days, again.
 
Unreal Engine 5 is a cancer.
LOL. Steam's survey, while a bag of cr*p, lists the RTX 3060 & 4060 as the most popular GPUs. Tough luck. Sorry, 1080p 30fps is the future.
Now Hardware Unboxed will have to retract their video on "Don't upgrade your GPU, upgrade your monitor."
 
Finally found a game that uses all 16 cores of my 3950X, and the CPU seems to be the limiting factor even with an old 2080 Super.

It seems like an unoptimised mess to me. With the graphics turned down far enough to run the game, it looks worse than the 15+ year old games that came before it, and those obviously ran on very limited hardware.

Not totally disappointed; I knew it would be full of bugs. Just waiting on the modders to fix it, like they did the last three games.
 
As someone who started coding in the early days of 8-bit computers, using various languages all through the '90s and right up until a few years ago, the performance of these newer games tells me that either they are terribly unoptimised (poor coding) OR they are deliberately made to run poorly, so as to drive higher-end GPU sales.
I couldn't for the life of me think of who could be "encouraging" this, though. I mean, who would want to push the public to buy more powerful GPUs with some form of upscaling?
I wonder if there are a lot of brown envelopes being passed around these days, again.
I remember hearing about this years ago: GPU companies "sponsoring" titles with exactly these dirty intentions. Let's hope that isn't the case once again :/
 
Pretty disappointing. Considering the system requirements, I expected the game would be better optimized... it looks like GSC pulled a 1990s-game-dev move with the system requirements (minimum requirements: it can run the title screen). TechSpot should have tested the 1060 6GB and RX 580, which are listed as the minimum requirements.

Given these performance numbers with more powerful cards, I wonder what their targets for the minimum-spec GPUs are. 30 fps on the Low preset at 1280x720, upscaled from 720x480?
 
I remember hearing about this years ago: GPU companies "sponsoring" titles with exactly these dirty intentions. Let's hope that isn't the case once again :/
Instead of making more affordable GPUs, they pay developers to make games more demanding so they can sell more expensive GPUs, while also making mid-tier GPUs cost what high-end ones did pre-COVID. Can't sacrifice margins; better to write those losses off to the marketing department.
 
What was the test scene for this benchmark? Was it the Game Pass or Steam version? I'm seeing vastly different results, on the "worse" side.
Which version is performing better? I'm currently running the Game Pass version and haven't experienced crashes or anything. The biggest issue I've noticed is the close-range spawning of AI; it's very immersion-breaking.
 
Which version is performing better? I'm currently running the Game Pass version and haven't experienced crashes or anything. The biggest issue I've noticed is the close-range spawning of AI; it's very immersion-breaking.
Running Game Pass too. I don't have the Steam version; I'm going to try downloading a "third-party" version. It's the GOG version, but not purchased, if you know what I mean.

The issue I'm having is GPU load consistency. If that weren't a problem, I think I'd get the same results Steve got.
 
As someone who started coding in the early days of 8-bit computers, using various languages all through the '90s and right up until a few years ago, the performance of these newer games tells me that either they are terribly unoptimised (poor coding) OR they are deliberately made to run poorly, so as to drive higher-end GPU sales.
I couldn't for the life of me think of who could be "encouraging" this, though. I mean, who would want to push the public to buy more powerful GPUs with some form of upscaling?
I wonder if there are a lot of brown envelopes being passed around these days, again.
Take off your conspiracy hat for a moment; no, it's exactly the same thing that happens in every other entertainment medium: cinema, TV, series, music.
The tech for visuals and sound keeps getting better, but the technical capability of the people using it comes second; flashiness and catchiness matter most.
Game studios and publishers won't allocate time and budget to optimisation or bother hiring really technically knowledgeable developers, and those developers won't bother learning to optimise, because as long as a product looks super cool, it will sell. That's what matters most.
 
As someone who started coding in the early days of 8-bit computers, using various languages all through the '90s and right up until a few years ago, the performance of these newer games tells me that either they are terribly unoptimised (poor coding) OR they are deliberately made to run poorly, so as to drive higher-end GPU sales.
I couldn't for the life of me think of who could be "encouraging" this, though. I mean, who would want to push the public to buy more powerful GPUs with some form of upscaling?
I wonder if there are a lot of brown envelopes being passed around these days, again.

It's an interesting theory, but I feel that if this were happening, someone would blow the whistle; there are so many people involved in these games. I think it's more that modern engines aren't the best, and commercial pressures prioritise content and sales over quality. Basically, "barely good enough" is the bar these commercial ventures are shooting for.
 
Finally found a game that uses all 16 cores of my 3950X, and the CPU seems to be the limiting factor even with an old 2080 Super.

It seems like an unoptimised mess to me. With the graphics turned down far enough to run the game, it looks worse than the 15+ year old games that came before it, and those obviously ran on very limited hardware.

Not totally disappointed; I knew it would be full of bugs. Just waiting on the modders to fix it, like they did the last three games.
Exactly. I was only able to play through all of Clear Sky a few years later, with the corresponding "Stalker Complete" mod.
 
@Burty117
Here, for example, Steve did a good job of doing full due diligence and testing every aspect of this game to its fullest. No complaints here. But this is a game; CPU and even GPU testing is a different matter.
 
As someone who started coding in the early days of 8-bit computers, using various languages all through the '90s and right up until a few years ago, the performance of these newer games tells me that either they are terribly unoptimised (poor coding) OR they are deliberately made to run poorly, so as to drive higher-end GPU sales.
I couldn't for the life of me think of who could be "encouraging" this, though. I mean, who would want to push the public to buy more powerful GPUs with some form of upscaling?
I wonder if there are a lot of brown envelopes being passed around these days, again.
To be fair, many recent games have extremely detailed graphics with gigabytes upon gigabytes of texture data. And even in the past there were games that were demanding for their time, like Crysis, or Batman: Arkham City maxed out with DX11 features enabled. But I think time, money, and talent constraints are also to blame for the lack of further optimisation. For example, it's insane what Nintendo achieved on the Switch hardware with certain titles like Tears of the Kingdom.
 
Started playing on my Intel Arc iGPU (Core Ultra 155H notebook, no discrete GPU). Surprisingly, it works... but performance is weird: ~30 fps on High settings with XeSS Quality upscaling at 1600x1000 (the OLED display is 3200x2000), with regular drops to 10-12 fps (stutters). Turning everything down to Low gets about 40 fps, not much of an improvement. "Compiling shaders" on every launch of the game is a real PITA.
 
4K Epic with DLAA plus frame generation looks way better than DLSS Quality alone, and I get 80 fps in either scenario in my own testing, on a 4090 with a 5950X.
 
@Burty117
Here, for example, Steve did a good job of doing full due diligence and testing every aspect of this game to its fullest. No complaints here. But this is a game; CPU and even GPU testing is a different matter.
This is a GPU benchmark, so he used lots of GPUs on a single game (because that's the article's title) at various quality settings.

The article you're upset about is very specifically a gaming benchmark between two CPUs. Both articles are exactly as advertised...
 
It's pretty likely to get a heck of a lot of cleanup patches, because it needs them. I'll play it in six months; it'll run better AND I'll have a better GPU. Same old story, I guess...
 