OK, so we have the same GPU. What about CPU? I've noticed the GTX 1080 can struggle in KCD at 4K, sometimes even at 1080p/1440p. I honestly don't think the game engine is very well optimized for NVIDIA 900/1000-series GPUs, unless you have a 1080 Ti or some other mega-beast shit. But the CPU is the biggest factor.
I get about 80 fps with my i7 and GTX 1080, and I've got 16 GB of memory. Performance is always pretty smooth at the settings I play at. Sometimes it dips down into the high 50s or 60s, but my monitor has G-Sync so I don't really notice it. I've kept an eye on performance just to see what it's doing anyway. I've checked Afterburner: all my temps are fine, my clocks are consistent, everything checks out. It's just the game.
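If anyone wants to log the same stats without keeping an overlay up, here's a minimal sketch that polls nvidia-smi once a second (assumes nvidia-smi is on your PATH; the query fields are standard nvidia-smi properties — this is just illustrative, not what Afterburner does internally):

```python
# Poll GPU temperature, core clock, and utilization once a second,
# roughly the same readout Afterburner gives you. Ctrl+C to stop.
import subprocess
import time

FIELDS = "temperature.gpu,clocks.gr,utilization.gpu,utilization.memory"

while True:
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(out)  # e.g. "72, 1873 MHz, 98 %, 61 %"
    time.sleep(1)
```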
Yeah, I have a 2560x1440 144 Hz G-Sync monitor. If I play the game on high settings, I get about 80 fps; if I keep the same settings and drop to 1080p, I get about 110 fps. But because they're both over 60 fps and I have G-Sync on, the loss of quality isn't worth the extra 30 frames. Not with a game like this, anyway.
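Quick back-of-envelope on why 1080p only buys about 30 extra frames — if the game were purely GPU-bound you'd expect more. Just illustrative arithmetic, not a claim about the engine:

```python
# If fps scaled purely with pixel count (a pure GPU limit), dropping from
# 1440p to 1080p at 80 fps should land near 142 fps, not 110.
pixels_1440p = 2560 * 1440   # ~3.69 million pixels
pixels_1080p = 1920 * 1080   # ~2.07 million pixels

predicted = 80 * pixels_1440p / pixels_1080p
print(f"Predicted if GPU-bound: ~{predicted:.0f} fps, observed: ~110 fps")
# The shortfall hints that at 1080p the CPU starts becoming the limit,
# which fits the thread's point about KCD being CPU-heavy.
```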
@Shady: But exactly WHICH i7? The i7 brand has been around for a very long time, and the old ones get smashed by KCD. Is it a newer unlocked K model that supports overclocking? I have an i7-7700K; it's plenty fast, but I'm considering an upgrade.
7th generation Kaby Lake
But like I said, it's not really a performance issue. I get lots of frames and it runs buttery smooth. I just get weird bugs in-game.
@Shady: Then our hardware specs are very similar. Sounds like we have the same GPU, probably the same or a close CPU, and the same amount of RAM. Don't know how the fuck you're getting 80; I get approximately 60 at 1440p with all settings on very high, motion blur off, anti-aliasing off. Even the in-game settings say ultra high is meant for future hardware… AKA supercomputers / uber-expensive gaming rigs.
I have everything on high except two settings: I turn shaders and shadows down to medium.
@Shady: OK, I see. I like a good balance between performance and visuals: 40+ is good, 60+ is ideal. I would really like to play at 4K, but KCD is an extremely demanding game. I honestly believe the game engine isn't very well optimized for the latest-gen NVIDIA cards. In other games I can play at 4K, settings maxed or near maxed, at 60+ fps almost everywhere, but not KCD.
I tried playing with everything on high or better, but the frame drops are unbearable. If I go into a forest, or into some areas that aren't optimized well, I can tell I'm below 60, and it drives me nuts because I haven't had to deal with that in years. Plus I usually play faster-paced FPS games, so I've already learned to live with the quality sacrifice. High-refresh gaming is weird: even with flagship everything, you can run into situations where you're getting 144 fps but also constant micro-stutters or frame drops, because the CPU is being mauled. I've actually made games run smoother by turning graphics settings UP in that situation, because the frame rate drops and the CPU gets to relax. Even 7th-gen i7s weren't that happy cranking out 144+ frames (in some games) when they were brand new.
In other words, 100% GPU usage is preferable to 100% CPU usage, lol. As if that weren't obvious. But you don't run into those issues at 60 Hz.
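If anyone wants to actually check which side is pegged instead of guessing, here's a rough sketch (needs psutil, i.e. pip install psutil, plus nvidia-smi on PATH; the 95%/90% thresholds are rules of thumb I'm assuming, not official numbers):

```python
# Crude bottleneck check: if fps is low but GPU utilization sits well
# under ~95%, the GPU is waiting on the CPU. Per-core CPU numbers matter
# because games often hammer one or two threads. Assumes a single GPU.
import subprocess
import psutil

def gpu_util_percent() -> float:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return float(out.strip())

per_core = psutil.cpu_percent(interval=1.0, percpu=True)
hottest_core = max(per_core)   # the thread the game leans on hardest
gpu = gpu_util_percent()

if gpu >= 95:
    print(f"GPU-bound (GPU {gpu:.0f}%) -- usually the smoother case")
elif hottest_core > 90:
    print(f"Likely CPU-bound (hottest core {hottest_core:.0f}%, GPU {gpu:.0f}%)")
else:
    print(f"No obvious bottleneck (hottest core {hottest_core:.0f}%, GPU {gpu:.0f}%)")
```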
@Shady: Then the only solution is to befriend Bill Gates, LOL. Get him to buy you terabytes of RAM, dozens of GPUs, dozens of CPUs running in parallel on server-grade hardware, NVMe SSDs.
Oh, I don't really mind most of it. I actually enjoy it. I built my PC in early 2017, right before the Intel security-vulnerability BIOS and Windows patches, so I had to go back and flash the BIOS and update Windows after the fact. Not fun. Everybody was jumping on the bandwagon back then telling me I was going to lose 50% performance from the patch, etc. I don't really see it, though. It's probably <2%.
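For what it's worth, the scary benchmark numbers came from syscall-heavy workloads, since the Meltdown mitigation (KPTI) makes every user-to-kernel transition more expensive, and games don't make that many syscalls per rendered frame. Here's a crude illustrative microbenchmark of the kind of operation that got slower — not rigorous, just to show where the cost lives:

```python
# Time a tight loop of stat() calls -- each one crosses into the kernel,
# which is exactly the transition KPTI made pricier. Game frame rates
# barely move because they do far fewer syscalls per frame than this.
import os
import time

N = 200_000

start = time.perf_counter()
for _ in range(N):
    os.stat(".")
elapsed = time.perf_counter() - start

print(f"{N} stat() calls in {elapsed:.3f} s "
      f"({elapsed / N * 1e6:.2f} us per call)")
```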
@Shady: Do you have a 120+ Hz monitor? VSync on or off? What about NVIDIA G-Sync (enabled/disabled)?
It's a 144 Hz monitor, and G-Sync is always on, but I'm pretty sure that happens after the game in the chain of things. I could be wrong, but I think my monitor has a hardware module inside it that drives the display's refresh rate, whereas AMD's FreeSync uses the VESA Adaptive-Sync standard built into the display's scaler instead of a dedicated module.
@Shady: Are you using an actual computer monitor, or an HD TV as a monitor? Some TVs have issues when used as PC monitors, like scaling, refresh rates, etc.
It's a monitor, a Dell S2716DG.
I don't think there are any 144 Hz TVs, are there? I know some of them are marketed as 120 or 240 Hz with post-processing, but they're 60 Hz native; they use motion interpolation to fill in fake frames and simulate 120 or 240. There might be a native 120 Hz TV out now, but it's probably like five grand.
Oh nvm, it’s the bandwidth limitation of HDMI.
The new HDMI spec (HDMI 2.1) will lead to high-refresh TVs if there's demand for it, but right now HDMI bandwidth is the issue.
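The arithmetic behind that, roughly — the link rates are the published HDMI numbers (2.0: 18 Gbps, 2.1: 48 Gbps), but the ~20% blanking overhead is my own approximation, so treat the output as ballpark:

```python
# Uncompressed 8-bit RGB video bandwidth, padded ~20% for blanking
# intervals (an approximation -- real timings vary by standard).
def required_gbps(width, height, hz, bits_per_pixel=24, blanking=1.20):
    return width * height * hz * bits_per_pixel * blanking / 1e9

for label, hz in [("4K @ 60 Hz", 60), ("4K @ 120 Hz", 120)]:
    print(f"{label}: ~{required_gbps(3840, 2160, hz):.1f} Gbps "
          "(HDMI 2.0 = 18, HDMI 2.1 = 48)")

# ~14.3 Gbps for 4K60 squeezes into HDMI 2.0; ~28.7 Gbps for 4K120 does
# not, which is why native high-refresh 4K TVs had to wait for HDMI 2.1.
```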