Consistent 60 FPS on old system

Hi All,

I am perplexed. I have a new PC - 8600K, new MB, 8 GB RAM, Nvidia 980 Ti GFX card. But the game runs poorly even on lower settings, with drops to 20-40 FPS in Rattay.

I installed the game on my wife's PC. It has a 2600K with an ATI 290X. I set the exact same GFX settings, and the game runs flawlessly at 60 FPS with vsync, with no framerate drops at all anywhere - in fighting, in the forest, in towns, or in Rattay.

Any ideas? Does the game favor ATI cards, or is there a setting I'm missing?

Thank you! Dimon

Not really (an ATI game)
How much memory is in your wife's PC?

Hello friend. Wife's PC has 16 GB DDR3; the new PC has 8 GB DDR4. Any thoughts? The new PC should be smashing this…

It has got to be the memory.

I went from 8 to 16 GB… it is night-and-day better!

i7 4790, 1080 Ti, 16 GB RAM

[H]ardOCP reported issues with Vega ATI cards, but no Hawaii (290) cards were tested.

This is from GAMEGPU. It shows that the 980 has the advantage.

The number of cores and threads also matters.

I have 16 GB RAM and 8 GB VRAM, but 4 cores and 4 threads @ 3.9 GHz, and get below 40 FPS at best.
I have a Radeon 580 card at 1700 MHz.

Should have 6 cores.

That ought to be enough.

I have 4 cores, plus 4 "HT" threads… I can tell that every now and then my games are CPU limited… but not by much, or often.

Random internet guy says:
"I'm getting pretty good results on my 980 Ti at 1440p. A mixture of max and medium settings; turning down shadows, shaders, and particles helps a lot, along with lower LOD settings.

Still, though, it's the first game in a while that makes me want to upgrade. Bring on the 2080 Ti!
Main: 5930K @ 4.6 GHz 1.26 V | Asus X99-A USB 3.1 | 32 GB 3200 MHz 13-13-13-33 | EVGA GTX 980 Ti @ 1530 MHz"

Often it is a case of 'you get what you pay for'. (The good news is: the new AMD parts punch above their weight…)

That 2600K system would have been top dollar. It probably had a great motherboard (and the controllers they typically come with).

The new rig having 8 GB of RAM is a point of contention; not so much the amount (I still give cheap gaming systems that much), but it is most likely a single stick of RAM - this stops the motherboard/CPU/RAM passing data optimally.
It won't affect all games, but KCD is one of a rare handful that push the bleeding edge. (Ignore the ignorant who don't understand the tech and think Ultra quality should perform equally in all games…) Make sure your RAM is in a dual-channel config.
You should have 2 x RAM sticks for dual-channel RAM access.
CPU-Z (freeware; doesn't require installation) can confirm your RAM setup.
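
If you'd rather script the check than eyeball it, here's a rough Python sketch that counts installed sticks (assuming Windows with the old `wmic` tool still present; CPU-Z's Memory tab remains the proper confirmation of dual channel):

```python
# Rough stick count via wmic (Windows; wmic is deprecated but still shipped).
# Quick sanity check only - CPU-Z's Memory tab is the authoritative source.
import subprocess

out = subprocess.run(
    ["wmic", "memorychip", "get", "DeviceLocator,Capacity,Speed"],
    capture_output=True, text=True, check=True,
).stdout

sticks = [line.strip() for line in out.splitlines()[1:] if line.strip()]
print(f"{len(sticks)} stick(s) installed:")
for stick in sticks:
    print(" ", stick)
if len(sticks) == 1:
    print("Single stick -> almost certainly running single channel.")
```

One stick in the new rig would line up exactly with the lows described above.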

Not all MHz are created equal… those early i7s had so much raw CPU grunt to burn that Intel kinda stopped pushing the bleeding edge and started focusing on power optimisations… they are now the low point of CPUs better than consoles, but will remain that way forever… games should continue to run well on that CPU (if the same games run alright on consoles).

Hyperthreading isn't like a whole core. Most benchmarks with KCD reveal that hyperthreading is worth roughly a 25% boost (not a 50% boost).
The six-core/twelve-thread Intels murder this game. I have the cheapest one and have dropped my overclocks since the 1.2 KCD patch, which made much better use of cores…
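
To put that 25% in concrete numbers, here's a back-of-envelope Python sketch (the 25% figure is just the rule of thumb above, not a measured KCD result):

```python
# "Effective cores" if hyperthreading is worth ~25% per core (rule of thumb,
# not a benchmark). Illustrates why 6c/12t pulls so far ahead of 4c/8t.
HT_BOOST = 0.25

def effective_cores(physical_cores: int, hyperthreading: bool) -> float:
    boost = HT_BOOST if hyperthreading else 0.0
    return physical_cores * (1 + boost)

print(effective_cores(4, True))    # 4c/8t  -> 5.0 core-equivalents
print(effective_cores(6, False))   # 6c/6t  -> 6.0
print(effective_cores(6, True))    # 6c/12t -> 7.5
```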

Most things between the rigs have subtle differences - it is fair to say that the 290X was likely the type of card they had in mind (or in their systems) when developing the game…

I have the lowest of the Fury cards, the 290X being the mainstream equivalent part (we are close - the 290X may have more RAM, but the Fury has more shaders and the more forward-thinking architecture).

My Fury / 6c12t Intel / M.2 SSD / 16 GB of quad-channel RAM murders this game.
My framerate averages about 45… the only things I have lowered are shadows and antialiasing, by a notch. (All sliders maxed; Ultra quality.)
My rig is testimony to three-to-four-year-old mid-tier rigs being able to play this game very well.

I would hazard that the biggest difference you are seeing right now is to do with the present build of the game not always remembering settings.

Check the old machine hasn't re-defaulted everything to High. (That might explain the numbers.)

Check the new rig is using dual-channel RAM. (Otherwise: big bottlenecks, like the lows you are encountering.)

Yes, games will favour AMD cards.
AMD cards are in all major consoles: where the majority of players play their games, and for whom games are built.

To extract the most from a graphics card, it helps if the card's features are used. Consoles being relatively low powered, they have to have their features utilised to punch above their weight. (Hence why a few first-party studios make Xbox One X games for 4K - not because it is easy, but because Microsoft want to sell it as 4K capable, so it pulls every trick!)

Nvidia made a decision with the Fermi architecture (quite a while ago now) to split their shader types: some long shaders, some short.
It basically required dedicating a team of software engineers to optimise every game for each card; Nvidia did that, and even hoodwinked consumers into believing they were the better company for it. Not really - look what happens to 'last year's cards': optimisation for them stops (until we consumers catch them out and force their hand). It constantly makes the new low-end parts look stronger than last year's top-end parts.
AMD give raw performance - it is why, over time, their cards give more and more vs the Nvidia equivalent.

I am not a fanboy - I have owned many, many top-tier Nvidia parts - but I left them behind when my 780 Ti only output 8-bit colour channels (10-bit was reserved by the drivers for Titans); my GTX 590 was allowed to do 10-bit colour channels, so it was an expensive 'upgrade' that didn't deliver.
Nvidia have declared that, if they have to, they will support the free tech FreeSync (even Intel graphics do that) - yes, I know they charge money for their special chips used in G-Sync (which force an extra frame of delay).

I prefer FreeSync - my 10-bit monitor is FreeSync - AMD are better optimised for (games); and I don't get the dirty feeling of supporting an anti-consumer company (Nvidia).

Yes - cue backlash from Nvidia users who have been buying graphics cards since the eighties and want to quote benchmarks; I understand this stuff. I get the architectural differences - I know what I am talking about.

Ethically speaking: AMD.
Price point and LONG-TERM performance: AMD.

Buying up consumer tech and burying it just to avoid a lawsuit or to sell more: Nvidia.

I sold my 780 Ti when The Witcher did PhysX on the CPU (like it always could - but Nvidia left the code in an outdated 90s state after acquiring Ageia, so that they could claim benefits to doing PhysX on their GPUs). (Magically fixed now that the consoles don't have any Nvidia hardware, and they still want to gimp games using their black-box HairWorks/GameWorks etc.)

An above post shows the 290X vs the 780 Ti. They were considered competitors when launched - the 780 beating it in most benches. After the next series of Nvidia cards launched, the 780's performance 'dropped' heavily.
Same thing happened to 9-series owners when the 10 series launched.
And next year?
Enjoy your forced upgrade, Nvidia users. For me - power irrelevant - the argument ends right there.

I enjoy owning my PC bits for five years and then moving them on.
Nvidia isn't so great unless you have deep pockets and just keep buying 'new'. (They do kill gaming (e.g. PhysX) in order to seem relevant; enjoy the Kool-Aid.)

Yes, your memory needs to be in 2s.

AKA, 2 sticks or 4 sticks.

Thanks guys,

really appreciate all the helpful feedback and support. I am really enjoying this game, maybe unlike any other game I have played. It would be great to see a multiplayer co-op mod at some point.

I was surprised that my wife's PC was able to play it at all, let alone better than the new PC I just paid a heap for.

Looks like I will need another 8 GB stick of DDR4.

Cheers, guys,

Pretty confident your 980 should be of near-equivalent grunt to my Fury (every other game benchmark would confirm)…

That being the case, I'd expect your new rig to run this game awesome.

Those 980s can be made pretty quiet too.
Potential lounge-room/hi-fi rig… (!)

Told you guys there's nothing wrong with the game. If you have a decent rig, KCD flies.

Let us know when you get the memory added!

I now have a strange temporary HW combo - Core i7 8700K (6 cores + HT), a GTX 970, and 8 GB DDR4 at 2400 (it's 3000 RAM, but when I tested it was at 2400) and… around Rattay I had a stable 60 FPS at High settings (+ some of them lowered), but in Rattay it was 30-60 FPS. So a stable 60 FPS depends on the GPU part, or it's just hard to achieve…

Yeah, you need a better GPU, or lower detail and/or resolution.

And Windows is funny. I went a LONG time with just 8 GB RAM, but recently (Windows update 1703+) I could tell it was holding my PC back. Your particular mileage may vary!

As for your RAM speed, 2400 is about the best you are gonna get… I may be wrong though…

2400/2666 (with tight timings) shows the best bang for buck.

Only a few games gain noticeable benefits from memory (speed) improvements.

Oddly (not really), it is typically open-world games that get a performance bump with regards to RAM.

I use an older platform with quad-channel RAM. Quad-channel RAM doesn't have bottlenecks and is largely unaffected by speed.
I can run my RAM at 3000. There's noticeably zippier desktop 'snap' up around 2800… I found 2400/2600/2666 to be the best trade-off (not having to voltage-bump my first-gen DDR4 sticks too much)…
Then I found I could just run at my system's expected 2133 and tighten the timings (from CL16 to CL13).
No voltage bump, no instability - and it made convo load times much shorter (nearly instant), prior to Warhorse improving KCD to have the short convo loads for everyone (a couple of patches ago).
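
For anyone wondering why tightening timings at 2133 pays off, the standard first-word-latency arithmetic explains it; a quick Python sketch (the configs are just the ones from my post):

```python
# First-word latency: CL cycles at a clock of (transfer rate / 2) MHz,
# i.e. latency_ns = CL * 2000 / rate. Standard DDR arithmetic.
def first_word_latency_ns(cl: int, rate_mts: int) -> float:
    return cl * 2000 / rate_mts

for cl, rate in [(16, 2133), (13, 2133), (16, 3000)]:
    print(f"DDR4-{rate} CL{cl}: {first_word_latency_ns(cl, rate):.1f} ns")

# DDR4-2133 CL16: 15.0 ns
# DDR4-2133 CL13: 12.2 ns  <- tightened timings, no voltage bump needed
# DDR4-3000 CL16: 10.7 ns  <- faster still, but needed more voltage on my sticks
```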

RAM isn't worth over-investing in (vs extra money on a GPU, as an example); having sufficient RAM is what we need. Having 'excess' isn't as beneficial as we would hope.
Having fast RAM doesn't help games a lot - at least from a benchmarking point of view.
It can assist system zippiness - e.g. alt-tabbing and Steam loading became very quick on my system after I tightened my RAM timings.

- the Steam speed bump was months ago, before I got yesterday's new Steam client, which is actually much quicker now. (Opening and closing.)

I have issues with performance on my PC too. The devs should release a guide for the settings. I have a Ryzen 1600, GTX 1080, 2666 RAM CL14, and a 560 MB/s SSD for games. One would assume this hardware should play at 60 FPS on Very High settings, but it can't. I am still unsure how each setting affects performance and which ones hit the CPU or the GPU; one guy says one thing, another guy says another thing, and there is no answer from the devs.

PC Gamer or one of the review websites had a great write-up.

Their CPU was top tier - they found 9 sliders had zero impact on framerate (when set to Ultra).
I find it is situation dependent - I have had ONE bad moment, in the Rattay courtyard at sunrise(ish), when my framerate went below the forties…

I normally run around with shadows and antialiasing down a notch and render at 50-60, with lows in the 40s.
Today I left everything at max (antialiasing on, all sliders up, every setting on Ultra); I had some framerate lows in the mid thirties, though most of the time it still ran at 50-60.

I did enjoy the grass rendering shadows on itself. Stunning.

The heavy hitters seem to be shaders and shadows.
The gaming rule of thumb has always been to set resolution to taste and use antialiasing as the final adjustment to dial in framerate…

E.g. when Crysis launched, a GeForce 8800 (320 MB) could run it all on Ultra settings with a good framerate @ 1280x720.

I chose to play Crysis on Ultra at launch on less-than-top-tier hardware.

With regards to any particular rig guaranteeing Ultra and sixty? That's only half the equation: e.g. KCD renders an open world, draws building interiors, and has a massive polygon count. I have never seen a game render wind so well - it is in the way that not all animations are in sync. Everything is crunching like crazy.

KCD doesn't rely on only one area of performance - it is a very rare game type (true open world) that needs no bottlenecks to succeed.

Once the console versions deliver (smoothly), it will prove a great tech demonstration of what modern games are doing.

Nature is hard. Just like when Dragon Age and Mass Effect had massive performance differences (space and spaceships/square buildings require far fewer resources than rolling hills and organic trees; Mass Effect was 60 frames at Ultra on laptops whilst the same hardware barely rendered 30s for Dragon Age. At the time they were the same engine/studio; the difference being: nature!)

2018 Ultra in KCD is the new reference Ultra.
Like the old question - "…but can it run Crysis?"
Forza 7 in 4K @ 60 is still nothing on Forza Horizon 3 (an open-world title). Forza 7 Ultra is easy by comparison.

I just did some testing, with 60 FPS and 30 FPS targets. Where should I post my data?

Who wants some?

A thread for posting system specs and settings/performance SHOULD exist…

Whether one does, or whether this is it - I don't know…

Just remember, peoples - turn vsync off and don't enforce the screen refresh rate as an upper limit.

Presently I average my numbers based on an upper cap of either 75 or 60… uncapped, I often see 90s.
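
Here's why the cap matters when quoting averages; a tiny Python sketch with made-up numbers (illustrative, not KCD measurements):

```python
# Made-up per-second FPS samples from an imaginary run - NOT real KCD data.
samples = [92, 88, 95, 60, 45, 90, 85, 97, 70, 90]

uncapped_avg = sum(samples) / len(samples)      # 81.2
capped = [min(s, 60) for s in samples]          # what a 60 Hz vsync cap reports
capped_avg = sum(capped) / len(capped)          # 58.5

print(f"uncapped average:     {uncapped_avg:.1f} FPS")
print(f"capped-at-60 average: {capped_avg:.1f} FPS")
# Same rig, same scene - the cap hides all the headroom above 60.
```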