Ray Tracing?

Hello everyone

So… before I even think about selling my soul (and a few other things) to get an Nvidia RTX 2080 Ti, I would like to know: are there plans to add ray tracing to the game? :slight_smile:

Shadea

1 Like

Ray tracing was in the beta version and the forest was beautiful, but it was probably removed due to optimization: https://www.youtube.com/watch?v=HJbi0n1Ku-E

1 Like

An advanced lighting model seems to be in the game.

Mostly indoors, and it might be a bugbear to sort out: it is very hard to have a whole township of houses that can be walked in and out of (without loading) with complex lighting models, though they do seem to run in some capacity outside.

I was truly blown away when I saw light reflecting off a guard’s shield on the battlements (i.e. outside). Whilst I am not confident the ultra lighting model is always active (or should be), I would think a future patch is VERY LIKELY to further cripple advanced settings.

So far, 80% of the optimisation I see (I’m just an enthusiast, not checking ALL features with every update) is aimed at the lowest-common-denominator systems, and often comes at the expense of quality in the Ultra settings presets.

I truly wish users could educate themselves as to why 2018’s Ultra settings shouldn’t yield performance similar to games that barely push any processor (and no doubt run at 60 frames per second on consoles)…
This game’s lighting model alone used to be the stuff of benchmarks.
I read an article yesterday discussing how it is time for real-time complex lighting models in games…
yet when a game does it, the uneducated slam it for not yielding the ‘same performance’ as every other game on their system.
(Not even getting into the zealots who compare first-party exclusives with multiplats etc…)
(Yes, I agree Horizon Zero Dawn is lovely and one of the best examples of a game utilising the full power of the system it runs on, and I wouldn’t expect anything less from the makers of Killzone: Mercenary (Vita); they shoehorned their PS3 engine onto the less powerful Vita by using the hardware sound chip etc. Literally some of the best examples of first-party exclusives…)

With regards to the game performing well: it needs more than just a good video card.
My machine is a mix of parts, mostly quite a few years old.
My video card is still relevant: an AMD Fury.
I render the game in Ultra (everything, including the detail/distance sliders), with the exception of shadows, which I drop depending on my target framerate.
At regular ‘Full HD’ I net 45fps lows (with the exception of a few broken game models that drop EVERYONE’S framerate, and will be fixed in time), generally running at around 50-60 frames per second.

On the 21:9 monitor I have to drop antialiasing to 1TX, and I generally stay within my FreeSync range (40-70), meaning I experience no screen tearing…
I do drop into the 30s sometimes, but I also run volumetric fog (`e_VolumetricFog=1`), and the framerate lows happen during VERY COMPLEX light modelling, e.g. sunrise hitting Rattay… usually when my CPU is processing civilian AI etc…
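(As an aside: console variables like the fog one above can be persisted in a `user.cfg` text file in the game’s install folder, a common CryEngine convention; the exact file name/location for KCD is an assumption on my part, so verify it before relying on it.)

```
e_VolumetricFog = 1
```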

Remember, most of my rig is quite old, but it is generally all ‘good enough’ not to hold this game back.
It loads in 10 seconds, and I can ride through Rattay at full gallop with no ‘pop-in’ etc.

I wouldn’t buy an Nvidia card, and especially not for this game, but that is ‘another story’.
(I am platform- and hardware-agnostic, and buy as an ‘ethical consumer’… Nvidia are pure commercial evil and a part of the Microsoft/Intel ‘death of gaming’ that is taking place…)

These are some shots from Talmberg.
The first daytime/outdoors example I saw of this game’s photon-following goodness was a red reflection on a wall, cast by sunlight off a guard’s shield.

The effect made my jaw drop.
Whilst seeing the same effect indoors before that moment was ‘awesome’, seeing it outdoors explained a lot of the framerate range I was experiencing… (I, like many others, have grown accustomed to setting all software titles to Ultra and expecting to hit 80 frames per second.)

I let my un-overclocked video card show some metrics on screen.
The 42 frames per second is due to a driver setting (AMD Chill), and not a reflection (pun intended) on the card.

I believe the perfect card for this game is a Nano or Fury X; both are seriously better than my standard Fury.

With 1070s finally making the second-hand rounds, bargains like an AMD Fury should be cheaper/easier finds (keep a watchful eye)…
An AMD Fury often outpunches a GTX 1070 in this title… obviously your mileage may vary (depending on settings/resolution), but in general a Fury should leave RX 580s in its wake… unless at 1920x1080, or in some limited test that makes use of all the extra video card RAM.

The Fury is the only 4GB card still holding upper-chart positions in many titles. It is nearly perfect for sub-4K gaming.

Whilst the photos above are not examples of the light scattering, I can confirm that retail ran with the complex light modelling.

Given Warhorse’s penchant for destroying Ultra settings (many patches have gimped the Ultra detail settings), who knows how deep the rabbit hole will go.

Ideally, custom *.ini/config files and/or modders can return a lot of the Ultra-quality goodness. It is inherent to the engine and available to us… and seeing it running is the BEST PC GRAPHICS I have ever seen outside of tech demos… (and yes, I run Witcher 3 in Ultra plus full HQ HairWorks, and Battlefield in Ultra etc., in 21:9).

1 Like

Warhorse should leave the console versions as they are (maybe tweak performance), but please take the PC version and add all these Ultra graphics features!!! There is already a special Ultra setting in the options, so please enable everything that makes the game look better. Don’t worry about performance in Ultra mode. That’s what Ultra mode is meant for: graphics enthusiasts and future hardware!

So please add ray tracing, volumetric fog, and everything that modders can enable in the .ini/config files as default features of the game.

1 Like

I have a feeling that everybody is talking about different things here.

Nvidia just announced new graphics cards called RTX (RT stands for ray tracing) to surpass the current GTX line.
They will introduce ray-traced lighting that allows several things, like real-time high-resolution mirrors, bounced light, and so on. That is something new, not yet used in games, and people will need those new cards (not yet available) for it. So it is nothing you can just enable, nor something that was in the beta.

CryEngine uses voxel-based global illumination for a similar lighting effect. Instead of shooting rays from the camera, tracing how they bounce, and then colouring the pixel accordingly, it separates the world into boxes that remember how much light reaches them from light sources or from other nearby boxes. This is then used to calculate how much light falls on the nearby environment and characters (and what colour that light is: hence the red reflection in the screenshot).
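To make that concrete, here is a toy sketch (my own illustration in Python, not CryEngine code, and grossly simplified): direct light is “injected” into a coarse grid of boxes once, and a surface point then just averages the light stored in its neighbouring boxes instead of tracing rays per pixel.

```python
# Toy voxel-based indirect lighting (my own illustration, not CryEngine code).
# Real engines use a sparse voxel octree and GPU cone tracing; this flat grid
# only demonstrates "store light in boxes, then sample nearby boxes".

GRID = 4  # a 4x4x4 voxel grid covering the whole toy scene

def inject_light(lights):
    """Accumulate direct light into every voxel (the 'injection' step)."""
    voxels = [[[0.0] * GRID for _ in range(GRID)] for _ in range(GRID)]
    for (lx, ly, lz, intensity) in lights:
        for x in range(GRID):
            for y in range(GRID):
                for z in range(GRID):
                    d2 = (x - lx) ** 2 + (y - ly) ** 2 + (z - lz) ** 2
                    voxels[x][y][z] += intensity / (1.0 + d2)  # simple falloff
    return voxels

def indirect_at(voxels, x, y, z):
    """Indirect light at a surface point: average the neighbouring voxels."""
    total, count = 0.0, 0
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for dz in (-1, 0, 1):
                nx, ny, nz = x + dx, y + dy, z + dz
                if 0 <= nx < GRID and 0 <= ny < GRID and 0 <= nz < GRID:
                    total += voxels[nx][ny][nz]
                    count += 1
    return total / count

voxels = inject_light([(0, 0, 0, 10.0)])   # one light in a corner
near = indirect_at(voxels, 1, 1, 1)        # surface point close to the light
far = indirect_at(voxels, 3, 3, 3)         # surface point far from the light
print(near > far)                          # the nearer point gathers more light
```

The trade-off is the same as in the real thing: the per-pixel cost becomes a cheap lookup, at the price of light being quantised into boxes (which is also roughly why a coloured bounce, like the red shield reflection, comes “for free” once the voxels store colour).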

To answer the original question, no plans were announced. And I think it is way too early to think about it.

2 Likes

Is there a chance of an engine update for KCD to CryEngine 5, as many other games got after release in a later “Enhanced” or “Remastered” version?

Will the next act use that new engine or the same as original KCD?

I’m not a fortune teller nor a programmer, but in my personal opinion I would not count on that kind of remaster.
KCD had to adjust the engine heavily, and those custom changes are probably not easy to carry over to the new engine version; it might be a similar effort to develop them again. I guess if the game were a shooter on a tropical island, it would be much easier.
That said, it has implemented some features of newer versions of the engine, like the mentioned Voxel Global Illumination.

As for future project I have no idea.

1 Like

Thanks Wenceslaus

I’m more of a fortune teller than a programmer.
We are talking about different things, no doubt.

Many years of buying the latest video cards, keen to see their new tech utilised, have taught me: buying the latest card for some FUTURE feature is not viable. It seldom pans out!

Either the first-gen card of some new super-feature doesn’t have the grunt to play (well) actual software using the feature, or subtle revisions of the evolving tech mean the original cards are generally troublesome or poor at rendering said feature.

Examples (kept extremely short, as I’ve owned just about every second-year release of video cards from all the major players): that Matrox card back in the nineties that implemented ‘bump mapping’; it took two years for ‘a few’ titles to start using the feature, and those titles were of lacking quality and were likely bought by other punters like me looking for a use…
The Apocalypse 5D (two cards in one) had an early PC version of PowerVR tech (now put to brilliant use, twenty years later, in iPads and mobiles). At the time it was great for the included version of Wipeout, and that was about it. I generally used the Voodoos and their market standard for games… (it was a dual-CPU PC with four video cards, in the nineties…)

When tech is announced, it takes a while before someone incorporates the features into their engine. You generally need a tech demo to show off the feature and garner interest.
Then a studio will license said engine and spend four years making a game that uses the feature.

Now, think about the video cards in use five years ago and reflect on how well they would run a cutting-edge, tech-pushing game… (early adopters would upgrade their cards to see the feature ACTUALLY run, and not slideshow…)

I ran Deus Ex on my high-end rig and set the graphics as high as my monitor went (resolution and in-game settings). One frame every thirty seconds…

S m o o t h (!)

When I bought the Nvidia FX series (for the new features), that card never saw those shaders used outside of tech demos. By the time game engines used the new features, the card was ‘outdated’.

Sure, buy powerful cards; just get cards with features you are willing to use today (like raw speed).

A bird in the hand is worth two in the bush…

1 Like

The new Tomb Raider will be good for one playthrough with it; Battlefield could prove a daily use…

VR support would also be nice. The game is first person, so this would work great.

Also thoroughly discussed, tried by the devs with bad results, and so abandoned.

Not necessarily… when we factor in ‘head tracking’ (the camera following head movements), we need very low latency (think: high framerate)… and rendering to two screens is a performance hit. Nvidia/Microshaft also killed off dual-card gaming, as the older-gen video chips would have flogged the nine and ten series cards (architecturally designed to output the slightly askew ‘dual screen’ views), and Microshaft just wants to sell Xboxes (there’s no money for them in PC gaming, e.g. network subscriptions/licensed peripherals)…
So whilst the tech was there… and we were moving towards it…
The big dogs have to make quarterly profits, and the people willing to pay follow them religiously…
(5-10% extra speed over a competitor’s ETHICAL sale isn’t worth it, and is killing gaming/IT.)

While ever consumers follow easily manipulated stats and benchmarks, we all suffer.
That super-powered GTX 780 should massively outpunch a GTX 1060, or at least allow a second (cheap) GTX 780 to CRUSH virtual reality (multi-screen, high-refresh-rate gaming).

The market dictates that we spend a lot of money on incremental improvements (that generally take time to be programmed/optimised for)…
Video drivers used to be optimised for the older cards first (the largest install base, the most people to benefit)… now Nvidia optimise for the new cards and cripple the old ones (by not optimising)…
Hence why charts like this show an AMD 290 outpunching an Nvidia GTX 780 (a card that was positioned much higher in cost/performance when released).

VR is a massive new money-spinner, so it makes sense everyone is going to cash in on it (if allowed).

Back on topic: VR is doable with simple graphics, or an optimised world that suspends disbelief and has us thinking more is going on…
KCD is a push, and VR KCD is a push too far.
People can do it, sure, but 3-10% doing it well and 90% complaining about performance would kill the product more than help it.

What Warhorse seem to be doing VERY WRONG is not seeing their ‘baby’s’ future with a long-term vision.
If the game is solid, RPGs have a long shelf life.
In five years an iPad will have the grunt to render KCD.
If the Ultra presets scale well to future hardware (i.e. STOP GIMPING ULTRA), then two paths open up:
portable gaming on phones and tablets in 5-7 years, and exciting PC tech-demoing with every upgrade that PC users make.
That only happens if Ultra has something new to give, which makes the studio’s present penchant for gimping Ultra, so the masses can run it fluidly TODAY, seem silly.

I have a nice build. It is a few years old: mid-range enthusiast when the parts came out.
It represents a small piece of the power presently available (many PCs are vastly gruntier).
The game runs the Ultra preset perfectly at higher-than-needed framerates (i.e. north of forty fps).
Each time the game gets ‘simplified’ to help systems ‘on the verge’, I lose out.
I’m glad for everyone else, but my rig isn’t even high-end.
I have no reason to squeeze more out of my present rig, and I definitely prefer many aspects of this game going back to the 1.3 patch etc.
Each successive patch generally kills Ultra more than it helps.

Don’t hear me wrong: I love bug fixes. I thought cloth physics was an awesome add-on. Infrequent daytime rain is true to life, and I can even handle that…
The suspension of disbelief this game had on launch was stronger than in present implementations (bugs aside).

Warhorse need a vision (not just a reactionary process to consumers they either publicly belittle or generally ignore).
If they had a true product plan in place, it would be shared, and third parties could get on board with commercial mod projects.

Whilst I had a pitch at one point, they are presently too chaotic in their output to rely on.
The best I can now offer is side projects using my own toolset/methods.

This game could still be one of the top RPGs of all time.
The studio needs to reclaim its confidence and power and deliver on a vision.
Even if it is just selling us a new vision (at no cost)…
Tell us WHY persevering will pay off!

Actually, it DOES use ray tracing. It’s shooting rays through the voxels to gather a GI pass that way.
In truth, Crytek’s SVOGI is an amazing-looking real-time GI solution. The problem is that Warhorse have opted for a diffuse pass only. SVOTI is capable of full GI and looks magnificent.
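“Shooting rays through the voxels” can be pictured with a toy sketch like the one below (my own simplification in Python; Crytek’s real implementation traces cones through a sparse octree with occlusion mip-maps): march a ray in fixed steps through a voxel grid and accumulate whatever light the visited voxels store.

```python
# Toy "gather a GI pass by shooting rays through the voxels" (my own sketch,
# not Crytek's code). A ray marches in fixed steps and accumulates the light
# stored in the voxels it passes through, attenuated by distance travelled.

STEP = 0.5  # march distance per step, in voxel units

def march(grid, origin, direction, max_dist=10.0):
    """Accumulate stored light along one ray through the voxel grid."""
    x, y, z = origin
    dx, dy, dz = direction
    gathered, travelled = 0.0, 0.0
    while travelled < max_dist:
        voxel = (int(x), int(y), int(z))  # which box is the ray in right now?
        gathered += grid.get(voxel, 0.0) / (1.0 + travelled)  # nearer = stronger
        x, y, z = x + dx * STEP, y + dy * STEP, z + dz * STEP
        travelled += STEP
    return gathered

# Sparse grid: only one voxel stores any light (a brightly lit wall, say).
grid = {(4, 0, 0): 5.0}

toward = march(grid, (0.0, 0.5, 0.5), (1.0, 0.0, 0.0))  # ray towards the light
away = march(grid, (0.0, 0.5, 0.5), (0.0, 1.0, 0.0))    # ray that misses it
print(toward > away)
```

Roughly speaking, a diffuse-only pass fires a handful of such rays (cones) over the hemisphere above each surface; full GI adds a sharper cone along the reflection direction for glossy bounce, which is presumably where the extra cost of “full GI” comes from.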

2 Likes

Is there a comparison between diffuse and full GI somewhere?

Also, what would be the performance impact of going full GI?

1 Like

Yes, several things in game engines use ray tracing. And to be honest, Nvidia’s ray-traced RTX is also not fully ray traced; it is a hybrid.
But I think it is reasonable to refer to the mentioned techniques as ray traced and voxel based.

I thought that it is only capable of diffuse lighting, and that there have to be probes for the rest. At least the manual said that; I’ve never tried it.

What would be the performance impact?

Exactly the question that needs explanation…

Just like the HairWorks shenanigans, where Nvidia would force-lock over-the-top settings because the competitors’ cards had a harder time ‘pulling it off’ (even to the detriment of their own customers), we know that Nvidia will come up with some tasty numbers to suggest how powerful and ‘essential’ their tech is.

Until someone builds a game whose gameplay requires the feature, it isn’t ESSENTIAL, and, true to any good design, “good enough” to simulate an effect is all we consumers need.

Think Daytona USA, with clouds reflecting on the windows of cars… there was no off-screen information being rendered onto the car (the whole intent of the new feature of Nvidia’s cards); they simply pre-baked an animation.
These RTX cards come across as useful for Nvidia’s own development interests (replacing the Quadro market, perhaps), and with nothing worthwhile to sell us, they are going to play this feature as their trump card. (How much money are they spending to add the feature to Tomb Raider, eh?)

Games won’t require these cards, and when games start doing these types of lighting (of which KCD is an early adopter; yes, similar enough to fool the average punter), whichever ‘light engine’ is resource-light (!) and achieves ‘good enough’ will do.

It’s too far from practical to invest in it.
Still, kudos to the Battlefield developers: per usual, adopting all technologies and getting the best from both hardware teams.
Their usage will be what has the world looking… with BF3 they pre-baked a lot of lighting effects, and it gave the illusion of vastly more than the eight hardware light sources our video cards generally handled.

Good devs know how to simulate an effect: the illusion of more going on than there really is. (Like the clouds reflecting in Daytona. Sega’s Model 2 boards couldn’t pull off these tricks, and it wasn’t until nearly twenty years later that PC racing games on cutting-edge hardware started to do real-time reflections.)

Don’t pay the Nvidia tax!