Why does my horse's hairdo poke through his horseclothes?

Animals usually romp around in their clothes. Horses don't always like their clothing, or they get bored and want to do something. That is just Pebbles being a horse; he can't help it.

By computing standards, making every surface in a game interact with every other surface, simulating a range of real-world physical properties, is very resource taxing.
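
To put a rough number on that (a toy sketch, nothing to do with KCD's actual engine): if every surface can touch every other surface, the number of checks grows quadratically.

```python
# Toy sketch, not KCD's engine: if every surface can touch every other
# surface, the number of pairwise checks is n*(n-1)/2.
def naive_pair_checks(num_surfaces: int) -> int:
    return num_surfaces * (num_surfaces - 1) // 2

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} surfaces -> {naive_pair_checks(n):>13,} tests per frame")
# 10,000 surfaces is ~50 million tests per frame; at 60 fps that is
# ~3 billion tests a second, which is why engines cull, cheat and prebake.
```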

Hence the light scattering used by KCD used to be found only in the realm of benchmarking and ‘test suites’.

Whilst there is a lot KCD gets right, model clipping is typical of every 3D game.
Look for it hard enough/long enough and examples can usually be found.

Studies of gaming find that cartoony graphics allow gamers to ‘imagine’ the world and develop a stronger ‘suspension of disbelief’.
Immersive real-world graphics can jar us back to reality over small things.

Speaking as an animator who invested a lot more time on horse running animations than on many other 3D assets: eventually a software house has to make decisions about just how important every last nuance of realism is.

Whilst skipping the ‘pre-baking’ of effects might take less development time initially (baking costs effort up front but uses less grunt later), making the computer follow complex mathematics in real time to generate cloud movement, hair bounce/weight/collision etc. can make the in-game experience very different for players spread over different hardware.
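
As a rough illustration of that trade-off (a hypothetical sketch, not any real engine's pipeline): run the expensive maths once offline, store the samples, and let the game merely interpolate them at runtime.

```python
import math

# Hypothetical sketch of the trade-off, not a real engine pipeline.
def expensive_sim(t: float) -> float:
    # Stand-in for costly real-time maths (hair bounce, cloud motion, ...).
    return math.sin(3.0 * t) * math.exp(-0.5 * t)

# "Prebaking": run the expensive sim once, offline, and store the samples.
BAKE_RATE = 30  # stored samples per second
baked = [expensive_sim(i / BAKE_RATE) for i in range(10 * BAKE_RATE)]

# At runtime the game only interpolates: dirt cheap and identical on all
# hardware, but unable to react to anything the player does.
def baked_value(t: float) -> float:
    i = min(int(t * BAKE_RATE), len(baked) - 2)
    frac = t * BAKE_RATE - i
    return baked[i] * (1.0 - frac) + baked[i + 1] * frac

print(expensive_sim(1.234), baked_value(1.234))  # near-identical outputs
```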

A classic piece of prebaked stuff was the reflections in arcade Daytona.
The math to truly reflect trackside objects was more than a sixty-frames-a-second arcade game could devote to ‘an effect’.
Battlefield 3 did amazing stuff with prebaked lighting effects, allowing more than ‘eight hardware light sources’ onscreen (not really, a lot of it being ‘prebaked’).

Prebaking a horse animation, including hair, can happen before the horse gets its clothes.
Simulating thousands of hair strands, again something that was once the realm of demos and the Final Fantasy/Stuart Little/Monsters, Inc. type movies that pushed the feature, can be done, at heavy computational expense, on home computers.
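
For a feel of what even one strand costs (a minimal sketch of the usual approach, a Verlet point chain with a single sphere standing in for the horse's body; no real game ships anything this naive):

```python
# Minimal single-strand sketch: Verlet chain plus one sphere collider.
# Not any shipping game's hair system; just the shape of the maths.
import math

SEGMENTS, SEG_LEN, GRAVITY, DT = 15, 0.05, -9.8, 1.0 / 60.0
SPHERE_C, SPHERE_R = (0.0, 0.2, 0.0), 0.3  # stand-in for the horse's neck

# Each point keeps a current and previous position (Verlet integration).
pts = [[0.0, 1.0 - i * SEG_LEN, 0.1] for i in range(SEGMENTS)]
prev = [p[:] for p in pts]

def step():
    for i in range(1, SEGMENTS):  # root (i == 0) stays pinned to the mane
        x, y, z = pts[i]
        vx, vy, vz = x - prev[i][0], y - prev[i][1], z - prev[i][2]
        prev[i] = [x, y, z]
        pts[i] = [x + vx, y + vy + GRAVITY * DT * DT, z + vz]
    for _ in range(4):  # relax distance constraints along the strand
        for i in range(1, SEGMENTS):
            ax, ay, az = pts[i - 1]
            bx, by, bz = pts[i]
            dx, dy, dz = bx - ax, by - ay, bz - az
            d = math.sqrt(dx * dx + dy * dy + dz * dz) or 1e-9
            s = (d - SEG_LEN) / d
            pts[i] = [bx - dx * s, by - dy * s, bz - dz * s]
    for i in range(1, SEGMENTS):  # push points back out of the collider
        dx = pts[i][0] - SPHERE_C[0]
        dy = pts[i][1] - SPHERE_C[1]
        dz = pts[i][2] - SPHERE_C[2]
        d = math.sqrt(dx * dx + dy * dy + dz * dz) or 1e-9
        if d < SPHERE_R:
            pts[i] = [SPHERE_C[0] + dx / d * SPHERE_R,
                      SPHERE_C[1] + dy / d * SPHERE_R,
                      SPHERE_C[2] + dz / d * SPHERE_R]

for _ in range(60):
    step()
print(pts[-1])  # tip of the strand after one simulated second
```

Multiply that by thousands of strands, and swap the one fixed sphere for moving cloth, and you can see where the frame budget goes.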

The Witcher 3 simulated hair physics, yet most people didn't play with the feature on, preferring 25% more framerate.

How much framerate are ye willing to lose to hair simulation?
I loved the Witcher having simulated hair and just wish it hadn't been ‘black box’ Nvidia-gimped (used to hurt their own customers, knowing it would hurt AMD's lesser-tessellating parts even more).

PhysX, for all the joke it became, could have made gaming awesome.
But a company bought it to get a competitive edge that never happened.
Along the way they manipulated and abused the tech, and it ended up buried. (I know it has happened many times in the tech world.)

I actually had a laptop with dual 8800 GTX video cards and an add-on PhysX co-processing unit.
By design they could do feedback physics (feed the physics calculations back into the 3D game as more than just an eye-candy/non-interactive effect), so not only could the window shatter, but a person could kick all the little bits of glass around…

Nvidia left the CPU/software side of the PhysX code outdated (as it was at acquisition; good for showing off hardware-accelerator improvements), then said we would all need the hardware (now on Nvidia cards); then, for games like Mafia 2 (a rare gem), gamers might run an Nvidia card secondary to their primary (AMD) graphics card.
Nvidia stopped this through drivers.
They eventually so killed off user trust in the ‘green team’ that The Witcher 3 didn't even do hair PhysX in hardware; by then AMD was in every console and The Witcher was going to be cross-platform, so they finally gave us PhysX via software.
Like it always could have been done.
I had a GTX 780/GTX 590 through The Witcher 3 launch, and I can say that watching the GTX 780 be outperformed by the AMD 290 (equivalent cards) while yielding no PhysX benefits was truly a sad time.
Nvidia also refusing to support open market standards like FreeSync (yes, I know they have to recoup G-Sync investment costs) made the move away from them very easy.

And they have continued now, for three generations, to bury the old with the new.
They do this via drivers. We GTX 780 users caught them out a few times when they first started doing it en masse. The “oops, our mistake” lines didn't work then and they don't sit well with me now.
The truth is that since Fermi (the GTX 400 series) they have gone with long and short shaders, which require software effort to guide programs to utilise the hardware well.
Without its driver-optimisation teams, Nvidia hardware works 10-20% worse out of the gate (which is what we keep seeing with each next-generation hardware release).

Whilst I have owned many top-of-the-line Nvidia cards since the Riva TNT days, I wouldn't buy one for a friend.
AMD is the only ethical choice left, and it gives vastly better value for money over a video card's three-plus-year life span.

If you can afford (and do buy) a new graphics card every year to eighteen months, then Nvidia probably suits your needs more. (A slight performance bump at a massive cost premium.)

Which all gets back to…

Simulating horse hair with collision detection is a heavy hitter, math-wise.
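
A back-of-the-envelope count (every number here is an assumption, purely for scale):

```python
# Back-of-the-envelope only; every number here is an assumption.
strands = 10_000        # a full mane and tail
points_per_strand = 15  # simulated points along each strand
colliders = 40          # spheres/capsules approximating horse + clothes
fps = 60

per_second = strands * points_per_strand * colliders * fps
print(f"{per_second:,} collision tests per second")  # 360,000,000
```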

As most people playing KCD have no power to spare… it's probably not a high priority anyway.

With that in mind: they did add/restore cloth simulation with the 1.5 patch, so…

Nvidia are just the sort of company to come in at the eleventh hour and make some PhysX code path for hair simulation, so long as it's a black box for the code to go into and their fairies can spit out whatever they want.

Previously they built that ‘black box’ to cripple competitors' performance, so no trust should be given…

But…

Nvidia has many times added custom trickery to a game at the last moment so AMD cannot respond in time for the game's launch. Nvidia gets some nice stats/slides for luring more developers to them, whilst AMD figure out the trick and respond…
They generally get it within a few weeks, but by then the nvidiots have been made happy with their bragging rights.
Funny how my replacement AMD card does The Witcher 3 with HairWorks better than my ngreedia card.

Nvidia: buy more of our cards.
We will pretend you can SLI them for as long as it nets us more dollars, but the reality is we moved to a new architecture that does VR (two screens with slightly altered viewpoints), and two of those cards don't do SLI as well as two of last gen's cards (or even the generation before that)… so we will remove SLI ability until it helps us sell new cards.
We did the same with GeForce 3D (we even stopped the software working, then gave it away for free to show how hard it is to get OLDER cards to work), so we know we can get away with it…

Nvidia: ripping consumers off hard and being unethical bastards. Do not support this company. They are making gaming worse.