Romulus, on 11 September 2018 - 08:29 AM, said:
I am sorry, but most of the time when you try to talk hardware, you provide misinformation. The original PhysX calculation chip was developed by a company named NovodeX, which was later acquired by AGEIA. AGEIA also acquired Meqon, the same Meqon Physics that supposedly got built into DNF's Unreal Engine implementation. AGEIA had dedicated PhysX cards that worked with your discrete graphics card regardless of brand. But after nvidia acquired AGEIA, nvidia started using GPGPU aka CUDA to accelerate AGEIA's PhysX SDK.
Processing PhysX on the CPU, even to this date, is not faster than processing it on hardware that supports it natively. Even on an 8700K, if you play Borderlands with PhysX turned on, it takes a significant performance hit. And PhysX settings can't be set to high on the CPU because nvidia wanted this to be proprietary, and judging by the lackluster performance CPUs provided even with PhysX set to low, it's justified.
I could have sworn NVIDIA made their own NVIDIA-branded physics boards at one point, but I'll stand corrected on that single point. On a couple of games I've worked on, we tried offloading PhysX work over to compute shaders, and the performance gains were insignificant (and sometimes it was actually slower) compared to the same work properly multithreaded on the CPU. So unless you're running a giant universe simulation (like the Universe VR game), you don't need physics on dedicated hardware for the level of physics required for most games.
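For context, here's a minimal sketch of what "properly multithreaded on the CPU" can look like for a trivially parallel step. It uses C++17 parallel algorithms; the Body struct and integrateStep function are made-up names for illustration, not anything from PhysX or a shipped engine.

```cpp
// Illustrative only: a trivially parallel integration step on the CPU,
// using C++17 parallel algorithms. Body, integrateStep, etc. are made-up names.
#include <algorithm>
#include <execution>
#include <vector>

struct Body {
    float px, py, pz;   // position
    float vx, vy, vz;   // velocity
};

void integrateStep(std::vector<Body>& bodies, float dt) {
    const float gy = -9.81f;  // gravity on Y
    // Each body is independent here, so the step spreads across cores
    // with no GPU upload/readback round trip involved.
    std::for_each(std::execution::par, bodies.begin(), bodies.end(),
                  [dt, gy](Body& b) {
                      b.vy += gy * dt;
                      b.px += b.vx * dt;
                      b.py += b.vy * dt;
                      b.pz += b.vz * dt;
                  });
}
```

The point is just that work like this scales across cores for free, which is the baseline any compute-shader version has to beat after paying for the upload and readback.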
EDIT:
I can't speak for Borderlands (it's possible that title wasn't well optimized for PhysX), but there are only a few use cases for wanting a physics sim on the hardware. From a raw computation perspective, you could optimize the sim to run in compute, but at least on PC you have to send the data down to the hardware, wait for it, and read it back, and the gains you get are simply lost during that process (see the sketch below).

However, if you have something like a cloth sim and you want to deform the vertices against the environment in real time, then you might have something, but the use case for stuff like that is one-off cinematics, and you can simply bake the deformations; the GPU time wasted on the sim isn't worth it. On pretty much every game I've worked on, the CPU is more starved for work than the GPU is, so having physics on the CPU is well worth it and leaves the GPU free to do other things. So I admire Jan for wanting to put more shit on the CPU, but he's going about it all wrong IMO.
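To make that round trip concrete, here's a rough sketch of the upload -> dispatch -> readback pattern, using OpenGL compute as the example API. It assumes the compute program and storage buffer are already set up elsewhere; gPhysicsProgram, gBodySSBO, and kBodyCount are hypothetical names, not from any actual engine.

```cpp
// Illustrative sketch of the upload -> dispatch -> readback round trip.
// Assumes an already-compiled compute shader and an SSBO of body data;
// gPhysicsProgram, gBodySSBO, and kBodyCount are hypothetical names.
#include <GL/glew.h>
#include <cstring>
#include <vector>

extern GLuint gPhysicsProgram;   // compute shader doing the physics step
extern GLuint gBodySSBO;         // shader storage buffer holding body state
extern const size_t kBodyCount;  // number of bodies in the buffer

void stepPhysicsOnGpu(std::vector<float>& cpuBodyData) {
    const GLsizeiptr byteSize = cpuBodyData.size() * sizeof(float);

    // 1) Send the current CPU-side state down to the GPU.
    glBindBuffer(GL_SHADER_STORAGE_BUFFER, gBodySSBO);
    glBufferSubData(GL_SHADER_STORAGE_BUFFER, 0, byteSize, cpuBodyData.data());

    // 2) Run the simulation step in a compute shader.
    glUseProgram(gPhysicsProgram);
    glBindBufferBase(GL_SHADER_STORAGE_BUFFER, 0, gBodySSBO);
    glDispatchCompute((GLuint)((kBodyCount + 63) / 64), 1, 1);
    glMemoryBarrier(GL_SHADER_STORAGE_BARRIER_BIT);

    // 3) Read the results back so gameplay code on the CPU can use them.
    //    Mapping for read forces the CPU to wait for the GPU right here.
    void* mapped = glMapBufferRange(GL_SHADER_STORAGE_BUFFER, 0, byteSize,
                                    GL_MAP_READ_BIT);
    std::memcpy(cpuBodyData.data(), mapped, byteSize);
    glUnmapBuffer(GL_SHADER_STORAGE_BUFFER);
}
```

That readback at the end is the synchronization point: the CPU sits waiting for the GPU to finish before gameplay code can see the results, which is where the gains tend to evaporate unless the sim is huge.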
