cross-posted from: https://sh.itjust.works/post/33099518

TLDR: NVIDIA dropped 32-bit CUDA support with the 50-series GPUs, which breaks GPU-accelerated PhysX in older 32-bit games and leaves them performing worse than on previous GPU generations.

    • @wheeldawg@sh.itjust.works
      5 points · 1 month ago (edited)

      Wow. I probably have played 4 or 5 on that entire list. And none of them in the past 5 or so years.

      It’s still a shitty thing to do for sure. Maybe there will be a new “thing” that starts getting used instead? Ray tracing has gotten way more coverage than PhysX ever did, and imo is like 3% as good or interesting.

      Physics actually has gameplay interactions that matter. Ray tracing looks nice, but is so absolutely expensive computationally that (imo) it's not even CLOSE to being worth the effort of turning it on, even with compatible hardware.

      Give us better physics, games! My main time sink rn is Rocket League, and that game is literally nothing but physics. Mostly simple physics, but stuff behaving in a logical way makes my brain a lot happier than better lighting ever did.

      I like when tall grass became an actual object that could be moved around by players, or when tossing an item on the ground actually shows it tumbling down and colliding with other objects while reacting to them appropriately (as in fire starting, or weight holding something down a certain amount). That stuff is potentially game-creating, definitely feature-defining.

      Has gameplay AT ALL been affected by “pretty lights” beyond making things pretty? If it has, I’ve never heard of it.

      Keep games about the gameplay experience, not just a visual feast; save that tech for movies or playable stories (i.e. Telltale-type games). Toss in some ray tracing when you can, but NEVER at the expense of physics. Trading physics away for lighting just doesn’t make any sense.

  • @AdrianTheFrog@lemmy.world
    13 points · 1 month ago

    Are there really any 32-bit-era games that your CPU can’t handle, especially if you have a $1k+ GPU? This post is honestly pretty misleading, as it implies modern versions of PhysX don’t work when they actually do.

    That being said, it doesn’t make all that much sense as a decision; doubles are rare in most GPU code anyway (as they are very slow). NVIDIA is just being lazy and doesn’t want to write the drivers for that.
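
    To put rough numbers on that: consumer GeForce parts typically run FP64 at something like 1/32 to 1/64 of their FP32 rate. Here’s a minimal CUDA C++ sketch that makes the gap visible (fma_chain, time_ms, and the launch sizes are all invented for illustration; nothing here comes from NVIDIA’s drivers or PhysX):

    ```cpp
    // Times a chain of dependent FMAs in float vs double. On a consumer
    // GeForce card the double run should come out dramatically slower;
    // on a full-rate-FP64 datacenter part the gap shrinks.
    #include <cstdio>
    #include <cuda_runtime.h>

    template <typename T>
    __global__ void fma_chain(T* out, int iters) {
        T a = static_cast<T>(threadIdx.x);
        const T b = static_cast<T>(1.0001), c = static_cast<T>(0.5);
        for (int i = 0; i < iters; ++i)
            a = a * b + c;                       // dependent FMA chain
        out[blockIdx.x * blockDim.x + threadIdx.x] = a;  // keep result live
    }

    template <typename T>
    static float time_ms(int iters) {
        T* out = nullptr;
        cudaMalloc(&out, 1024 * 256 * sizeof(T));
        cudaEvent_t start, stop;
        cudaEventCreate(&start);
        cudaEventCreate(&stop);
        cudaEventRecord(start);
        fma_chain<T><<<1024, 256>>>(out, iters);
        cudaEventRecord(stop);
        cudaEventSynchronize(stop);
        float ms = 0.0f;
        cudaEventElapsedTime(&ms, start, stop);
        cudaEventDestroy(start);
        cudaEventDestroy(stop);
        cudaFree(out);
        return ms;
    }

    int main() {
        const int iters = 1 << 16;
        printf("FP32: %7.2f ms\n", time_ms<float>(iters));
        printf("FP64: %7.2f ms\n", time_ms<double>(iters));
        return 0;
    }
    ```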

    Well, at least you aren’t on a Mac, where 32-bit things just don’t launch at all… (I think they might be playable through Wine, but since Catalina dropped 32-bit support, even x86-era macOS couldn’t natively run any 32-bit games or software, so games like Portal 2 or TF2 just didn’t work even though they had a macOS version.)

    • @cheesorist@lemmy.world
      19 points · 1 month ago

      Mirror’s Edge drops to under 10 fps when breaking glass (which generates PhysX objects)… with a 9800X3D.

      The current PhysX CPU implementation is artificially shit; the CPU could easily handle it nowadays, but it depends on skilled community members or NVIDIA themselves to unshit it.

      • @AdrianTheFrog@lemmy.world
        2 points · 1 month ago

        Hmm, I was not aware of that. I’ve seen (non-Nvidia-related) simulations with probably tens of thousands of rigid bodies running in real time on relatively old midrange CPUs, so it’s pretty crazy that PhysX’s CPU path is that slow.
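
        For a rough sense of scale (and the legacy 32-bit PhysX CPU fallback was reportedly single-threaded x87 code, which fits the “artificially” part), here’s a minimal single-threaded C++ sketch of a toy semi-implicit Euler step. Body, the constants, and the sizes are all made up, and there are no body-body collisions or constraints, so this is nothing like PhysX’s actual solver; it only shows a floor on what one core can push around.

        ```cpp
        // Steps 50,000 free-falling bodies with a crude ground bounce and
        // prints the time per step. Toy benchmark, not a real physics engine.
        #include <chrono>
        #include <cstdio>
        #include <vector>

        struct Body { float px, py, pz, vx, vy, vz; };

        int main() {
            const int N = 50'000;
            const float dt = 1.0f / 60.0f, g = -9.81f;
            std::vector<Body> bodies(N);
            for (int i = 0; i < N; ++i)
                bodies[i] = { (float)(i % 100), 50.0f + i * 0.001f,
                              (float)(i / 100), 0.0f, 0.0f, 0.0f };

            const int steps = 600;                  // 10 simulated seconds
            auto t0 = std::chrono::steady_clock::now();
            for (int s = 0; s < steps; ++s) {
                for (auto& b : bodies) {
                    b.vy += g * dt;                 // integrate velocity
                    b.px += b.vx * dt;              // integrate position
                    b.py += b.vy * dt;
                    b.pz += b.vz * dt;
                    if (b.py < 0.0f) {              // crude ground bounce
                        b.py = 0.0f;
                        b.vy = -0.5f * b.vy;
                    }
                }
            }
            auto t1 = std::chrono::steady_clock::now();
            double ms = std::chrono::duration<double, std::milli>(t1 - t0).count();
            printf("%d bodies, %d steps: %.3f ms/step\n", N, steps, ms / steps);
            return 0;
        }
        ```

        Even naively that’s only on the order of half a million floating-point operations per step, i.e. a fraction of a millisecond on one modern core, which is why the sub-10-fps glass scene reads like an implementation artifact rather than a hardware limit.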

    • @GoodEye8@lemm.ee
      2 points · 1 month ago

      You never know when old games just won’t work. For example, I recently tried to play Deus Ex: Mankind Divided. I have new hardware, but I had to play on medium settings because anything higher would start killing performance, despite the game being 5 years older than my hardware.

      I wouldn’t be surprised if some older games ran like shit on the 50-series cards wherever PhysX is concerned.

  • @hark@lemmy.world
    12 points · 1 month ago

    It’s too bad the CPU path for PhysX is crappy. It would be a good use of the many cores/threads we have available to us these days.
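
    The integration half of a step is embarrassingly parallel, so a sketch of what “use the cores” could look like is below. Everything here (step_slice, the Body layout, the sizes) is invented for illustration, and real engines (PhysX 3+ included, as far as I know) parallelize per simulation island rather than by slicing one flat array:

    ```cpp
    // Partitions the bodies across hardware threads and integrates each
    // slice independently; joins act as a barrier between steps.
    #include <algorithm>
    #include <cstdio>
    #include <functional>
    #include <thread>
    #include <vector>

    struct Body { float py, vy; };

    // Hypothetical helper: integrate one contiguous slice of the array.
    static void step_slice(std::vector<Body>& bodies, size_t begin,
                           size_t end, float dt) {
        for (size_t i = begin; i < end; ++i) {
            bodies[i].vy += -9.81f * dt;            // gravity
            bodies[i].py += bodies[i].vy * dt;      // position update
        }
    }

    int main() {
        const size_t N = 200'000;
        const float dt = 1.0f / 60.0f;
        std::vector<Body> bodies(N, Body{100.0f, 0.0f});

        const unsigned workers =
            std::max(1u, std::thread::hardware_concurrency());
        const size_t chunk = (N + workers - 1) / workers;

        for (int s = 0; s < 600; ++s) {             // 10 simulated seconds
            std::vector<std::thread> pool;
            for (unsigned w = 0; w < workers; ++w) {
                const size_t begin = w * chunk;
                const size_t end = std::min(N, begin + chunk);
                if (begin < end)
                    pool.emplace_back(step_slice, std::ref(bodies),
                                      begin, end, dt);
            }
            for (auto& t : pool) t.join();          // barrier between steps
        }
        printf("stepped %zu bodies on %u threads\n", N, workers);
        return 0;
    }
    ```

    A real engine would keep a persistent thread pool instead of respawning workers every step, but the takeaway is the same: the per-body math spreads across however many cores you have.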

    • @interdimensionalmeme@lemmy.ml
      2 points · 1 month ago

      They laser off the vGPU feature from the chip just so you can’t use it at the same time as another family member. They spend extra money to make it worse.

        • @interdimensionalmeme@lemmy.ml
          1 point · 29 days ago

          They use the same silicon for datacenter products. vGPU is a feature that allows a GPU to be used by multiple virtual machines at the same time.

          They use a laser to break this part before selling it to you, so that you and your sister can’t share the same card at the same time and instead have to buy two.

  • _cryptagion [he/him]
    5 points · 1 month ago

    I’ve had enough of NVIDIA to the point I’m not planning on playing anything on one of their GPUs ever again.

  • kingthrillgore
    3 points · 1 month ago

    Nvidia got what it wanted from Ageia when they bought PhysX, and that was improvements to CUDA.