• yeehaw
    48 points · 2 months ago

    Nvidia has been a garbage company for a few generations now. They got to the top and have sat up there enshittifying everything because they have a monopoly on the market. Don’t buy into this shit.

    • sunzu2
      18 points · 2 months ago

      If you got hardware, use it until it dies…

      Fucking 1070 can still put out decent performance lol

      • yeehaw
        4 points · 2 months ago

        2070S here; it doesn’t work that well with AAA (or even AAAA games! /s) at 3440x1440 :S. But I can easily survive turning the graphics down.
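
        (Napkin math on why that resolution is heavy: 3440x1440 pushes roughly 2.4x the pixels of 1920x1080 and about 1.3x the pixels of 2560x1440, so the same card does a lot more work per frame. A quick illustrative Python check:)

            # Compare pixel counts of common resolutions (illustrative only).
            resolutions = {
                "1920x1080": 1920 * 1080,
                "2560x1440": 2560 * 1440,
                "3440x1440": 3440 * 1440,
            }
            base = resolutions["1920x1080"]
            for name, pixels in resolutions.items():
                # Relative pixel load versus plain 1080p.
                print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")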

          • yeehaw
            3 points · 2 months ago

            Haha not so successful when your fps is total garbage

      • @Sturgist@lemmy.ca
        3 points · 2 months ago

        Running a 1060 in my desktop. Still does absolutely fine. I got my buddy’s old 1080 Ti OC’d; I’m just waiting to get the water-cooling kit put together, and then the 1060 will go into my media server to take over transcoding from the old 980.
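
        (For what it’s worth, the 1060’s NVENC block is well suited to that transcoding job. A minimal sketch of driving it with ffmpeg from Python; the file names and bitrate are hypothetical, not the actual setup:)

            import subprocess

            # Hypothetical transcode: GPU decode + NVENC H.264 encode.
            cmd = [
                "ffmpeg",
                "-hwaccel", "cuda",                # decode on the GPU
                "-hwaccel_output_format", "cuda",  # keep frames in GPU memory
                "-i", "input.mkv",
                "-c:v", "h264_nvenc",              # encode with NVENC
                "-preset", "p5",                   # quality/speed tradeoff
                "-b:v", "6M",
                "-c:a", "copy",                    # pass audio through untouched
                "output.mkv",
            ]
            subprocess.run(cmd, check=True)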

  • @ramble81@lemm.ee
    10 points · 2 months ago

    Okay, so the “S” models stood for “Super,” which was a slight step up from the base. What are the “D” models? “Duper”?

  • LostXOR
    9 points · 2 months ago

    Guess we’ll have to see how they handle this. Are they going to be good and do a full recall, or pull an Intel and do everything they can to avoid it?

  • tehWrapper
    9 points · 2 months ago

    It feels like things are so powerful and complex that the failure rates of all these devices are much higher now.

    • Rekall Incorporated (OP)
      10 points · 2 months ago

      I don’t have any stats to back this up, but I wouldn’t be surprised if failure rates were higher back in the 90s and 2000s.

      We have much more sophisticated validation technologies and the benefit of industry, process and operational maturity.

      It would be interesting to actually analyze the real-world dynamics around this.

      • @GrindingGears@lemmy.ca
        3 points · edited · 2 months ago

        Not very many people had a dedicated GPU in the 90s and 2000s. And there’s no way the failure rate was higher; not even Limewire could melt down the family PC back then. It sure gave it the college try, but it was usually fixable. The biggest failures, bar none, were hard drives or media drives.

        • @TacoSocks@infosec.pub
          2 points · 2 months ago

          Dedicated GPUs were pretty common in the 2000s; they were required for most games, unlike the 90s, which were an unstandardized wild west. The failure rate had to be higher; I had three cards die with less than two years of use on each in the 2000s. Cases back then had terrible airflow, and graphics demands jumped quickly.

      • tehWrapper
        1 point · 2 months ago

        I am going to guess the number of units made is also much higher than in the 90s and 2000s, since hardware tech is way more popular and used in way more places in the world. So maybe a lower percentage fails, but the total count ends up higher; see the rough numbers sketched below.

        But I have no idea…
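
        Purely hypothetical numbers to illustrate that point: a lower failure rate on a much larger install base can still mean more dead cards in absolute terms.

            # Made-up figures purely to show the percent-vs-total point.
            units_2000s, rate_2000s = 10_000_000, 0.05  # 5% of 10M units fail
            units_2020s, rate_2020s = 50_000_000, 0.02  # 2% of 50M units fail

            print(f"2000s failures: {int(units_2000s * rate_2000s):,}")  # 500,000
            print(f"2020s failures: {int(units_2020s * rate_2020s):,}")  # 1,000,000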

    • @GrindingGears@lemmy.ca
      2 points · 2 months ago

      You’re just short of needing a personal-sized nuclear reactor to power these damn things, so I mean it follows that the failure rate is going to climb.

    • NutWrench
      1 point · 2 months ago

      Yup. This nonsense (PCIe 5.0 burnout) should have been detected immediately during quality control.