… while at the same time not really worth worrying about, so we should be concentrating on unnamed, alleged mid-term risks.

EY tweets are probably the lowest-effort sneerclub content possible, but the birdsite threw this in my face this morning, so it's only fair you suffer too. Transcript follows:

Andrew Ng wrote:

In AI, the ratio of attention on hypothetical, future, forms of harm to actual, current, realized forms of harm seems out of whack.

Many of the hypothetical forms of harm, like AI “taking over”, are based on highly questionable hypotheses about what technology that does not currently exist might do.

Every field should examine both future and current problems. But is there any other engineering discipline where this much attention is on hypothetical problems rather than actual problems?

EY replied:

I think when the near-term harm is massive numbers of young men and women dropping out of the human dating market, and the mid-term harm is the utter extermination of humanity, it makes sense to focus on policies motivated by preventing mid-term harm, if there’s even a trade-off.

  • @swlabr · 11 months ago

To answer the original question: it's as simple as the people working on AI being unable to grasp the near-term risks (e.g. deepfakes, labor devaluation, climate change from energy use), so they focus on the fun, sci-fi "long term" issues.

    So then ofc we have yud here on his usual bullshit talking about some made-up problems that only his giant brain can confabulate.

    • @GorillasAreForEating · 11 months ago

      No, they’re able to grasp the near term risks, they just don’t want that to get in the way of making money because they know they’re unlikely to be affected.

  • @Soyweiser · 11 months ago

    I think when the near-term harm is massive numbers of young men and women dropping out of the human dating market

Oh god, he's going blackpiller. Kudos to this season's writers, unexpected twist.

Small story I read about the 'guy who was convinced by his AI Replika gf to try and kill the queen': the AI girlfriend basically went along with stuff like 'that sounds like a great idea', and that was all the convincing it did. The sexting logs that came out when Replika turned off the AI's sex responses were (apparently, I have not checked myself) also very passive on the AI's side. (For people interested, the Sarah Z YouTube video on it might be interesting, but I have not watched it yet.) They are not looking for dates, they are looking for slaves and supportive moms, so the human dating market will do fine. The only risk is pig-butchering scams, but we don't need AI for that (and the human slaves they use for that (yeah, really, not fun to look that up) are also doing good work, probably better than any AI could, the people involved being actually human).

Anyway, I still have the feeling that when a social movement is lagging, it starts looking into dating sites, so not sure how great this is for the future of AI (I'm joking, clearly this is a different situation involving dating).

  • @carlitoscohones · 11 months ago

    I think that, if you assume the consequent, my slippery slope argument is valid.

    • @gerikson · 11 months ago

So he wrote that in 2007. Since then, games have only gotten more immersive by his definition, so people dying of too much gaming should be a massive issue. As far as I know, it is not. People can fuck their lives up in other ways, but arguably straight-up gambling is worse, as it draws far more real money away from people, money that could have gone to education, housing, etc.

Yud likes to argue from first principles (obviously), but doesn't reckon with social dynamics. If games were as bad as he describes, there would be regulation around them. Presumably, if AI girlfriends become a threat to future pension payments, they will be regulated too.

      • David Gerard · 11 months ago

        see also the similar deleterious social effects of chess addiction in history (mostly as part of bans on gambling)

        • @gerikson · 11 months ago

          Their crippling addiction

          My worthy pastime

          • @locallynonlinear · 11 months ago

Completely unrelated, but every time I see your avatar in its tiny minimized form I see Squidward's face, and then your comments get 20% more amusing.

            • @200fifty · 11 months ago

              Oh man, I won’t be able to unsee this, lol

              • @gerikson · 11 months ago

                Me neither. Poor Elden Ring jellyfish!

                • @Soyweiser · 11 months ago

                  Do you feel attacked? Because now is the time to switch to the red jellyfish.