• Architeuthis · edited · 3 days ago

    I checked it out because I was curious whether CEV was some international relations initialism I’d never heard of; it turns out it’s just My Guess About What He Wants in rationalese.

    Excerpt from the definition of Coherent Extrapolated Volition, or how to damage your optical nerve from too much eye rolling:

    Extrapolated volition is the metaethical theory that when we ask “What is right?”, then insofar as we’re asking something meaningful, we’re asking “What would a counterfactual idealized version of myself want if it knew all the facts, had considered all the arguments, and had perfect self-knowledge and self-control?” (As a metaethical theory, this would make “What is right?” a mixed logical and empirical question, a function over possible states of the world.)

    A very simple example of extrapolated volition might be to consider somebody who asks you to bring them orange juice from the refrigerator. You open the refrigerator and see no orange juice, but there’s lemonade. You imagine that your friend would want you to bring them lemonade if they knew everything you knew about the refrigerator, so you bring them lemonade instead. On an abstract level, we can say that you “extrapolated” your friend’s “volition”, in other words, you took your model of their mind and decision process, or your model of their “volition”, and you imagined a counterfactual version of their mind that had better information about the contents of your refrigerator, thereby “extrapolating” this volition.

    • YourNetworkIsHaunted · 2 days ago

      This feels like an attempt to create an ethical framework that supports overruling people’s actual freedom of choice in favor of a technocratic vision of what you should choose. I can understand the frustration with people doing dumb shit, but the problem is that, to many educated folks, “joining a cult preaching rationality and then trying to avert the robot apocalypse by bringing about a slightly different flavor of robot apocalypse” is a pretty strong example of the stupid shit people do, while to them, “ignoring the oncoming robot apocalypse because you’re too irrational to see the obvious truth that we’re all gonna be simu-tortured by the basilisk forever” would presumably make the list.

      Also I guess texting your friend to say “Yo we’re out of OJ, is lemonade alright?” is unironically praxis now?

    • CinnasVerses · edited · 3 days ago

      It is a bit more than that: CEV is what he would want if he were wiser and less confused. Yudkowsky’s vision was that we want a lot of things that are contradictory, that conflict with what other people want, or that will make us sad, but Friend Computer could sort all that out. Still, talking your friend into going to an event, or into trying a new food that she actually likes once she tries it, is definitely in the spirit.

      • Architeuthis · 3 days ago

        CEV is what he would want if he were wiser and less confused

        Isn’t that just steelmanning?

        I gathered the “idealized version of myself” was because it’s supposed to be applied to a superintelligence, because of course it’s an alignment thing.

        • CinnasVerses · edited · 3 days ago

          Steelmanning is making the best possible argument for a position, whereas CEV is sorting out all the delusions and contradictions in someone’s thinking and giving them what they would want if they were wise enough to know it. Central bankers engage in extrapolated volition when they try to make the economy run in a way that will make people happy, even if what they do is not what the woman on the street wants them to do because the woman on the street has no idea how the economy works. Friends engage in extrapolated volition when they intervene in a marriage or a drinking bout and say “you are ruining your life, and we are stopping it now.” Extrapolated volition is paternalistic (“you think you want that, but I know better …”) and Yudkowsky’s CEV would demand God the Father. Yud’s original paper is available.

          • sleepundertheleaves@infosec.pub · edited · 2 days ago

            So, CEV presupposes false consciousness: that the average person’s belief system is misaligned with reality due to incomplete information, incorrect presuppositions, and so forth.

            And the idea is that a wise leader will choose for the people what the people would choose for themselves if they had a correct understanding of reality, whether the people think they want that or not?

            I guess today in LessWrong, we are re-inventing Marxism.

            • corbin · 2 days ago

              One must always keep in mind that the Rationalist project is explicitly a high-modernist effort; it is a permanent fight against postmodernism which it can never win, a philosopher’s lost cause. They can only look at Marxism as low art which must be elevated by sanctifying it with the nebulous ointment of “Western civilization”.