this post was submitted on 17 Apr 2026
0 points
SneerClub
Hurling ordure at the TREACLES, especially those closely related to LessWrong.
AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)
This is sneer club, not debate club. Unless it's amusing debate.
[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]
I checked it out because I was curious whether CEV was some international relations initialism I'd never heard of; turns out it's just My Guess About What He Wants in rationalese.
Excerpt from the definition of Coherent Extrapolated Volition, or how to damage your optical nerve from too much eye rolling:
Extrapolated volition is the metaethical theory that when we ask "What is right?", then insofar as we're asking something meaningful, we're asking "What would a counterfactual idealized version of myself want* if it knew all the facts, had considered all the arguments, and had perfect self-knowledge and self-control?" (As a metaethical theory, this would make "What is right?" a mixed logical and empirical question, a function over possible states of the world.)

A very simple example of extrapolated volition might be to consider somebody who asks you to bring them orange juice from the refrigerator. You open the refrigerator and see no orange juice, but there's lemonade. You imagine that your friend would want you to bring them lemonade if they knew everything you knew about the refrigerator, so you bring them lemonade instead. On an abstract level, we can say that you "extrapolated" your friend's "volition": in other words, you took your model of their mind and decision process, or your model of their "volition", and you imagined a counterfactual version of their mind that had better information about the contents of your refrigerator, thereby "extrapolating" this volition.
This feels like an attempt to create an ethical framework that supports overruling people's actual freedom of choice in favor of a technocratic vision of what they should choose. I can understand the frustration with people doing dumb shit, but the problem is that to many educated folks, "joining a cult preaching rationality and then trying to avert the robot apocalypse by bringing about a slightly different flavor of robot apocalypse" is a pretty strong example of stupid shit people do, while to them, "ignoring the oncoming robot apocalypse because you're too irrational to see the obvious truth that we're all gonna be simutortured by the basilisk forever!" would presumably make the list.
Also I guess texting your friend to say "Yo we're out of OJ, is lemonade alright?" is unironically praxis now?
It is a bit more than that: CEV is what he would want if he were wiser and less confused. Yudkowsky's vision was that we want a lot of things which are contradictory, conflict with each other, or will make us sad, but Friend Computer could sort that out. But talking your friend into going to an event, or into trying a new food which she actually likes when she tries it, is definitely in the spirit.
Isn't that just steelmanning?
I gathered the "idealized version of myself" was because it's supposed to be applied to a superintelligence, because of course it's an alignment thing.
Steelmanning is making the best possible argument for a position, whereas CEV is sorting out all the delusions and contradictions in someone's thinking and giving them what they would want if they were wise enough to know it. Central bankers engage in extrapolated volition when they try to make the economy run in a way that will make people happy, even if what they do is not what the woman on the street wants them to do because the woman on the street has no idea how the economy works. Friends engage in extrapolated volition when they intervene in a marriage or a drinking bout and say "you are ruining your life, and we are stopping it now." Extrapolated volition is paternalistic ("you think you want that, but I know better ...") and Yudkowsky's CEV would demand God the Father. Yud's original paper is available.
So, CEV presupposes false consciousness - belief systems which are misaligned from reality due to material, ideological, or institutional processes (or whatever). And the idea is that a wise leader will choose for the people what the people would choose for themselves if they had a correct understanding of reality, whether the people think they want that or not?
I guess today in LessWrong, we are re-inventing Marxism.
One must always keep in mind that the Rationalist project is explicitly a high-modernist effort; it is a permanent fight against postmodernism which it can never win, a philosopher's lost cause. They can only look at Marxism as low art which must be elevated by sanctifying it with the nebulous ointment of "Western civilization".