The process of learning to really appreciate communism, ... looks a lot like dozens of questions about “but isn’t that an atrocity?” “wouldn’t this inevitably lead to dystopia?”
(https://slatestarcodex.com/2020/03/06/socratic-grilling/)
posted on May 07, 2020 09:01 PM by u/Soyweiser · 26 points

u/MarxBroshevik · 39 points
> The process of learning to really appreciate communism, or libertarianism, or whatever, coming from a diametrically opposed philosophy, looks a lot like dozens of questions about “but isn’t that an atrocity?” “wouldn’t this inevitably lead to dystopia?” and hearing what your interlocutor has to answer. It’s so, so tempting to round this off to them trying to gotcha you (as indeed sometimes it will be) and assume they’re not really committed to trying to understand.
This process would be easier if you didn’t ban communists like myself from commenting, Scott.
That is just a weird fucking example to pick. I thought “politics is the mind-killer” meant rationalists are aware that you shouldn’t always make everything about politics because it causes weird reactions. So why deliberately pick a fucking political example (and do it like this)…

E: The subreddit is defending the article’s main point, because trolling and sealioning never happen. Also, the blame goes to schooling: apparently not being properly trained in rationalism is what turns Socratic grillers into conspiracy theorists. Rationalism, it now also cures anti-vaxxers.
> Second, he had a sort of efficient-market-style confusion: germ theory seems to imply an easy way to eliminate all sicknesses forever, so why hasn’t someone picked this low-hanging fruit?
What exactly does “efficient-market-style” mean here? I thought efficient markets were a thing in economics and resource allocation.
If someone thinks, it is free markets. If they don’t, then it’s communism.
I think it means 'if we let society be run by only rationalists there is an easy solution which the normies can't see or won't use [because they refuse to use the free market/prediction markets, communism, overregulation, the gov, low iq people in power, etc etc]' in this case.
It is the massive faith in the ability of free markets to pick up all the low-hanging fruit and solve problems like that. Of course, to be fair to rationalists, they know the world doesn't work this way, so when they encounter an 'easy solution' which isn't implemented, they go look for the flaw which is causing the free market model to not kick in [see list above]. (A flaw of rationalism is not being able to let go of the 'free markets are best' assumption).
This is also why they love prediction markets, and their smaller component, [making predictions with probabilities](https://slatestarcodex.com/2020/04/29/predictions-for-2020/). The latter is believed to make you better at prediction, which makes predictions a sort of art/skill which you can practice.
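To make "predictions with probabilities" concrete: the practice loop is usually scored with something like a Brier score. A minimal sketch (the predictions and outcomes below are invented for illustration, not taken from the linked post):

```python
# Minimal sketch of scoring probabilistic predictions with a Brier score.
# The example predictions/outcomes below are invented for illustration only.

def brier_score(predictions, outcomes):
    """Mean squared error between stated probabilities and what happened.
    0.0 is perfect, 0.25 is what always saying 50% gets you, 1.0 is worst."""
    return sum((p - o) ** 2 for p, o in zip(predictions, outcomes)) / len(predictions)

# Stated probabilities for a handful of yes/no predictions...
predictions = [0.9, 0.7, 0.6, 0.2, 0.95]
# ...and whether each actually happened (1 = yes, 0 = no).
outcomes = [1, 1, 0, 0, 1]

print(f"Brier score: {brier_score(predictions, outcomes):.3f}")
# Tracking this number over many predictions is the "practice" part:
# a falling score means your stated probabilities are getting better calibrated.
```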
> What exactly does "efficient-market-style" mean here?
I think he's trying to connect this to Eliezer's [Inadequate Equilibria](https://equilibriabook.com/).
> I wanted a very clear example—*Bayes says "zig", this is a zag*—when it came time to break your allegiance to Science.
This is from a [post](https://www.lesswrong.com/posts/viPPjojmChxLGPE2v/the-dilemma-science-or-bayes) on how he had proven that the "many worlds" interpretation was correct. (Note that there is no experimental evidence distinguishing between interpretations.)
Another anti-empiricism quote is the idea that an AI would solve all of physics with no access to experimental data, [just from being really smart](https://www.lesswrong.com/posts/5wMcKNAwB6X4mp9og/that-alien-message):
> A Bayesian superintelligence, hooked up to a webcam, would invent General Relativity as a hypothesis—perhaps not the *dominant* hypothesis, compared to Newtonian mechanics, but still a hypothesis under direct consideration—by the time it had seen the third frame of a falling apple. It might guess it from the first frame, if it saw the statics of a bent blade of grass.
(as someone with a physics degree... this is insane)
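For scale, here is a back-of-the-envelope sketch of why (assuming a 30 fps webcam and roughly millimetre-scale resolution; these numbers are my own rough assumptions, not from the quoted post). The relativistic corrections in this regime are suppressed by the dimensionless potential GM/(Rc²), about a part in a billion, so over three frames the "GR vs Newton" difference in the apple's position is tens of picometres:

```python
# Back-of-the-envelope: how different do Newtonian gravity and GR actually look
# for an apple filmed by a webcam? (Numbers below are my own rough assumptions,
# not anything from the quoted post.)

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24        # Earth's mass, kg
R = 6.371e6         # Earth's radius, m
c = 2.998e8         # speed of light, m/s
g = 9.81            # surface gravity, m/s^2

fps = 30.0                      # assumed webcam frame rate
t = 3 / fps                     # time covered by three frames, ~0.1 s
drop = 0.5 * g * t**2           # Newtonian fall distance over that time

# Weak-field relativistic corrections here are suppressed by the dimensionless
# potential GM/(R c^2), roughly a part in a billion.
eps = G * M / (R * c**2)
gr_vs_newton = eps * drop       # rough scale of the GR-vs-Newton positional difference

pixel = 1e-3                    # assumed spatial resolution of the webcam, ~1 mm

print(f"fall over three frames: {drop*100:.1f} cm")
print(f"GR correction scale:    {gr_vs_newton:.1e} m (~{gr_vs_newton/1e-12:.0f} picometres)")
print(f"webcam resolution:      {pixel:.0e} m")
print(f"correction is ~{pixel/gr_vs_newton:.0e} times smaller than one pixel")
```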
You can see the anti-empiricist streak in a number of different places, such as declaring a 100% chance that cryonics will work, and other absurdly confident predictions of AI apocalypse and whatnot.
And at the end he realises that he has no idea how the Born rule would work in MWI, in a nice story that does a pretty good job at isolating a core technical problem. Unfortunately he then decides that the whole "noticing your confusion" thing is for other people, and that no one could have any legitimate problem with his inability to explain literally any physical measurement ever made.
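For anyone who wants the toy version of that technical problem (my own illustration, not from the linked post): the Born rule says outcome frequencies follow amplitude-squared weights, while a bare "every measurement splits the world" picture produces one branch per outcome regardless of amplitude, so naive branch counting predicts the wrong statistics:

```python
# Toy illustration of the Born-rule problem for many-worlds (my own sketch).
# A qubit with unequal amplitudes: |psi> = sqrt(0.9)|0> + sqrt(0.1)|1>.
import random

amp_sq = {0: 0.9, 1: 0.1}   # Born-rule weights |amplitude|^2
N = 100_000

# What experiments actually show: outcome frequencies follow |amplitude|^2.
born_counts = {0: 0, 1: 0}
for _ in range(N):
    outcome = 0 if random.random() < amp_sq[0] else 1
    born_counts[outcome] += 1

# What naive branch counting suggests: every measurement spawns one branch per
# outcome, so counting branches gives 50/50 no matter what the amplitudes are.
naive_branch_share = {0: 0.5, 1: 0.5}

print("observed (Born) frequencies:", {k: v / N for k, v in born_counts.items()})
print("naive branch-counting share:", naive_branch_share)
# The gap between these two is the problem: why do observers see Born
# frequencies if all branches "exist"? That is what needs explaining.
```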
> Someone goes “that’s weird” or “if X was really true, wouldn’t that imply Y?” and gets hit with “You really think you’re smarter than everyone else? You really think a random person on the Internet has discovered a hole in X?” No, sometimes they’re just using Socratic grilling to expose the contradictions in their model and get somebody to resolve them.
Rationalism is when you verbosely do Cunningham’s law, “the best way to get the right answer on the internet is not to ask a question; it’s to post the wrong answer.”
Which is a bit annoying btw, as both methods waste other people’s time: they require other people to answer your questions, rather than the person with the questions doing their homework first by just searching.
And imagine a kid who does this all the time in class: they will take up so much of the teacher’s time, it is all pretty inefficient.
Because they feel entitled to other people's time.
There's a meta-level of refusing to research why their "methodology" is terrible. Plenty has been written about this since the 80s or so. Like, how many undergrad papers have started with that Audre Lorde line about the oppressed teaching the oppressors?
> And imagine a kid who does this all the time in class: they will take up so much of the teacher’s time, it is all pretty inefficient.
I suspect that many rationalists were the kind of kid who tries to monopolize classroom time with snarky and insincere questions.
I think that's fine if everyone involved agrees to the format beforehand. Like how, if you're a reporter, you don't ask questions you don't know the answers to yourself.

Socrates didn't obtain such agreement, of course, but look how that worked out for him.
> And the thesis of this post is that you must never, ever say that. Saying that is so bad. Smack down that student once, say “I think I know more about germ theory than you do”, make him feel like he challenged your authority and that’s bad – and the best case scenario is he will never ask questions to resolve his confusion again.
Hey look! Some teaching 101 stuff. I’m glad Scott is trying to highlight that approach to his readers, it really helps make conversations more productive…
> I find this to be one of the most frustrating parts of writing this blog: how do I signal the things I still need to learn without the Arrogance Police descending on me?
Oh, no, wait, he wants everyone to accept the “efficient” assholes.

Also, the “good teacher” exchange makes me think he hasn’t spoken to a real teacher in years. That amount of grovelling is just weird. Even his “great teacher” example is weird:
> With a great teacher, all of this is assumed, and you don’t need the disclaimers, and you can just say “What? That makes no sense,” and expect the teacher to try again.
A “great” teacher won’t just “try again”, they’ll ask the student to explain their confusion. Such an exchange is even possible on the internet!

These fake exchanges might be a joke, but the fact that he didn’t name one example between the extremes kinda makes me think it isn’t.
The actual quote, without Soyweiser’s crafty ellipsis: “The process of learning to really appreciate communism, or libertarianism, or whatever, coming from a diametrically opposed philosophy, looks a lot like dozens of questions about ‘but isn’t that an atrocity?’ ‘wouldn’t this inevitably lead to dystopia?’”
There’s enough to make fun of in the rationalsphere without lying, if you’re not lazy.
Well, Scott does have a slight history of not understanding communism, and he is very pro-libertarianism; his adding libertarianism here is just an attempt at false centrism.

And I always expect people here to at least follow the links and check up on the quotes. The number of times people go 'wow, I thought you were being hyperbolic, but they actually said that' shows that people do, and it also shows people expect a level of hyperbole here. Anyway, you should still follow and check the links every time I post them. It is especially important as there are examples of Scott openly saying [he lies about communism to further his goals to spread fascism](https://en.wikipedia.org/wiki/Lie).
uwu
Why do ‘rationalists’ keep basing arguments on situations they just made up in their heads? At least come up with an anecdote or something, jeez
trillion iq ubermensch post locked and loaded and ready to go; read on brave netizen
They should have read my sneerquences to really appreciate any points I make.