posted on September 26, 2018 09:14 PM by u/DELETED · 6 points
i just found this sub
u/DELETED · 51 points · September 26, 2018
Capital R Rationalism is a quasi-religious movement centered on Bay
Area autodidact grade-school dropout Eliezer Yudkowsky’s mostly
plagiarized ideas, such as that an evil AI is certainly going to be
created, and we have to do everything in our power to stop it (give
money to Eliezer Yudkowsky). Another big rationalist figure is Scott
Alexander, a Bay Area psychiatrist who writes a blog where he writes
very looooong posts about things he knows nothing about. His fans love
to have honest conversations about black people’s skull shapes.
Generally, common traits of rationalist figures and their fans seem to
be a gigantic persecution complex and, more often than not, fear and
resentment of women.
Rationalism without a capital R is some concept in epistemology but I
wouldn’t know that because I’m not a nerd lmao
There’s rationalism-the-philosophical-concept (Wikipedia,
Stanford
Encyclopedia of Philosophy); that’s not what we’re about. This sub
is for mocking a particular social circle calling themselves
“rationalists”— filled with a lot of stupid, awful people— which emerged
out of a bunch of blog posts from an “autodidact” drumming up money for
his fake research institute. Here’s
my Cliffs Notes version to get you up to speed. Happy trails.
In terms of modern philosophy, rationalism is a term - coined after
the fact and applied retroactively - for the epistemological view that
knowledge, truth, justification, etc. are grounded primarily in
theoretical reason rather than in experience. Though the label has been
stretched to cover even older positions from antiquity, it refers
chiefly to modern philosophers such as René Descartes, Nicolas
Malebranche, Baruch Spinoza, and Gottfried Leibniz. Rationalism in this
sense is fairly rare in philosophy after Immanuel Kant’s synthesis, at
least without qualification.
Rationalism in the sense that /r/SneerClub sneers at is the name
adopted by a community that, at least originally, formed around an
interest in strategies for thinking more rationally in a looser sense,
e.g. overcoming common biases, applying Bayesian probability to
problem-solving, etc. There’s no connection, at least that I’m aware
of, between this and the epistemological view of modern philosophy; the
name instead connotes a greater appreciation for rationality in
general.
Effective Altruism
EA is a movement that, as the name implies, attempts to determine the most effective means of helping others using rational and empirical methods. It originated, and as far as I know continues to be, mostly separate from the rationalist community in the second sense above. Reflecting the interests of the rationalist community, EA in their hands tends toward concern about speculative future artificial superintelligence.
> It originated, and as far as I know continues to be, mostly separate from the rationalist community in the second sense above.
In theory and in original principles, yes. In practice, the "rationalist community" has taken a (vastly) disproportionately large role in the various EA organizations, which increasingly reflect "rationalist community" commitments in their decisions and practices. Not uncommonly, the result is in explicit opposition to the theory and original principles of EA. For example, EA in its original guise has taken a relatively hardline stance against (a reasonable person having confidence in the charitable efficiency of) organizations like MIRI--see, for example, the extremely negative GiveWell assessment of MIRI. Yet a significant proportion of EA organizers recommend giving, sometimes exclusively, to MIRI and similar organizations, entirely without any available measures supporting their effectiveness as charities.
Small-r rationalism is the abstract concept of human thought being
guided by reason, thought, and materialistic evidence over instinct,
tradition, and gut feeling.
Rationalism is what happens when pop-science-peddling self-help gurus
take that abstract concept and turn it into an ideology.
There is no “capital-R Rationalism.”
One is a cult, the other is NotAnArgument