r/SneerClub archives
Is the general idea of "we should be more rational" unsalvageable? (https://www.reddit.com/r/SneerClub/comments/poozsu/is_the_general_idea_of_we_should_be_more_rational/)

And what would I do if I actually wanted to act more rationally, rather than LARPing AGI?

I have an advanced STEM degree, I know how to read papers, but lord, there’s just so much, and it’s all so bleak, and I see people who are educated as I am “doing their own research,” and I just lose any hope of being able to make my own, truly informed decisions.

What do?

[deleted]

I disagree with one part here; “superficially clever” is actually a good strategy in some ways. We are built for heuristics. The trick is that the big personalities among the rats tend to pick up carnival-barker and cult-leader techniques to short-circuit things, rather than, on balance, helping people learn good heuristics for actually spotting the BS. They aren’t as bad as the JP/IDW folks, but they are still mostly pretty garbage. Ultimately, though, no one has time to carefully evaluate literally every single thing that comes into view. You need some handy superficial rules for whether shit is worth your time. And one thing the rationalists *don’t* have (especially of the slatestar persuasion) is a good surface filter for tuning stuff out because it is obviously garbage designed to shift the Overton window and drag you into taking fascism and racist BS seriously when you could spend your time on *literally anything else*.
I’m biased because of [the whole Dominic Cummings](https://irrationallyspeaking.home.blog/2019/10/22/dominics-basilisk-part-1/) thing in my native country, so the JP/IDW types are less evil to me because they’re not as politically effective in my arena as the environs of so-called rationalism

If reading papers isn’t your job and you find it “bleak” then stop reading them, and focus on your happiness instead. The people who become knowledgeable in a subject are the ones who research it out of genuine interest/passion/skin in the game, not out of pressure to keep up with some arms race of “doing your own research”.

there are of course caveats to what I'm about to say, but the entire idea of expertise is that we are outsourcing our time to people who also value the process and rigor of the scientific method. we trust them to do their work with a certain level of diligence. that way, once they've amassed enough data and come to a conclusion based on years of data and experiments, we can read summaries of those findings instead of having to delve into every single paper all the time. things can go wrong - there are bad actors out there, honest mistakes, hell, even houses of cards built from theories can come down at any time... but this is the best system we've devised to tackle these problems. the idea that an online interest group (generally without funding or networks of other experts) is going to divine better conclusions is batshit. not only that, but they are giving the rest of us a bad name. if "rationalists" had drawn the line at something like a book club, this sub probably wouldn't exist. all I see is hubris and unbridled opportunism.

edit/addition: if they ever do manage to come up with anything novel or groundbreaking, it will speak for itself... but what have they accomplished besides a giant online circle jerk?
What if I end up like the covid deniers who "did their own research", though?
[deleted]
I think I just have an anxiety disorder.
I wouldn’t worry about such hyperbolic speculations if I were you; just get on with your life and try to be thoughtful. This isn’t that complicated.
[deleted]
I'm confident in my medication stack. Books'd be nice.

The thing to understand is that ‘rationality’ is not a property of individuals or of certain styles of thought, but a product of well-calibrated systems of knowledge production/dissemination. No amount of a priori reasoning is going to equip you to evaluate the reliability of truth claims from scientific fields whose methodologies you are unfamiliar with, and knowledge is so specialized nowadays that you don’t have enough time in your life to become an expert in everything; at some point it will always become a matter of placing your trust in the right people, and that’s not just an intellectual skill but also a social skill.

Recognizing who is a trustworthy source of information and who is a hack or a grifter isn’t a skill you can learn from a book or any kind of clever trick or rule. It simply comes from basic scientific/media literacy (which it seems like you’ve already got) combined with time and experience. It’s only from there that you can come to synthesize some kind of a worldview out of which you can make ‘your own, truly informed’ decisions. Because nobody can be truly informed about everything, and there’s no cheat-coding your way to enlightenment.

what does it mean though? pursuing good science? formalizing and optimizing your own behaviors? neither of these paths exists in some inherently “rational” universe. “rationality” is an abstraction, a human invention with entirely subjective considerations. science has been twisted into a kind of pseudo-religion that sadly even many highly educated people buy into, probably because of the associated reactionary politics involved.

Believing what is true and disbelieving what is not true.
this is exactly what I'm talking about. there's a difference between presenting an empirical argument/supporting or rejecting a hypothesis and using scientific language as the mystical aesthetic for one's faith. the latter is what rationalists and many others seem to do. frankly, science does not and should not supply the comfort of truth people crave from it. it is an enterprise, not an endpoint.
Do you believe truth exists?
there is what is reproducible and predictable based on the hard won models of the natural world we have managed to amass. absolute truth is sorta beside the point.
Well now you’re stepping on my field, and you’re wrong
buddy, they won't even let me fuck it
Observation #1: your brain has limited storage. Observation #2: you can generate infinitely many true statements and fill your entire brain with them. I don't just mean things like 2+2=4, but pub trivia, names of capitals of countries, historical dates, planets' masses, other encyclopedic facts. Do that, and you believe a mind-boggling amount of true things. What of it? You can go further and buy every book that debunks common misconceptions; you can start with *[The Book of General Ignorance](https://en.wikipedia.org/wiki/The_Book_of_General_Ignorance)*. Congratulations, now you believe significantly fewer untrue things than the average person. What now?

See, there are 24 hours in a day, a third of which we sleep (and should, sleep is good for you), and another third is spent trying to make money to survive. And we also need to eat, shit, socialize, relax, take walks, play games, and do so many other things too. So no matter how you look at it, you'll never achieve every achievement, prove every theorem, learn every fact, or be perfectly rational about everything.

The problem with *both* the LW and EA crowds is that they all have a messiah complex. They genuinely believe that they are the only people truly trying to save the world, the only ones going about it rationally, and that the entire planet's future depends on them alone. I've been on the inside—these people don't need rationality, they need therapy, and rationality only makes them sicker.

Truth is, knowledge is extremely complicated and messy. It has high Kolmogorov complexity, if you will. So the best way to go about it is to find a niche that matters to *you* personally, not to some abstract utilitarian calculator, but to *you*. And then become real good at that particular thing. And have fun while doing it too. You'll realize very quickly that all true knowledge is contextual and domain-specific, and the entire premise of LW is a cruel joke.

The key here is humility.

One person cannot know all. Rationality is not just knowing how to evaluate evidence for yourself. It is also about knowing how to evaluate motives and credibility for yourself.

This is where the worst parts of internet rationalism often fail - they are so focused on STEM that they forget the lessons we all learned in History class. In order to operate effectively in this world, we all need the ability to evaluate sources of information (their motives, perspective, and so on), which allows us to differentiate good sources from bad ones.

A historian cannot go back in time and “read the paper” when it comes to historical events. Instead they have to learn how to decide whether or not they can trust a certain piece of evidence, like an ancient book written by someone who claimed to be there.

In the same way, you or I cannot read every paper in the world. Instead we have to learn how to decide whether or not we can trust a certain source, like the Washington Post, or CNN, or in the case of vaccines, the WHO, so that we can use them to inform us of things we don’t have time to study ourselves. We can’t read every study ourselves; instead we have to learn how to critically evaluate sources that claim they have done the reading.

Knowing what sources to trust and when to trust them is the key ability that COVID/Climate-deniers lack, and that anybody who desires to be rational should cultivate.

they don't even do the STEM well a lot of the time, imho. they would get eye rolls in graduate-level journal clubs.

you need to embrace being stupid. Just admit that you are a huge dumbass.

where are my damn freaks.. where's my nasty Posters. where are my fuckin doofuses

I think the concept of rational thinking is good, but what it’s morphed into in that subculture is being a terrible human being while trying to sound like Mr. Spock.

I think most of the problem is that people have no idea what rational thinking and critical thinking actually are, so they’re easily impressed with Spock using big words. They think that simply identifying the canonical logical fallacies is an “I win” button for debates. It doesn’t work, and worse, it makes people believe a guy who argues that way no matter what he’s actually saying.

I was freaking out about philosophies of truth (correspondence etc.) & my wife said something I think is pretty insightful: “No, you don’t get to stop worrying about being wrong until you die.”

There’s no standard or method that can eliminate the possibility of being disastrously, hilariously, despicably wrong. There is no royal road to understanding the “real” shape of the world, no guarantee that what you know will remain true, & no end to the work of integrating new information to figure out what the hell is really going on.

I think the LW “rationality” misstep is assuming there’s some set of principles or framework of understanding that can put someone in a different boat. Nobody isn’t trying to be less wrong, to be responsive to evidence, to value truth over their own prejudice - pretending these values are unique to some subculture is just vanity. We’re all trying to be more rational our whole lives, & reading The Sequences™ is not a prerequisite to that endeavor.

In my opinion, yes. I’m not saying we shouldn’t want to continuously improve how we form our beliefs and how we pursue our goals, but I think focusing on “rationality” the way modern rationalist communities have gone about it, with all the naive philosophical baggage sneaked into it, is broken.

EDIT: Again in my opinion, the alternative to such rationalism is less reliance on ahistorical “calculations” and more import placed on historical/contextual erudition, pluralist critical thinking, social pragmatism, and a lot more epistemic humility.

I think it’s important to accept that the world is such a huge, complicated, and ancient place that you will never grasp all of it with rationality. But you have other faculties as well. Emotions, intuitions - these are not barriers to rational thought but complements to it. The other thing is that rationality may well be domain-limited in important ways, meaning that rational abilities in one subject don’t always carry over. If you want to think rationally about some subject, educate yourself on it while trying to maintain an open mind. If you want to think more rationally in general, I’d suggest you’re better off trying to become wiser, more well-rounded, kinder, and more moral rather than more rational. I figure you want to make truly informed decisions because you want to be a better person in some way - sometimes that’s more an art than a science. The first step is always to get right with yourself and accept your limitations.

Seconding here on the importance of emotions! Something that seems very obvious to me as an outside observer of rationalism is this deep-rooted fear of emotions and the body. Your physical existence has powerful effects on your brain's ability to brain, and to people who absolutely rely on the notion of their brain's superiority, I imagine that reality is actually very distressing. I think it's important to recognize that your emotions and your physical state are part of your human experience, and as OP mentioned above having/suspecting an anxiety disorder, they may have some difficulty with grounding/connecting to their physicality and emotions. There's about as many remedies for this as there are humans on the Earth, OP, but in combination with the support of a counselor I'd recommend finding a physically grounding hobby (dancing, gardening, knitting, kickboxing, woodworking, etc) and engaging regularly with an art form that's meaningful to you (music, dance, painting, poetry, etc). You may not think that this is helping you become a more rational person, but you wouldn't expect a car to perform at its best if all you ever do is change the tires, or a garden to turn out good produce if all you ever do is water it. Anchoring more firmly in your physical world can help you combat the feelings of bleakness, hopelessness, and paralysis that you're describing—and that will in turn help you think more clearly.

I think it depends on what you apply the concept of “be more rational” to.

Personally, I think that the “irrational” part of us provides extremely important insight and additions to our lives. And that in order to be a happy, healthy, functioning member of a community, we 100% need emotion to be a major part of our perspectives and interactions.

On the other hand, if you’re trying to figure out cold fusion, by all means, strive for the coldest, most brutal rationality in all aspects of that search.

Then again, I know almost nothing about the STEM world. So who knows, maybe science and tech folks get great inspiration from emotional experiences and irrational thinking? What do I know?

From what I can tell from stories, gut feelings are usually important even in math research, and it often isn't just plainly rational or logical. Intuition just provides the guiding stars; the details are worked out later.
am theoretical physicist, can confirm. If you have enough enthusiasm for your research to work on it at all, you're full of aspirations and intuitions and proclivities for guessing. You try things because you want them to work out. You explore path after path, most of them turning out to be blind alleys. You cook up ideas in the small hours of the night and hope they survive the light of day.

Elizabeth Sandifer noted in Neoreaction a Basilisk that the halting problem implies that it is literally impossible to know whether you’re onto an idea that will work out well … or just chasing down a rabbit hole that will waste your time forever.

[deleted]

Book recommendations'd be nice. I'm a math nerd so I want to read Jaynes' book on probability just because I like probability.

Okay let me help you out with something here, in order for any of what you are asking to make sense in the first place.

You need to start with goals, chief. Your personal goals.

And the goal can’t be “be more rational in how you act”. It has to be something like what you want to accomplish in your life. Do you have a specific job you hope to get? Are you wanting to write a novel one day? Is your ambition to get married and have kids? Once we have that specified (and you don’t have to tell me what your goal is, or goals are) then we can make sense of how you can be more rational. Because then there are more or less rational ways to pursue the goals.

Now you might want to say “maybe my goals aren’t rational?!” You can figure that out too. Maybe you want to be an engineer because you think it will make you happy. Then you might want to figure out if that’s true, because if not, then it’s not very rational to pursue that goal.

But just pursuing increased rationality in the abstract isn’t helpful to you. Rationality is an instrumental good. The real value is the thing you are going to use it to get. So what is that? If you know, then you just want to figure out how well you can achieve your goals with how rational you already are, and how much you already know, and “optimize” your rationality later if needed.

It’s a fallacy to think (greater) education or intelligence confers rationality. Regardless, the cesspit of online rationalism has repeatedly outed itself as “my feels” but spelled out in needlessly strained, overelaborate jargon 99.9% of the time. You might as well label yourself a scientologist.