And what would I do if I actually wanted to act more rationally, rather than LARPing AGI?
I have an advanced STEM degree and I know how to read papers, but lord, there’s just so much, and it’s all so bleak, and I see people as educated as I am “doing their own research,” and I just lose any hope of being able to make my own, truly informed decisions.
What do?
If reading papers isn’t your job and you find it “bleak” then stop reading them, and focus on your happiness instead. The people who become knowledgeable in a subject are the ones who research it out of genuine interest/passion/skin in the game, not out of pressure to keep up with some arms race of “doing your own research”.
The thing to understand is that ‘rationality’ is not a property of individuals or of certain styles of thought, but a product of well-calibrated systems of knowledge production/dissemination. No amount of a priori reasoning is going to equip you to evaluate the reliability of truth claims from scientific fields whose methodologies you are unfamiliar with, and knowledge is so specialized nowadays that you don’t have enough time in your life to become an expert in everything; at some point it will always become a matter of placing your trust in the right people and that’s not just an intellectual skill but also a social skill. Recognizing who is a trustworthy source of information and who is a hack or a grifter isn’t a skill you can learn from a book or any kind of clever trick or rule. It simply comes from basic scientific/media literacy (which it seems like you’ve already got) combined with time and experience. It’s only from there that you can come to synthesize some kind of a worldview out of which you can make ‘your own, truly informed’ decisions. Because nobody can be truly informed about everything and there’s no cheatcoding your way to enlightenment.
what does it mean though? pursuing good science? formalizing and optimizing your own behaviors? neither of these paths exists in some inherently “rational” universe. “rationality” is an abstraction, a human invention with entirely subjective considerations. science has been twisted into a kind of pseudo-religion that sadly even many highly educated people buy into, probably because of the associated reactionary politics involved.
The key here is humility.
One person cannot know all. Rationality is not just knowing how to evaluate evidence for yourself. It is also about knowing how to evaluate motives and credibility for yourself.
This is where the worst parts of internet rationalism often fail - they are so focused on STEM that they forget the lessons we all learned in History class. In order to operate effectively in this world, we all need the ability to evaluate sources of information (motives, perspective, and so on), which allows us to differentiate good sources from bad ones.
A historian cannot go back in time and “read the paper” when it comes to historical events. Instead they have to learn how to decide whether or not they can trust a certain piece of evidence, like an ancient book written by someone who claimed to be there.
In the same way, you or I cannot read every paper in the world. Instead we have to learn how to decide whether or not we can trust a certain source, like the Washington Post, or CNN, or in the case of vaccines, the WHO, so that we can use them to inform us of things we don’t have time to study ourselves. We can’t read every study ourselves; instead we have to learn how to critically evaluate sources that claim they have done the reading.
Knowing what sources to trust and when to trust them is the key ability that COVID/Climate-deniers lack, and that anybody who desires to be rational should cultivate.
you need to embrace being stupid. Just admit that you are a huge dumbass.
I think the concept of rational thinking is good, but what it’s morphed into in that subculture is being a terrible human being while trying to sound like Mr. Spock.
I think most of the problem is that people have no idea what rational thinking and critical thinking actually are, so they’re easily impressed with Spock using big words. They think that simply identifying the canonical logical fallacies is an “I win” button for debates. It doesn’t work, and worse, it makes people believe a guy who argues that way no matter what he’s actually saying.
I was freaking out about philosophies of truth (correspondence etc.) & my wife said something I think is pretty insightful: “No, you don’t get to stop worrying about being wrong until you die.”
There’s no standard or method that can eliminate the possibility of being disastrously, hilariously, despicably wrong. There is no royal road to understanding the “real” shape of the world, no guarantee that what you know will remain true, & no end to the work of integrating new information to figure out what the hell is really going on.
I think the LW “rationality” misstep is assuming there’s some set of principles or framework of understanding that can put someone in a different boat. Nobody isn’t trying to be less wrong, to be responsive to evidence, to value truth over their own prejudice - pretending these values are unique to some subculture is just vanity. We’re all trying to be more rational our whole lives, & reading The Sequences™ is not a prerequisite to that endeavor.
In my opinion, yes. I’m not saying we shouldn’t want to continuously improve how we form our beliefs and how we pursue our goals, but I think focusing on “rationality” the way modern rationalist communities have gone about it, with all the naive philosophical baggage sneaked into it, is broken.
EDIT: Again in my opinion, the alternative to such rationalism is less reliance on ahistorical “calculations” and more import placed on historical/contextual erudition, pluralist critical thinking, social pragmatism, and a lot more epistemic humility.
I think it’s important to accept that the world is such a huge, complicated, and ancient place that you will never grasp all of it with rationality. But you have other faculties as well. Emotions, intuitions - these are not barriers to rational thought but complements to it. The other thing is that rationality may well be domain-limited in important ways, meaning that rational abilities in one subject don’t always carry over. If you want to think rationally about some subject, educate yourself on it while trying to maintain an open mind. If you want to think more rationally in general, I’d suggest you’re better off trying to become wiser, more well-rounded, kinder, and more moral rather than more rational. I figure you want to make truly informed decisions because you want to be a better person in some way - sometimes that’s more an art than a science. The first step is always to get right with yourself and accept your limitations.
I think it depends on what you apply the concept of “be more rational” to.
Personally, I think that the “irrational” part of us provides extremely important insight and additions to our lives. And that in order to be a happy, healthy, functioning member of a community, we 100% need emotion to be a major part of our perspectives and interactions.
On the other hand, if you’re trying to figure out cold fusion, by all means, strive for the coldest, most brutal rationality in all aspects of that search.
Then again, I know almost nothing about the STEM world. So who knows, maybe science and tech folks get great inspiration from emotional experiences and irrational thinking? What do I know?
Elizabeth Sandifer noted in Neoreaction a Basilisk that the halting problem implies that it is literally impossible to know whether you’re onto an idea that will work out well … or just chasing down a rabbit hole that will waste your time forever.
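To make that analogy concrete, here’s a minimal sketch (in Python, purely illustrative; the names would_halt and contrarian are my own, not Sandifer’s) of the standard diagonalization argument behind the halting problem: any general “will this pay off?” oracle could be turned against itself, so no such oracle can exist.

```python
def would_halt(program_source: str, program_input: str) -> bool:
    """Hypothetical oracle: returns True iff the given program halts on the given input."""
    raise NotImplementedError("No total, always-correct oracle like this can exist.")

def contrarian(source: str) -> None:
    """Does the opposite of whatever the oracle predicts about it running on itself."""
    if would_halt(source, source):
        while True:   # oracle said "halts", so loop forever
            pass
    return            # oracle said "loops forever", so halt immediately

# Feeding contrarian its own source contradicts any answer would_halt could give,
# so no such oracle exists. By analogy, there is no general procedure that tells
# you in advance whether a line of inquiry will terminate in something useful.
```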
Okay, let me help you out with something here, because you need it in order for any of what you are asking to make sense in the first place.
You need to start with goals, chief. Your personal goals.
And the goal can’t be “be more rational in how you act”. It has to be something like what you want to accomplish in your life. Do you have a specific job you hope to get? Are you wanting to write a novel one day? Is your ambition to get married and have kids? Once we have that specified (and you don’t have to tell me what your goal is, or goals are) then we can make sense of how you can be more rational. Because then there are more or less rational ways to pursue the goals.
Now you might want to say “maybe my goals aren’t rational?!” You can figure that out too. Maybe you want to be an engineer because you think it will make you happy. Then you might want to figure out if that’s true, because if not, then it’s not very rational to pursue that goal.
But just pursuing increased rationality in the abstract isn’t helpful to you. Rationality is an instrumental good. The real value is the thing you are going to use it to get. So what is that? If you know, then you just want to figure out how well you can achieve your goals with how rational you already are, and how much you already know, and “optimize” your rationality later if needed.
It’s a fallacy to think (greater) education or intelligence confers rationality. Regardless, the cesspit of online rationalism has repeatedly outed itself as “my feels” but spelled out in needlessly strained, overelaborate jargon 99.9% of the time. You might as well label yourself a scientologist.