Recently, I’ve been thinking about how I form my opinions on certain topics, and I’ve come to the somewhat depressing realization that I don’t really have a method for verifying the “truth” of many things I read online. I’ve been reading blogs in the rationalist community for a while, and while certain things have pushed me in the wrong direction, I’ve never really been able to “disprove” any of their opinions, so my perspective is always changing. People frequently criticize Yudkowsky or Scott Alexander for their errors in judgment or bring up Yud’s gaffes on Twitter, but most people can be made to look foolish by pointing out their superficial errors without challenging their fundamental ideas.
I’m a young man without academic training in political or social sciences. I’ve read books by Chomsky, Rawls, Nozick, Graeber, Fisher, Marx, Kropotkin, Foucault, Nietzsche, and other authors (I know this is a pretty random list because they all focus on different things) in an effort to find the truth or a better understanding of the world, but the more I read, the less sure I was of what I even believed in. I frequently find that I become pretty attached to ideas as soon as someone can persuade me with good reasons or a worldview that I find logical and compelling. Scrolling SneerClub, I feel like I’m slipping into yet another meme through “fake” internet peer pressure, because I can’t genuinely prove that LW, SSC, and other ideas are absurd. Without an anchor or system of truths to fall back on, I feel like I’m not really learning much from this experience and am therefore vulnerable to any new idea that sounds compelling.
Although I am aware that this is primarily a satirical sub, I was wondering if anyone else has had a similar experience.
Have you tried taking a break from the internet?
This is fine and normal.
This is dangerous! It’s great to read tons of stuff, but for the love of God don’t become one of those people who always believes the last thing they read.
Yeah, that sounds like what’s happening. I think while you’re sorting things out, you have to learn to be comfortable with uncertainty and doubt, and you have to be skeptical of grandiose claims to truth.
While we constantly mock Siskind and Yudkowsky, it’s because of fundamental, deep problems with their entire worldview. It’d be one thing if they were just some boors at a dinner party: if you remove Siskind’s neoreactionary weirdness (I’m sure something would be left), I’m sure he’d be an interesting person to have at the table (Yudkowsky is irredeemable). Everybody’s a boor with inconsistent and weird views, just don’t get too overconfident about it. One of the reasons Yudkowsky popped onto badphilosophy’s radar is that he burst onto the scene saying, “All the philosophers are bullshit, here are all the right answers, they’re obviously true. All you need is rationality! Also you must engage with us politely and carefully take our arguments in the most charitable light, unlike our casual dismissals of everybody before us.” Good grief.
this is almost certainly something Sneerclub is not at all set up to help you with. I can only recommend avoiding abyss-diving, and this sub is all about the abyss-diving.
You simply should not have a strongly-held belief about something that has not been proven, other than something along the lines of “if there’s an expert consensus, it’s probably worth acting like it’s true.”
The big flaw of the rationalists, and of intellectuals and pundits everywhere, is hubris. I don’t mock Yudkowsky because I know for a fact he’s wrong, I mock him because he’s insanely overconfident about an idea that is currently science fiction and because he is hilariously arrogant in general. (Also, I’m not an expert per se, but I am professionally conversant in AI at least.)
Could he be right? I mean anything’s possible. That doesn’t mean he’s currently right to believe it though! If I go around screaming that there will be a nuclear war next year without any good reason then even if I turn out to be right, it doesn’t mean you should have believed me.
Do you spend any time building or creating things? Or any kind of real world problem solving? It sounds like you’re really well read and that’s great, but there’s definitely this solipsistic spiral that smart people can get into on the internet where it’s endless abstraction and reasoning divorced from reality.
There’s something about having to test your ideas and assumptions against cold, hard reality that I think gives you instincts for BS. Find some hobby or craft that requires you to understand problems and build solutions that aren’t open to others’ interpretation.
The other thing I look for is how hard someone is trying to convince me of an overall narrative. Does everything they say ultimately lead back to the same (possibly self-serving) narrative? Or do they build understanding and connection?
Also, lastly, for Yud and company, it might be helpful to read up on the following topics:
- The limitations of probability theory
- The incompleteness theorem
- Non-deterministic phenomena
- Evolution (specifically about how environment shapes evolution and about intelligence as an evolutionary strategy)
Edit: oh and of course, read up on apocalyptic cults
Having grown up with the internet, I’ve always been skeptical of the idea that the internet is a sufficient forum for serious, truth-oriented work (srs bzns, as we used to meme), an idea that pops up again and again, from well before the rationalist community was a thing. I just don’t think the internet, including and especially blogs, is a medium for that kind of work. There’s probably some Marshall McLuhan sort of thing to say about it. It’s more like pamphleteering than anything else, and that’s not due to the rhetoric or the number of words or whatever, but the medium itself. One doesn’t disprove a pamphlet, one throws it in the trash, unless you want to believe the argument it presents, and then you read it and share it, etc.
In this way, I’ve just never had the problem you discussed.
Yeah, you also have to think about stuff yourself.
Lotta other people in here giving better advice than I can, so I’ll just offer some supplemental, backup advice?
Read a bunch of history. Just plain old, boring history- especially if it’s the kind that’s not got any grand thesis about history. No big idea stuff, just messy people doing messy things. Not a solution for you, but it gives a lot of useful perspective.
The solution is to read Marx some more and forget all that other stuff.
I would suggest focusing less on ideas and a priori reasoning and more on knowledge. The path to truth is not finding the right axioms (whether one calls them priors or not) but curiosity about the world around you and humility about the limits of your knowledge. Studying history is great for this.
Mind you, part of humility is also figuring out when people genuinely do know more than you. Put another way, figuring out who you should trust. The idea that we can, by proper use of elementary critical thinking, distinguish between truth and falsehood is a fantasy; for most subjects we need to consult others who know more than us. For instance, I do not know enough calculus to actually understand physics after Maxwell, but I have tried to develop a sense for what real experts in the field sound like. Think of yourself as part of a network of knowledge, and of knowledge as a shared, social good, rather than as an individual who needs to prove everything for yourself in order to possess it.
And how do you figure out who to trust? At least part of it comes back to looking for people who actually know things. If someone claims to work around actually needing knowledge by having One Weird Trick, they are almost certainly a crank. This applies to Yudkowsky, but it also applies to most other internet ideological weirdos; it applies to grandiose claims about the secret to all human history or knowledge, and to specific claims about individual facts. When a Holocaust denier claims that because of x, y, and z calculations the crematoria of Auschwitz could not have burned enough bodies, the way to deal with this as a non-expert is not to check their math but to ask yourself whether it is more likely that someone is fudging the numbers, or that all the evidence we have for the reality of Nazi mass murder is some kind of hallucination. This will allow you to ignore cranks.
One thing to watch out for, IMO, is people claiming that they are protecting you.
The setup is that there is something to fear. It isn’t hard to convince someone that something is worth fearing. You can make people fear swimming pools. You can make them fear electricity. You can make them fear airplanes. You can make them fear religions. You can make them fear irrationality. You can make them fear AI.
The follow-up is a strict methodology to protect you from that fear. Ideally, you claim that you are the only one trying to protect them from that fear and the only one able to protect them from it.
This is a good pattern for a cult. Do you fear a painful afterlife? Follow my rules to please God and gain sufficient favor to avoid damnation.
Do you fear irrationality? Do you feel like you are a rational person oppressed by irrational people who seem to have control over some aspects of your life? There is a group of people who will stoke that fear and offer you a methodology that assures you that you are special, that you are better than those mean people. They will tell you that their method is the only way to avoid your fears.
But it isn’t. It is a cult. Become aware of the tactics.
And join my cult instead. You should be afraid of cults and I will give you the only methodology you need to avoid them. It is the only methodology guaranteed to work.
Most stuff in life isn’t amenable to ‘proving’ definitively one way or another. But obviously, we try to figure out what is true, and what to expect in life. These internet rabbit holes work partly through peer pressure effects caused by segregation into communities that share a viewpoint, and partly through the fact that ruminating on something for a long time will make it seem more plausible whether it is or not (can’t remember the name, but I’m pretty sure this is a cognitive bias with a name…).
For dealing with the community creating a false emotional sense of an idea’s plausibility or universality, obviously seeking out opposing and critical viewpoints can help.
A helpful question can be: What would I do differently if X were true, and how crazy/endangered would that make me if X isn’t true? If you wouldn’t do anything differently, then it’s all still speculation, and you should probably try to keep that in mind. If the answer is ever significant and crazy/dangerous, then you need to be serious about examining how much your personal experience backs it up and examining the strength of the evidence. Or finding a better approach that will hold up in both realities.
Someone in another comment mentioned (half jokingly) The Church of the Subgenius, which is rad. I will mention my favorite joke religion, Discordianism. I actually had a printout of this (https://www.cs.cmu.edu/~tilt/principia/) as my allotted religious text in Basic Training in the Air Force. On a more serious note, I’ve found Buddhism helpful as well. Attachment to views/opinions is seen as a hindrance or danger there.
Here’s my newbie take. “Truth” in the sense implied by your notes here is not a social science thing. It’s not even a hard-science thing. You can’t “prove theories” like you seem to wish outside of pure mathematics, which is - and I can’t stress it enough - NOT about any real-world problems.
You can’t rigorously prove, in that mathematical sense, that hate is socially corrosive. You can know, historically, that some patterns, behaviors, statements, and symbols have been used to hurt people. Likewise, you know something is socially good when it heals people and improves their lives, all told.
It might take generations to see some consequences. More likely than not, the consequences will remain mixed between good and bad, forever. None of that social complexity, none whatsoever, is about rigorous proofs of formal statements. If you want clarity and rigor and proofs, pure math is your escapism friend! It doesn’t (I have to warn again) give any clarity about the actual society. It’s just neat :-)
The way you become less vulnerable is by developing a sense of intelligent skepticism. Practice thinking of reasonable counterfactuals to counterbalance your tendency to fall prey to new ideas. Compare the pros and cons in your head, and pick the winner based on the merits.
You don’t have to prove that their ideas are absurd. The burden is on them to convince you. If you are unconvinced, you don’t have to accept their ideas just because you can’t articulate why they’re not convincing. Of course, it would be a fallacy to argue “I don’t find this argument convincing, so YOU (the audience of this sentence) must reject its conclusions,” but it is perfectly fine to say “I don’t find this argument convincing, so I am not convinced!”
Similarly, if someone is arguing in bad faith, using a ton of jargon, or just generally writing so poorly that you can barely get through a paragraph of their prose, it is ok to simply not engage.
That sounds like a research issue. The basic method is that you pick a specific narrow topic, then start tracing the citations back to the original evidence, then see whether the big-ideas book summarizes the evidence correctly, whether the evidence is strong, and what conclusions other experts have drawn from the same evidence. Once you know the most important evidence, the history of research, and which methods have worked well and poorly on this specific topic, you will start to feel like you have some ground under your feet. Two university courses (e.g. a survey of 19th-century European history, and a seminar on Marxist and liberal theories of the health and wealth of the English working class, where you look at some key evidence and a variety of arguments) will show you the basics. But this method requires you to invest time and effort and experience in a specific narrow topic, so it’s not popular with the LessWrong folks who want to be experts on anything as long as they have enough “smarts.”
On most of the questions that come up in life, even PhDs and leading practitioners have to use informal methods like “sounds sketchy” or “I will trust recognized experts over a smooth talker on TV.” They just don’t have the time, skills, or ability to become experts on every topic that someone has an opinion on. A young Scott Alexander told readers to practice “epistemic learned helplessness.” If his fans practiced that, they would be more skeptical of what he says.
Bro just have a fucking opinion and be done with it. You’re overthinking it.
A lot of people on this sub are gonna tell you to stop reading all this rationalist stuff. And that would certainly work, but I think there’s a way to read them differently.
Both my parents were religion majors in college, despite the fact that neither of them are particularly religious, at least in any traditional way. And as a kid who was (at the time) loudly atheistic, that was fascinating and confusing to me; here were the two authority figures in my life, who seemed totally reasonable but also believed that religious texts were totally worth reading.
I kind of stay away from r/sneerclub most of the time, because I think it can be pretty similar to r/atheism. It exists as a counter to a certain intellectual worldview, and the worldview that it’s criticizing often is so convinced of its own correctness that it ends up completely up its own ass with absurdities. But because of that, sneerclub (and atheism) frequently offer their targets absolutely no benefit of the doubt, automatically assume the most extreme interpretations, and maintain that their targets have absolutely no ideas of interest whatsoever.
And the latter is just patently untrue. Scott Siskind has some genuinely interesting ideas. So do Steven Pinker and Robin Hanson and even Eliezer Yudkowsky (though the latter writes in such an insufferable way that I often can only stomach the ideas when I read someone else’s summary of them). On the other side of the analogy, the Bible also has some very interesting ideas, as do the Quran and the Bhagavad Gita. What my parents tried to teach me, and I didn’t learn until later, was that it’s very possible to engage with a text, to understand its arguments and entertain them, without necessarily agreeing. And doing so is always worthwhile, and makes you a wiser person.
One of the greatest problems with the rationalist community is that it explicitly tries to discourage that type of thinking, by convincing everyone to eliminate any “contradictions” in their worldviews. And as a former student of math, I get it, I really do. But the world is messy, and small internal contradictions are a part of a healthy and balanced worldview. Despite their claims of openness, the rationalists’ “in for a penny, in for a pound” attitude can specifically pull people away from engaging with different ideas.
So what’s my advice? Be humble. Lots of extremely smart and thoughtful people disagree with each other. If I’m so confident about the future of AI, do I really think I know more than Timnit Gebru, Sam Bowman, and Peter Norvig? Am I that confident in the wrongness of either all the atheist intellectuals, or else all the religious intellectuals? What a colossally arrogant belief that would be! So I must relax, and accept that they all make good points, and that I can, in fact, live with all of those good points sitting in my head at once despite some of their contradictions.
I struggle with this too. I am getting better. I surround myself with people who agree internet arguments are a bad idea. Talk about it openly. Bring it into the real world with people you trust. Tell people when you successfully extract yourself from one.
I think that I cultivated a personal social value in myself this way. I’m learning to value moving on from these cartoonish quicksand conversation traps, or at most dropping a link to an expert or research or w/e in case some innocent bystander stumbles upon the thread.
Tbh if it’s available to you at all, I think college would be incredibly good for you.
Well, you’ve read a bit of philosophy, so if you enjoy that kind of thing you could double down. Maybe get back to basics with Plato, taking Socrates seriously as someone to emulate. Or really diving into Nietzsche, who is the most profound moral philosopher of all time precisely because of his furious reaction against his own credulousness.
Most directions you go in philosophy and theory (so long as it’s not the direction of blogs or substacks) will bring you into greater complexity, greater ambivalence on ‘profound’ subjects, and will lend you many stylistic tools you might use to cobble together your own sophisticated approach.
You could even try relaxing a bit and become a connoisseur of what could be called intellectual entertainment, the best of which actually demonstrates some of the most profound modes of thinking, in conversational and comedic style. I’m thinking of Robert Anton Wilson (Discordianism was already mentioned in this thread) and Alan Watts. Hilarious, warm, human-oriented thinking.
In the end, you have to confront the reality that we’re led around by much more than some rational process of thought we’re aware of. Your credulousness and tendency to ‘fall’ for whoever last gave you a convincing argument could be easily analyzed on the psychological level. That’s an angle far too expansive to discuss productively here, but it’s one you should keep in mind, especially if you get deeper into philosophy, critical theory, etc.
This sub gathers a pretty diverse group, it seems. But most people here seem united in an intuition that Sneering is the appropriate emotional/intellectual response to certain toxic cultural phenomena. A lack of Sneer doesn’t indicate an elevated objectivity, but a partially disabled mind. Maybe you need to strengthen your Sneer quotient.
the great thing about our society is that no matter what ideology you subscribe to, you can’t really do anything about it in the real world.
OP here, I’m writing from another alt because I forgot the password of the other one. I appreciate all the helpful advice and responses, it was a pleasure to read them. I believe I will initially take a break from internet politics, also for the sake of mental health, before gradually reevaluating how I approach and deal with information in general. Take care!
I feel as if your discontent is because you are finally coming to realize a big truth: the world is more complicated than most give it credit for. You are right to recognize that most of the internet is biased.
I believe you’ll find a lot of comfort in finding honest debates, meaning the freedom to speak openly and honestly despite how offended you, or someone else, might feel. It is only through this practice that one can truly begin understanding the morals one believes in. I’ve noticed that anytime someone stoops to silencing another, it means they are afraid to engage. I feel this is why I’ve fallen in love with comedy. I like that comedians can approach any issue and make fun of it from a satirical point of view. I’d really recommend Tim Dillon or Bill Burr on stuff like this.
Listen, also, these “causes” and facts that often get run with online are not held by the majority of ppl, but it’s all presented as if they are. So being skeptical of news and advertisements is healthy. In fact, constantly wondering about what isn’t being reported is often more important (like Democrats not wanting to recognize the popularity of RFK for President).
Watch Russell Brand a few times on YouTube. He’s developed quite a following, and if you listen to the media, they claim he’s a right-wing nut. But if you actually watch and listen to the content, you’ll find he is one of the most progressive ppl out there. You’ll find all sorts of folks like this who don’t fit the mold you’re told to believe.
Some of this comes down to plain and simple independent media. I highly encourage you to get your news from podcasts and the like rather than the newspapers or traditional media. The skepticism I am expressing is more typically present in these news sources than in the regular ones. Plus, this can keep you from being gullible on other fronts. I appreciate Breaking Points on YouTube, and more often than not, I find the comical whims of Jimmy Dore’s show (also on YouTube) usually right 2-3 months ahead of the national conversation. (He’s hard to swallow for some.)
Take this journey wherever it leads you. Your self realization is what’s important and it seems you are now beginning to think for yourself. Good luck!
Edit: I realize my response was not very academic, but in a roundabout way I just see you being uncomfortable not knowing how to challenge things or argue confidently based upon truths. Well, who does? Truths are subjective these days. Supporting honest debate and learning through engaging other ppl is how you find what you believe in. Books can only point you in the direction, but this world is about people.
I think you might benefit from reading Principles by Ray Dalio. Skip his autobiography, go straight to part 2. I don’t agree with everything he’s written but it’s a really interesting mental system to have as inspiration.
You should cultivate strong values; it will help to ground you. You don’t need to think of them as logically deduced axioms, but rather as fine-tuning your moral intuitions, for example, compassion, empathy, and equality. This will help filter out a lot of stuff, at least in my experience. Ultimately, a lot of what we believe is based on first principles, which are really derived from who we are. This is why you can’t debate a Nazi out of Nazism: there is an innate, emotional, irrational hatred there which grounds them in their beliefs.
I would suggest travel as well, but like actual travel, not tourism. The whole point of travel is to displace you from the familiar; you can do that just by going to the next town over if you really wanted to.