r/SneerClub archives
When I see posts like this, I can't help but feel kind of sad. Fuck Eliezer Yudkowsky for kicking off a genre of fiction all but designed to draw in disaffected socially-atypical young people, and recruit them to his robot doomsday cult. (https://i.redd.it/onzorhw7djt51.png)

they might have just paid attention in high school when their lit teacher tried to get them to read something other than extruded nerd product, but that would have entailed listening to and crediting something a woman said, an impossibility

Man I joined this subreddit recently after googling what the hell r/themotte was and finding it by accident, and I’m a bit lost. As far as I can glean, Eliezer Yudkowsky is the author of a harry potter fanfiction that seems to take a “scientific/rational” approach to the magic in that universe? And this kickstarted a genre of fiction that attracts asocial “asshole rational”-type young peeps, who in turn have kind of coalesced into a new wave of pseudo-intellectual people discontent with society who for some reason seem to have a chip on their shoulder against left-politics? And that has led them into completely insane far right politics?

I’m missing a lot of dots to connect here, I’m super out of the loop. Does anyone have any kind of comprehensive list of things to read or maybe even a baseline guide post to understand what’s going on in this sub? I can kind of get the general vibe happening and it’s one that aligns with my personal perspective on online right-wing extremism and proselytization, but I’m completely lost on 80% of the posts I’m clicking on because I’m lacking all kinds of context. Halp pls.

r/rational started off as an offshoot of r/hpmor, but they also quite like a particular flavour of power fantasy, stories about progressively getting good at things, litrpgs that "make sense" along a very particular axis, anti-death stories, etc. I don't think there is anything inherently wrong with any of this and I still sometimes dip a toe into the subreddit to find this kind of fiction -- I absolutely adore stories like *Mother of Learning* and *Worth the Candle*, and these are in their wheelhouse. They are mostly unaffected by the r/themotte assholes, but occasionally a redpiller wanders in, someone gets a bit too enamoured with the Cult of Yud, or one of the starslatecodex types comes in to wage the right wing's "culture war", though this last type of post does get shut down by the mods. Part of how I ended up on sneerclub (besides the massive hypocrisy of the author of one of their favorite stories, "Animorphs: The Reckoning", who is unfortunately *also* a major figure in the wider non-fictional 'rational' sphere) was because I frequented r/rational and at one point the culture-war types started a big backlash against someone who was tagging nazi shit or transphobic stories like "With One Ring" as such, and I was just like, yeah, if we can't bring ourselves to oppose literal nazis, we kind of deserve a sneerclub.
I liked Animorphs the Reckoning, but when I found out it was the Dragon Army guy, that put a really dark spin on some parts of it... like the morphing each other and sharing their most private thoughts... it feels really cultish now.
It's absolutely shocking that someone who attempted to run an authoritarian group house would write authoritarian power fantasy into their space mind control and alien transformation war fanfic, I know.
Well when you say it like that it feels obvious.
hindsight reveals the most lovely of framings
So you're more or less on the money about Eliezer Yudkowsky. Now, here's more about him that's mostly relevant: He took money from Peter Thiel of Palantir, who may or may not want to become a vampire. This was for his "AI research foundation". However, perhaps his most cited and most well-known publication is the fanfiction "Harry Potter and the Methods of Rationality," and not any of his papers on "Timeless Decision Theory" or whatever he's trying to sell these days. HPMOR, the fanfic, has had some controversy over the years. First was when EY said that he'd update faster if people donated to the Machine Intelligence Research Institute, without mentioning that he owned MIRI. There were a few passages that were fairly racist, but they got edited out after backlash. There's also a history of questionable [reddit](https://www.reddit.com/r/HPMOR/comments/3ikzva/hpmor_reading_companion/cuijhq0/) [comments](https://www.reddit.com/r/HPMOR/comments/2ytvky/has_the_author_read_canon_or_just_the_wiki_sources/cpcw71f/) where EY just cannot accept that his fanfiction isn't perfect. Finally, as the capper to it all, he tried to get a fan work Hugo for his fanfiction. Thankfully, he did not. EY also publishes a blog called LessWrong. Usually the posts are jargon-filled and purport to make you more 'rational', but in actuality they are fairly dense, and I think they're meant to make you feel smart after reading them without actually teaching you anything? The ideas here are basically that Bayes' Theorem can do anything, that the Many Worlds Interpretation is correct, and that the most important thing you can do with your life is prevent Skynet from happening (so you should donate to MIRI). Together with HPMOR, the LW 'Sequences' form the core text of Yudkowskian 'rationality'. Also, HPMOR was explicitly written to recruit people to this ideology.
Now, I'm moving more onto hearsay as opposed to statements supported by Yudkowsky quotes, but supposedly Yudkowsky used his reputation and his recruiting and the fear of robot Satan to get people to join MIRI and do unpaid labor for him, and also have sex with him. This has led people to accuse LessWrong of being a cult. I'm focusing primarily on the fandom side of things because that is what I'm familiar with. TLDR: consider lurking more.
[removed]
Sometimes, even now, a single burning question keeps me up at night: Did Scott Aaronson ever reply to tell Yudkowsky he was wrong?!?
[removed]
not universe, many-worlds multiverse.
Oh hey, guess who is a two-time Hugo award finalist? Chuck Tingle. He wrote a Harry Potter parody. Except unlike big Yud's version it has good life and love lessons in it. It’s called.... get ready for this... “Trans Wizard Harriet Porber And The Bad Boy Parasaurolophus: An Adult Romance Novel” Yes. That’s right. And you can get it on amazon too. https://www.amazon.com/Trans-Wizard-Harriet-Porber-Parasaurolophus-ebook/dp/B08B4BJNB3 #293 in books. #4 in fantasy romance. Yes that’s right, this book significantly outsold hp:mor.
To be fair, Chuck Tingle is a professional with an actual career.
The market has spoken. And it has ruled that chuck tingle has more utility than yud. Therefore the utility monster will recycle yud into more chuck tingles.
I want to read this to Rowling.
>Yes that’s right, this book significantly outsold hp:mor. Err - it was never for sale, as that'd be illegal, because it was fan fiction.
So that’s what I thought. Someone is selling “used” copies on amazon though.
You got reported for the “hearsay” thing. Whoever did it said “we can do better” - that obnoxious “we” of the collective hive mind of SC again. But I can absolutely vouch, if only second-hand (but not third-hand or fourth-hand: second-hand), for the accusations.
...I feel like there's a hand-wringing rationalist getting the vapours over the fact that I was "uncharitable" in that write-up. To them, I say: study humor. I also thought it would be in poor taste to link to a suicide note in a discussion of Harry Potter fanfic. Because even though it does describe the rape culture of Bay-Area Rationalism, it doesn't specifically call out EY.
> consider lurking more. On my way to doing just that because I clearly have research work cut out for me here. This was pretty illuminating though, thank you!
If you want to read more about the connections between neo-reaction, the alt-right, and lesswrong, you could read 'Neoreaction a Basilisk: Essays on and Around the Alt-Right' by Elizabeth Sandifer; she has basically written the only longer exposé of this phenomenon. A lot of people here like the book. (Personally I found it a little bit hard to read with the literary allusions, but im a gamer). Also a small warning: a lot of people read the book and then their takeaway is that Moldbug, Land and Yud are all alt-right, which they aren't imho; the first two are neo-reactionaries, and Yud is ... well, himself. E: The connection between neoreactionaries (also called NRx) and lesswrong/slatestar/themotte is also weird and complicated, as they don't really like each other that much, but they do all often interact. And while a lot of NRx speak was banned (often for short periods) on lw/ssc in the past, recently this all seems to be more accepted (either because these communities shifted towards being NRx, the NRx are [hiding their power level](https://www.urbandictionary.com/define.php?term=Hide%20Your%20Power%20Level) better, or because there simply aren't that many NRx people around anymore (or they moved)). E2: also don't forget that even we here can be wrong; we obviously are biased to see bad actors in the broader rationalist sphere.
As a gamer myself I think my skinner box-ruined brain would probably have some difficulty getting into it too, but I'll give it a try. Thanks for the rec
[deleted]
thankfully he didn't write it into his fanfic, which is why I didn't include it
I would suggest browsing the relevant people and organizations on RationalWiki.
Any particularly good starting point or should I just go for scott alexander and read from there?
Him and Eliezer for sure are good starting points!
Cheers, looks like I got a fun couple days of night reading ahead of me.
[https://web.archive.org/web/20010204095400/http://sysopmind.com/beyond.html](https://web.archive.org/web/20010204095400/http://sysopmind.com/beyond.html)

[https://web.archive.org/web/20010213215810/http://sysopmind.com/sing/plan.html](https://web.archive.org/web/20010213215810/http://sysopmind.com/sing/plan.html)

[https://web.archive.org/web/20010606183250/http://sysopmind.com/singularity.html](https://web.archive.org/web/20010606183250/http://sysopmind.com/singularity.html)

[http://web.archive.org/web/20101227203946/http://www.acceleratingfuture.com/wiki/So\_You\_Want\_To\_Be\_A\_Seed\_AI\_Programmer](http://web.archive.org/web/20101227203946/http://www.acceleratingfuture.com/wiki/So_You_Want_To_Be_A_Seed_AI_Programmer)

[https://web.archive.org/web/20010309014808/http://sysopmind.com/eliezer.html](https://web.archive.org/web/20010309014808/http://sysopmind.com/eliezer.html)

[https://web.archive.org/web/20010202171200/http://sysopmind.com/algernon.html](https://web.archive.org/web/20010202171200/http://sysopmind.com/algernon.html)
Eliezer Yudkowsky was a self-described "child prodigy" who was heavily influenced by time spent, in the late '90s and early '00s, on a transhumanist mailing list called "extropy". (Transhumanism is, broadly speaking, the aim of using technology to improve humanity by such means as, for example, turning everyone into cyborgs or developing life extension technology. There's a Wikipedia article.) He became a prominent personality on the extropy list and eventually wrote a large document about creating "friendly AI", a superhumanly intelligent machine that is compatible with human life i.e. will not just kill or enslave everyone in order to fulfill its own alien, inhuman goals. He founded the "Singularity Institute for Artificial Intelligence", an organisation devoted to ensuring that, when AI is eventually created, it is "friendly". He wrote a lot of wacky stuff around this time. A lot of this writing he has now disavowed, but he is still big on the "friendly AI" idea. [Here's a link to one page from around 2000](http://web.archive.org/web/20011123134822/http://sysopmind.com/tmol-faq/miscellaneous.html). (The Singularity Institute has since been renamed to the Machine Intelligence Research Institute. "Singularity" refers to the science-fictional concept of developing a smarter-than-human AI, which can then develop an AI smarter than itself, and so on, with this rapid increase in intelligence causing an enormous increase in the general level of technology.) He joined a blog called "Overcoming Bias", where his co-author was economist Robin Hanson. Hanson is, in general, a deeply weird guy who tends to have pretty oddball libertarian beliefs. Yudkowsky's posts on the blog tended to be about things like teaching yourself to be smarter (by, say, *overcoming* cognitive *biases*). Eventually, Yudkowsky started his own blog called "Less Wrong" where he shifted to more of a focus on his idiosyncratic views on robots and quantum mechanics. 
He wrote a series of very lengthy posts called "the Sequences" where he goes into his beliefs in great detail. Less Wrong built up a pretty substantial community, who came to call themselves "rationalists". It was also a group blog that ran on the reddit software, so anyone could make an account, make a post, and the post would be voted on by other users. The heyday of Less Wrong was maybe some time around 2008-2012ish? It was over this period when Yudkowsky was writing his Harry Potter fanfic in an effort to promote and publicise his views. One very popular user on Less Wrong was Scott Alexander, who wrote many posts for the blog. He eventually started his own blog, called Slate Star Codex. Less Wrong eventually died down and Yudkowsky stopped making new posts, while Slate Star Codex grew more popular. The comment section there developed a reputation in certain circles as being a bit... soft (to put it charitably) on certain forms of right wing extremism related to novel variants of neo-fascism and scientific racism. (Scott Alexander has denied that the comment section was generally right-wing. By his self-descriptions, he is a libertarian-minded Clinton/Biden voter. If the topic of race and racism comes up, he will vaguely hint that he believes the scientific establishment is suppressing the truth about connections between race and intelligence.) /r/slatestarcodex is a subreddit for fans of the blog. /r/TheMotte was created to move "culture war" topics out of the main subreddit. /r/SneerClub is named after a Yudkowsky comment about people whose main mode of engagement is "sneering" at things. He was talking about a forum that didn't like his Harry Potter fanfic. Quote is in the sidebar. 
----------- Here are some interesting quotes from Eliezer Yudkowsky (he does not necessarily stand by all of them): > I think my efforts could spell the difference between life and death for most of humanity, or even the difference between a Singularity and a lifeless, sterilized planet […] I think that I can save the world, not just because I'm the one who happens to be making the effort, but because I'm the only one who can make the effort. # > Striving toward total rationality and total altruism comes easily to me. […] I'll try not to be an arrogant bastard, but I'm definitely arrogant. I'm incredibly brilliant and yes, I'm proud of it, and what's more, I enjoy showing off and bragging about it. I don't know if that's who I aspire to be, but it's surely who I am. I don't demand that everyone acknowledge my incredible brilliance, but I'm not going to cut against the grain of my nature, either. The next time someone incredulously asks, "You think you're so smart, huh?" I'm going to answer, "*Hell* yes, and I am pursuing a task appropriate to my talents." If anyone thinks that a Friendly AI can be created by a moderately bright researcher, they have rocks in their head. This is a job for what I can only call Eliezer-class intelligence. # > If you don't sign up your kids for cryonics then you are a lousy parent. # > I am tempted to say that a doctorate in AI would be negatively useful, but I am not one to hold someone's reckless youth against them – just because you acquired a doctorate in AI doesn't mean you should be permanently disqualified. # > If you haven't read through the MWI sequence, read it. Then try to talk with your smart friends about it. You will soon learn that your smart friends and favorite SF writers are not remotely close to the rationality standards of Less Wrong, and you will no longer think it anywhere near as plausible that their differing opinion is because they know some incredible secret knowledge you don't. 
# > I must warn my reader that my first allegiance is to the Singularity, not humanity. I don't know what the Singularity will do with us. I don't know whether Singularities upgrade mortal races, or disassemble us for spare atoms. While possible, I will balance the interests of mortality and Singularity. But if it comes down to Us or Them, I'm with Them. You have been warned. # > I would be asking for more people to make as much money as possible if they're the sorts of people who can make a lot of money and can donate a substantial amount fraction, never mind all the minimal living expenses, to the Singularity Institute. > This is crunch time. This is crunch time for the entire human species. […] and it's crunch time not just for us, it's crunch time for the intergalactic civilization whose existence depends on us. I think that if you're actually just going to sort of confront it, rationally, full-on, then you can't really justify trading off any part of that intergalactic civilization for any intrinsic thing that you could get nowadays […] > […] having seen that intergalactic civilization depends on us, in one sense, all you can really do is try not to think about that, and in another sense though, if you spend your whole life creating art to inspire people to fight global warming, you're taking that 'forgetting about intergalactic civilization' thing much too far. # > Find whatever you're best at; if that thing that you're best at is inventing new math of artificial intelligence, then come work for the Singularity Institute. If the thing that you're best at is investment banking, then work for Wall Street and transfer as much money as your mind and will permit to the Singularity institute where [it] will be used by other people.

The post was honestly an earnest and heartfelt attempt by a huge HPMOR fan to seek works that would help them develop empathy. The posters, for the most part, were deeply caring and helpful, if occasionally misguided.

But HPMOR kicked open the doors for a whole bunch of mediocre power fantasy masquerading as profound, while also glomming onto stories like Worm and trying to fit them all into a certain box. And after some time, I suspect that box is to take a setting and remove the inherent wonder of the premise, replacing it with a poor imitation of Carl Sagan’s wonder for outer space (or, if you’re Eliezer and you’re writing porn, economics fetishism, so I’m told).

tangentially, it confused me that Helen DeWitt wasn’t more popular among the self-styled “rationalists” until I realized

  1. she’s a woman

  2. who doesn’t really write about submissive feminine sexuality (the way these dudes lose their minds over Jacqueline Carey is, by contrast, intensely cringe-inducing)

  3. and whose work presupposes some knowledge of or curiosity about wider culture that isn’t, like, anime weeb shit

  4. and also presupposes some desire on the part of the reader to be moral in addition to being smart

  5. finally, she isn’t wordy enough and doesn’t use enough unnecessary science-jargon for that audience

Ooh, thank you for the recommendation lmao
I really love *The Last Samurai* by DeWitt and reread it every year or so. She's very good at depicting the irrationality of human beings (her protagonists/narrators not excepted) without straying into "and then Rationalism Won and everyone clapped!!!1!!"

I really liked HPMOR back in the day, some time after I went through an "am I a sociopath" phase, which was more of a The Last Psychiatrist thing for me. But I think the best thing you can do is actually recommend books there. Maybe Blindsight?

To be charitable to the AI cult, and ignoring the sexual assaults, it works a bit like Extinction Rebellion in that it’s organized around a fair concern, but they have a blind spot that invalidates the whole endeavor: a refusal to consider leftist solutions and thus to realize that the roots of the threats are in capitalism itself.

sometimes I genuinely miss The Last Psychiatrist, who could at least land a joke
A friend of mine and I have often compared him to Scott Alexander, although TLP had a much better ratio of insight to crankery, and TLP was much less... whiney.
That's a pretty big thing to ignore, but-- they were meant to become the resistance against skynet but instead they became nazis. Blindsight's probably a good option. The commenters gave the standard recommendations like Brandon Sanderson. Honestly the most effective suggestion there probably would've been literary fiction. EDIT: let me just rephrase that with qualifiers since someone was justifiably upset about me painting a broad group as 'Nazis.' The people who read HPMOR and the Sequences were supposed to become the resistance against Skynet. However, a small minority, but one large enough that Eliezer Yudkowsky has expressed discomfort that they admire him so much, have adopted views that, if not literally the tenets of National Socialism, are similar enough to make a casual onlooker both deeply uncomfortable with the people and comfortable with calling them Nazis -- such as "Human Biodiversity" or unironic support of "The Bell Curve".
> they were meant to become the resistance against skynet > > but instead they became nazis. fuck yes, I am going to steal and apply this, thank you
One interesting thing I heard today was that the zombie survival genre had a lot to do with spawning the boogalosers. I feel like this is a very similar case of Chekhov's Doomsday Cult. They will find the doomsday wherever they can.
It's not mine, it's from somewhere on this sub I could've sworn you came up with it.
> Honestly the most effective suggestion there probably would've been literary fiction. Uh, I think you mean earthfic
Yo, I think it’s not cool to call a group that has a much higher percentage of Jews than the general population, and a group started by a Jew, “Nazis”. And they seriously *aren’t*. The in-person rationalist community might have some libertarian leanings and discuss some right-of-center stuff, but they definitely aren’t Nazis; they contain a LOT of people who had relatives exterminated by, ya know, actual Nazis. I’ll make fun of them as much as the next sneerer, but come on, Nazis they ain’t.
yes, you're certainly right that the rationalists themselves are not nazis. however, lots of their off-shoots are so right of center that they may as well be Nazis. Such as, for example, neoreactionaries. and I am well aware that Eliezer Yudkowsky is immensely frustrated by the fact that there are basically Nazis in his fanbase.
Neo-reactionaries are not an “off shoot” of the rationalists, they were an independent thing that later formed some overlap on some websites
Fine. They're intertwined online. Deeply, deeply intertwined. Rationalists aren't Nazis. But plenty of people online, whose only complaint about being called 'Nazis' is the historical taint, try to act like rationalists.
You are aware somebody was trying to recruit people for his 'blood and soil' homesteader group in themotte? As in, he asked specific people 'hey, you look like you would be a good fit for our private voat/reddit group' (which was a blood and soil homesteader group). I don't get why this was done via a public comment and not via a direct message, but it was done. And the weird part is that this (and the rest of the 1488 stuff) causes no big reactions from the more important people in the community. Now you have people self-admitting the culture war thread radicalized them, etc. But people act like everything is fine and it is actually sneerclub who is the real bad guy. I get that it hurts for the jewish people, and I'll try to be careful in not calling them nazis, but that doesn't change the fact that neonazis see themotte as a tool for their ends. E: and in regards to the NRx, some of the neoreactionaries are hard at work trying to convert the celebs of rationalism to their side. And with Scott saying more and more things that might be interpreted as being neoreactionary (taking the kolmogorov option into account), and the other Scott having personal correspondence with Moldbug, this is all pretty worrying. The principle of charity only extends right in the rationalist sphere.
Most of my exposure to the rationalists is in person, living in the Bay Area and being one, and I’ve never looked at the motte subreddit besides a few things linked here, so I wasn’t aware of that. That sucks. I think views from the in-person community of rationalists and the online outgrowth of SSC differ by quite a lot. IMO, the in-person community is far too *liberal*, not right. For example, Alicorn gender-transitioning her 3-year-old toddler and nobody, including their housemate Scott Alexander, bats an eye about it. On the contrary, it’s more like perversely celebrated. If you look for counterexamples to your statement on the principle of charity only extending right, you’ll find them. I don’t know why Scott Alexander is so accommodating to neoreactionaries, but I always cringe when people call him a Nazi or inspiring Nazis because he is also Jewish and again, definitely not a Nazi.
I can only judge them on their web presence, and Scott does have a tendency to be liberal, but only if the liberal people are nice; see him talking about how gay people would have gotten rights sooner if they weren't aggressive about it (or at least arguing that the aggression part didn't help them (which is prob just because Scott himself is so very much into non-confrontational conversations)), and how he thinks that Black people don't get bad treatment from the cops. I mean, you can be liberal on trans issues and still believe IQ is real etc. And well, I think Scott (if he isn't a secret HBD person) just really believes in conversations fixing things, and that is why he allowed people with slurs in their names to call for the death of all cucks in their imagined fascist state on reddit (this user is now banned btw, but the longgg post about this ideal state was labelled a quality contribution (this person had so many more shit posts btw, it never ended)). I also think he is way way way too busy to actually read what goes on in the CW threads, and just sometimes drops in and goes 'nah, this isn't that bad(*)', and sneerclub is of course a collection of the worst hits of themotte/etc (often also about the same few bad posters). As we say in Dutch, soup isn't eaten as hot as it is served. (and we can just be simply wrong about things) Also, from what I read about SSC in various very racist places on the internet, the racists don't like Scott himself that much; a bit of a case of 'themotte/ssc is a good place to redpill people even if Scott is jewish'. There is also the problem that people say nazi as a catchall for neo-nazis/nazis/fascists/people who have no chin/HBD/race realists/NRx/people way too into hentai/etc. It is, if you allow me a 'both sides', the same way rightwingers use communist/socialist as a derogatory term for everybody on the left.
I try to be precise in this myself, but sometimes me typing things and thinking about other things at the same time fails, and I express myself horribly. *: esp as Scott doesn't like certain types of more ... SJW activism, which were called out a lot in the culture war threads. E: I'm sorry I don't have more links to certain events here btw. Looking all this up would take forever, and in some cases (the blood and soil thing earlier) the posts are already deleted. (I have also already deleted a lot of my old posts myself because I'm afraid of a backlash after people reacted a bit nutty after the whole doxing-Scott-by-the-NYT thing).
Alicorn is a terrible human being (and, contrary to your apparent belief, *pretty fucking reactionary*) but she is also not "gender-transitioning her 3 year-old toddler", she is accepting her toddler's own account of their gender, which is actually the appropriate thing to do in that situation.
Ridiculous
I loved hpmor when it started and I thought it was just a comedic deconstruction. Then the author Got Serious and while I still like bits of it, as a whole it has so many flaws.
I’m unconvinced that the project of eliminating one’s biases via dubious ontological claims about human nature and the scientific image is a “fair concern” on the level of XR’s

What are books that a) fulfill the request but b) are extremely upsetting to jr hitlers?

I’ll start by suggesting Gideon the Ninth and The Starless Sea.

E: Lord of the Rings and Children of Hurin, too, as great IT’S A TRAP! options. Sam and Frodo’s journey is the opposite of a power fantasy.

They also can’t deal with Becky Chambers, heh. I identify as a spiritual Exodan.

*The Last Samurai* by Helen DeWitt fits IMO. I'm looking at my bookshelves and it occurs to me that most of the "rationalist" types remind me of the narrator of *The Eye* by Nabokov.
...really? i'm reminded of humbert humbert /s
"narcissistic unreliable narrator" is a lot of Nabokov's novels tbf
yeah but never forget that eliezer yudkowsky once unironically pulled the 'askhually it's ephebophilia' line
oh I'm aware, lol the problem with *The Eye* is that it's difficult to discuss without spoiling it, but I do think it's the more apt comparison (not least because Humbert Humbert needs a certain level of social competence to pull off his shit)
gotcha. Haven't read it yet. Will add it to the list.
it's short! 80ish pages
*takes notes*
I also wish I had the free time to sort out my thoughts about the resemblance today's "rationalist" bears to the "superfluous man" of 19th-c. Russian literature
Maybe we should break this discussion out into its own thread, it's going good places.
if my insomnia gets really bad one night maybe I'll write up an nsfw post on the superfluous man thing
do it do it do it
Hi I’d like to sign up for your newsletter
I gotta imagine The Left Hand of Darkness would enrage them on every front, as well as being great. It's more about learning empathy with a truly alien experience than starting with it, I think, which might make it even better.