r/SneerClub archives
Sequences classic of the day: Sparkly Elites. "these CEOs and CTOs and hedge-fund traders, these folk of the mid-level power elite, seemed *happier* and more *alive*." my dude have you never heard of cocaine (https://www.lesswrong.com/posts/CKpByWmsZ8WmpHtYa/competent-elites)

Having worked in a financial institution far away from the cocaine & party side of the business, I can promise you that sober day traders would lead you to the exact opposite conclusion about who is full of energy and enjoying life.

we know from FTX that rationalists stick to adderall but at 10x therapeutic dose
I'd also argue that making decent/good money makes people seem happier and more alive.
Not having to worry about whether you can afford food/rent/clothes/replacing or repairing your tech etc. tends to help, yeah.

Related sneer: he speaks approvingly of Paul Graham’s writings. The funny thing is that Paul Graham has desperately tried to hold onto his “I’m a real software hacker, not just a VC” image over the years, to hilariously bad effect. His writings have trashed his reputation within the community to the point where he no longer gets positive reviews on Hacker News, the forum run by his own VC firm.

Of course Yud is impressed by him.

PG has a few good articles, I think, but I haven't read most of his stuff or kept up with anything from the past several years. What's he done? I'm in the mood for some "hello fellow kids" silliness lol
This is one of the most paradigmatically sneerworthy PG takes: http://www.paulgraham.com/conformism.html

An okay sneer from Robin Hanson, of all people: “Yes, people at the top are good at giving other top people the impression they are smart.”

I once saw Eric Drexler present an analogy between biological immune systems and the “active shield” concept in nanotechnology, arguing that just as biological systems managed to stave off invaders without the whole community collapsing, nanotechnological immune systems could do the same. I thought this was a poor analogy, and was going to point out some flaws during the Q&A. But [VC guy] Steve Jurvetson, who was in line before me, proceeded to demolish the argument even more thoroughly. Jurvetson pointed out the evolutionary tradeoff between virulence and transmission that keeps natural viruses in check, talked about how greater interconnectedness led to larger pandemics—it was very nicely done, demolishing the surface analogy by correct reference to deeper biological details.

OK, it shouldn’t take a lot of work to dismiss Drexlerian hype, but … nothing Yud reports Jurvetson as saying actually demolishes the analogy. Jurvetson’s first point is just restating Drexler’s (an “evolutionary tradeoff” is an ecosystem “manag[ing] to stave off invaders without the whole community collapsing”). If you presume that nanotechnological immune systems will be like biological ones, then the evolutionary tradeoffs of the latter will apply to the former. Likewise, whatever makes biological diseases worse will do the same for nanotechnological ones. Saying that the limitations of one kind of system will be the limitations of the other is still using the analogy, not invalidating it.

This is just vagueness meets vagueness, as reported by someone who got more of a stiffy from the second instance.

People still talk about Drexler at all?
Well, Yud was writing (checks post date) the month that Lehman Brothers collapsed, so maybe Drexler was more relevant back then.
oh yes, i forgot how exquisitely timed this article was
He works on AI alignment now.
Oh god WHAT
https://www.alignmentforum.org/users/eric-drexler He's on the fairly sane side, arguing that a safe suite of diverse, highly capable AI services is a more feasible goal than building the benevolent AI god.
He's on his way to being another Teilhard de Chardin: a part of the fundament mostly forgotten in its specifics.

I would be absolutely flabbergasted to read this without knowing big yud like this sub does, but rn it’s just fucking hilarious

This whole essay is a good counterpoint to his sneer about the value of formal writing instruction in that other post.

Like what the fuck are you talking about, Yuddy: “But I am an independent scholar, not much beholden. I should be able to say it out loud if anyone can. I’m talking about this topic… for more than one reason; but it is the truth as I see it, and an important truth which others don’t talk about (in writing?). It is something that led me down wrong pathways when I was young and inexperienced.”

The very first comment quotes Yud saying, "This, I suspect, is one of those truths so horrible that you can't talk about it in public", and then adds: "Charles Murray talked about in 'The Bell Curve.'"

Clicking around a bit brought me to this weird remark by yud, in which he basically argues that all trolls want to drive people to suicide, and that is why we cannot ever publish conscious computer systems. It quickly devolves into ‘achtually 4chan doesn’t like animal abusers’ (which iirc is true).

(And while 4chan is often evil, and it should have been turned off when the nazis took over via /pol/, he's making them out to be some sort of superintelligent evil, while most of the 4chan trolls' actions are just troll actions that trick other 4channers, not some sort of hyperdimensional chess).

Somebody please think of the feelings of the SIMS!

As if his aligned agent, sitting on a supercomputer, controlled with boxes and probing and memory wipes, cajoled into "value alignment" without a will or agency of its own, wouldn't be living its own particular hell. The 4channers just want a waifu; this dude wants a god. Of course that raises the question: I thought the problem was *all* intelligent agents *immediately* do recursive self-improvement, so these hell agents would in fact not exist by that very contention.

I was curious to see what this supposed 200 IQ VC had accomplished. From here: https://en.wikipedia.org/wiki/Steve_Jurvetson:

  • Theranos

  • SpaceX

  • Tesla

  • D-Wave

  • Sexual harassment allegations

Checks out

gotta force the EY-is-Comic-Book-Guy buffoon meme https://voca.ro/13h0QA5ZFDKI

and have a merry Dickensian christmas y’all.

I love these so, so much

Where exactly is this along the timeline of Yud settling down in a career as a professional bootlicker?

Now I’m picturing 16th century Yud as a fervent alchemy crank who self-publishes pamphlets defending the divine right of kings.

he'd started getting Thielbux and moved SIAI, as it then was, from Atlanta to the Bay Area, so I'm presuming this was from his rich, vaguely fashy nerd contact list

I mean you get out of these interactions what you put in, don’t you?