Many business leaders are disagreeable people who do grey things.
Uber’s activities were deliberately illegal in many countries and I
probably on balance support that.
ethical unethics
I know that my messing with prediction markets around this hasn’t
always gone well (sorry)
cos i know the first thing I think of when I hear about unethical
conduct is to set up a prediction market on when the details leak
Geoffrey Miller’s comment is as usual greatly enhanced by being
written by Geoffrey Miller. He blames this article on the “psychology of
comeuppance” and not on, say, EAs doing grossly unethical and damaging
shit.
What’s so bad about the “psychology of comeuppance”? it:
> amplifies moderately bad moral errors into looking like they’re super-bad moral errors.
because that’s the important distinction here, right
titotal points out the fucking obvious:
> We are in the charity business. Donors expect high standards when it comes to their giving, and bad reputations directly translate into dollars. And remember, we want new donors, not just to keep the old ones. I simply don’t see how “we have high standards, except when it comes to facilitating billion dollar frauds” can hold up to scrutiny.
> I’m not sure we can “credibly convince people” if we keep the current leadership in place. The monetary cost could be substantial.
the trouble with trying to recruit competent people:
> The most “ethical” (like professional-ethics, personal integrity, not “actually creates the most good consequences”) people are probably doing some cached thing like “non-corrupt official” or “religious leader” or “activist”.
> The most “bright” (like raw intelligence/cleverness/working-memory) people are probably doing some typical thing like “quantum physicist” or “galaxy-brained mathematician”.
> The most “epistemically rigorous” people are writing blog posts, which may or may not even make enough money for them to do that full-time. If they’re not already part of the broader “community” (including forecasters and I guess some real-money traders), they might be an analyst tucked away in government or academia.
Half of them are working in quant finance and want to be angel investors.
All of the podcasts they listen to are these libertarian behavioral economics guys.
It's the investor mindset turned up to max.
>you sure? I had 'em pegged as front-end JS developers
Lol.
I can't tell where the joke ends and the real question starts tbh.
I'm exaggerating. But:
People like Holden Karnofsky worked in investment before starting GiveWell
Bankman-Fried's father teaches tax law.
80,000 Hours recommends going into quant finance.
I know of some who went into quant finance.
There's a big overlap: using large data models to make predictions.
There's a big culture of investment podcasts that they like.
Even the coders and biology researchers are recommended to invest in a couple of stocks.
Effective Altruism is essentially quant finance applied to charity.
Ah, but you can buy your staking tokens with crypto as well -- you need to cash out for fiat first to buy the hardware for proof of work. (And this is a bad thing officially because that means you can't "unbank yourself from fiat", and unofficially since the off-ramps are disappearing.)
But why though? If these people think they're rational masters of predicting complex dynamic systems, I'd have thought surely they'd be all over making money off of "irrational" sports fans
You have a higher opinion of their coding/security skills than I have. But I'm not going to discuss stuff like that here; posting what you replied to was already a bit in bad taste. (And it easily gets misrepresented as us trying to do a concentrated attack on lw instead of an idle musing by me.)
> Amplifies moderately bad moral errors into looking like they're super-bad moral errors.
Wait so the allowing of sexual abuse and billions of dollars of loss are only moderately bad moral errors?
Somebody is trying to outdo the USSR in a red flag competition again.
ah yes, geoffrey "assumes facts not in evidence" miller:
>any 'moral activists' who promote **new and higher moral standards (such as the EA movement)** can make ordinary folks (including journalists) feel uncomfortable, resentful, and inadequate.
As usual, I love these dingdongs' false binary between ethics/activism and cleverness or intelligence. It's revealing and a great way to keep the root injustices of power structures out of your public analysis! If only those silly idealists were a little more intellectually meritorious, they might be worth listening to.
I’m actually pleasantly surprised at how many people are taking this
article seriously and at face value. Hand wringing aside, it seems like
a lot of them are talking about how this happened and not trying to
justify it, which is a positive step.
Yeah. I wonder if another chunk of people are going to leave the cult again, or if the denominations are going to split into "ethical EA" vs "long-term EA" or something.
edit: [shoutout to this guy in the comments](https://forum.effectivealtruism.org/posts/b83Zkz4amoaQC5Hpd/time-article-discussion-effective-altruist-leaders-were?commentId=iMK5esSwew4jgXnjo):
> I want to feel like I can trust the leaders of this community are playing by a set of agreed rules. Eg I want to hear from them. And half of me trusts them and half feels I should take an outside view that leaders often seek to protect their own power. The disagreement between these parts causes hurt and frustration.
> I also variously feel hurt, sad, afraid, compromised, betrayed. I feel ugly that I talk so much about my feelings too. It feels kind of obscene. I feel sad saying negative things, especially about Will. I sense he's worked really hard. I feel ungrateful and snide.
(Oof. The cult leader is not your friend, dude)
> This article moves me a bit on a number of important things:
> We have some more colour around the specific warnings that were given
> It becomes much more likely that MacAskill backed Bankman-Fried in the aftermath of the early Alameda disagreements, which was ex-ante dubious and ex-post disastrous. The comment about threatening Mac Auley is very concerning.
> I update a bit that Sam used this support as cover
> I sense that people ought to take the accusations of inappropriate sexual relationships more seriously to be consistent, though I personally am uncertain cos we don't have much information
> I still don't understand why they can't give a clear promise of when they will talk, and the lack of this makes me trust them less
> My gut says that Naia Bouscal is telling the truth, since before I knew her in relation to this, I thought she was a pretty straight shooting twitter account.
> I cannot deny that I am tempted to mediate my comments so that people will like me and probably do a bit
[BITE model](https://freedomofmind.com/cult-mind-control/bite-model/)
It's a cult, get out
It is absolutely fascinating seeing all the people in those comments who just take it for granted that behaving unethically is par for the course for Silicon Valley.
the comments here are very special
one of these things is not like the others