Think of it this way: imagine one of the men you most greatly admire in life receives large amounts of money in donations but has minimal responsibility, and has a constant rotating stable of fresh-faced idealistic college grads willing to do free labor and also fuck.
Wouldn't anything that involves more real-life responsibility be an existential risk to that lifestyle?
The definition of risk as including both good and bad things is present in some project management academic papers. Although to be fair I never knew project management academia was a thing until I accidentally signed up to study it.
like "fair and balanced" on Fox, this is to be understood not in the most ordinary sense, but in contrast to the hated Other, whose attempts at actually being "fair and balanced" and "unbiased" are, in the special universe occupied by Yud and Hanson and Tucker C., \*actually\* biased and balanced. easy pz
Snarking about the implications of worrying about something that will happen half a billion years from now aside, I really can’t get past the fact that this is:
Trying to make a model from one data point
Presuming (as is bloody typical) that colonizing other planets is as easy as puttering around and landing there, and that there won’t be overwhelming engineering, logistic, biochemical, and other barriers that prevent an alien civilization from ever extending beyond its solar system, much less colonizing the whole galaxy.
>we find a substantial ex ante probability of there being no other intelligent life in our observable universe
This is one of my favorite ideas just because it would crush so many sci-fi fantasies.
A bountiful universe all for Humanity's shaping has its own sort of romanticism to it. The aliens can always come from within; plenty of people already want to be anthropomorphic animals and whatnot.
>We should instead guess that eventually the universe will be mostly filled with civs, and thus one of the key constraints on the origin of any one civ is a need to pass a local great filter, going from no life to simple life to complex life to intelligence, etc., before some other civ arrives to colonize that area, and prevent new civs there.
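For concreteness, the constraint being asserted there can be written down as a toy simulation: would-be civs that originate later get preempted by the expansion bubbles of earlier ones. This is only a sketch with arbitrary made-up numbers (box size, expansion speed, candidate rate), not the actual grabby-aliens model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy version of the quoted constraint (not the actual grabby-aliens model):
# candidate civs try to originate at random times and places in a box; once one
# succeeds it expands at speed v, and any later candidate that falls inside an
# existing expansion bubble is blocked ("prevent new civs there").
box   = 100.0   # side of a cubic toy universe (arbitrary units)
T     = 100.0   # time horizon (arbitrary units)
v     = 1.0     # expansion speed (arbitrary units)
n_try = 2000    # candidate origination events

times  = np.sort(rng.uniform(0, T, n_try))
places = rng.uniform(0, box, (n_try, 3))

origins, births = [], []
blocked = 0
for t, x in zip(times, places):
    if origins:
        dists = np.linalg.norm(np.array(origins) - x, axis=1)
        if np.any(dists <= v * (t - np.array(births))):
            blocked += 1  # an earlier civ's bubble already covers this spot
            continue
    origins.append(x)
    births.append(t)

print(f"candidates: {n_try}, successful civs: {len(origins)}, preempted: {blocked}")
print(f"fraction of would-be civs preempted: {blocked / n_try:.1%}")
```

Under assumptions like these, cranking up the expansion speed or the candidate rate preempts nearly every later origin, which is the claimed constraint; whether we have any basis for guessing those numbers is the open question.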
So what I’m hearing is we can safely postpone this discussion as irrelevant for the next few million years.
From the comments
English isn’t my first language, but this seems like a very weird definition of ‘at risk’.
The virgin Drake equation
The Chad voluntary extinction
How many less stupid terms could he have chosen to describe his Space Americans? Expansionist? Imperial? Colonial?
Nah, gonna go with fucking “grabby” lmao. Top notch rationalist term-smithing right there.
>Snarking about the implications of worrying about something that will happen half a billion years from now aside, I really can’t get past the fact that this is:
These sorts of things are basically mysticism/gut hunches at this pt
If you want to read something actually insightful about the Fermi Paradox instead of whatever this is, try this https://arxiv.org/abs/1806.02404
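The core move in that paper, roughly: instead of plugging point estimates into the Drake equation, propagate the (enormous) uncertainty in each factor and look at the full distribution of N. A minimal sketch of that idea, with made-up illustrative priors rather than the paper's carefully sourced ones:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000  # Monte Carlo samples

# Illustrative stand-in priors for the Drake equation factors, sampled on a
# log scale because each is uncertain over orders of magnitude. These are NOT
# the exact distributions from Sandberg, Drexler & Ord (2018).
R_star = 10 ** rng.uniform(0, 2, n)     # star formation rate (stars/yr)
f_p    = 10 ** rng.uniform(-1, 0, n)    # fraction of stars with planets
n_e    = 10 ** rng.uniform(-1, 1, n)    # habitable planets per star with planets
f_l    = 10 ** rng.uniform(-30, 0, n)   # fraction of those that develop life
f_i    = 10 ** rng.uniform(-3, 0, n)    # fraction that develop intelligence
f_c    = 10 ** rng.uniform(-2, 0, n)    # fraction that become detectable
L      = 10 ** rng.uniform(2, 10, n)    # years a civ stays detectable

# Drake equation: expected number of detectable civilizations in the galaxy.
N = R_star * f_p * n_e * f_l * f_i * f_c * L

print(f"mean of N:   {N.mean():.3g}")        # the mean looks optimistic...
print(f"median of N: {np.median(N):.3g}")    # ...the median does not
print(f"P(N < 1):    {(N < 1).mean():.2%}")  # substantial chance of nobody out there
```

The mean of N typically comes out large while the median is near zero and P(N < 1) is substantial, which is where the quoted "substantial ex ante probability of there being no other intelligent life" comes from.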
Why should we guess this?