r/SneerClub archives
you'll be delighted to hear that Max Tegmark has solved geopolitics by pulling numbers from his backside and Bayesing them, then put the result on LessWrong, to the delight of all (https://twitter.com/tegmark/status/1578911288859987968)

D, this does not delight, this does not delight at all.

Good to know that the only way to avoid nuclear war is to let nuclear powers do whatever they want, this is definitely valuable insight.

Being against nuclear weapons does not mean you should treat nuclear blackmail as a legitimate tool of statecraft; in fact it means the opposite.
Ah, well, if they're not legitimate.
"DOES THIS NOT SPARK JOY??"
It sparks megatons of joy, rapture, and radioactive solidarity, of course!
Guy who launches the nukes: "YEET"
Good, bad, I’m the one with the YEETCULAR LAUNCH DETECTED NFT
He's not calling to let Russia do whatever they want; he's calling for de-escalation in cases where it doesn't weaken the West's military power. He gives examples of escalation he wouldn't want either side to do:

1. nuclear threats
2. atrocities
3. assassinations lacking military value
4. infrastructure attacks lacking military value (e.g. the Nord Stream sabotage)
5. shelling the Zaporizhzhya nuclear plant
6. disparaging de-escalation supporters as unpatriotic
He calls for de-escalation on day one; on day two Russia sends 75 cruise missiles at a civilian population, killing 11. Ukraine is reclaiming its rightful land and only going after military targets. How do we de-escalate in this environment? Giving Russia what it wants is the only "solution," and it's clearly not guaranteed to work. And it is up to the Ukrainians fighting for their land to make that choice.
I just provided a list of possible de-escalations. In particular, I can't imagine sabotaging Nord Stream 2 was a good idea. We don't know who did it, but I assume it wasn't Russia; it would make no sense for them to do it.
Russia (under Putin in particular) has a sordid history of attacking their own and then using it as a pretext to harm others (Ryazan 1999). Nord Stream 2 in particular only hurts Germany, but it seeds doubts and uncertainties in the minds of people like you.

Regardless. No amount of ceasefires or agreements will work. The Ukrainian people have been far too damaged by this invasion. Lives lost, families uprooted. Power vacuums fall into place. Ethnic or cultural lines are drawn. Attacks continue. The only actual peace is for the borders to be returned.

For sure, had Kyiv simply rolled over and let Russia come in and take over the government, a good portion of the deaths would have been avoided (though the rapes, kidnappings, and looting would still have been quite high). But the attack was repelled and they begged for help. This nuclear hostage-taking says "if they have nukes you must not help."

Chamberlain is famous for "peace in our time" because WWII broke out a few years later. What people who don't really know that history miss is that the 1938 Munich agreement *ceded land to Germany* as a concession. Guess what happened after? Germany took more land. There is no de-escalation.
> Nord Stream 2 in particular only hurts Germany, but it seeds doubts and uncertainties in the minds of people like you.

It definitely hurts Russia; they invested billions in building it, and trying to sell gas to Europe was one of their few areas of leverage. It's not impossible it's self-sabotage to try to sow doubts, but I don't think it's likely.

> Regardless. No amount of ceasefires or agreements will work. The Ukrainian people have been far too damaged by this invasion. Lives lost, families uprooted. Power vacuums fall into place. Ethnic or cultural lines are drawn. Attacks continue. The only actual peace is for the borders to be returned.

Which borders? Pre-2014, and return Donbas and Crimea to Ukraine as well? What do you do about the fact that a lot of Russians who want to stay in Russia live there now? They only live there because Russia shipped them there, but that doesn't change the reality that they are there.
Most of the new transplants have only been there 5 years or so; many just got there before the war began. They gotta go, either back to Russia or transplanted somewhere else within Ukraine proper. One thing's for sure: once Crimea is retaken they won't be slaughtered like Bucha or Mariupol.
Forced movement of civilians like that is ethnic cleansing, and that's generally not allowed.
Do you think it would be immoral if part of an Israeli peace deal involved expelling Israeli settlers from illegal settlements in the West Bank? Technically that would also be "forced movement of civilians". The point is that everyone who moved over recently has knowingly engaged in a deliberate act of settlerism and imperialism. It'd be a different story if they'd been born and grown up there, but it's been hardly any time at all.
Heh, 300k Crimeans were relocated... Tell me more about what is allowed. They just moved in; they will go either way.
If and when Russia makes a credible nuclear threat against the US conditional on it continuing to supply Ukraine then I think we should stop, but that doesn't mean Ukrainians have to stop fighting for their land themselves. I do agree that not giving in to threats of terrorism is generally right, as it takes away the incentive for (costly) threats against the same target in the future. But with nukes, there is no future for the target as a continuing entity if it gets nuked. When we repeatedly refuse to give into normal terrorists, they'll keep doing terrorism, eventually realize it's not worth it, and stop, but with nuclear threats, we'll just get nuked (again, assuming the threat is credible). However, I don't think we have a significant nuclear risk against the US yet; basically, I think Tegmark substantially overestimates Putin's willingness to nuke Ukraine if he's about to lose and NATO's willingness to respond to such an attack with a counterattack against Russia.
So Bret Devereaux wrote a little article about nuclear deterrence and the theory behind it: https://acoup.blog/2022/03/11/collections-nuclear-deterrence-101/ (check out "war under the umbrella" for the salient bit). Basically: we know nukes can lead to mutual annihilation, they know nukes can lead to mutual annihilation, both of us presumably have Red Lines where we will use nukes, but both of us are also trying to pretend our Red Lines are in a different place than they actually are. Russia's threats are attempts at projecting their Red Line, but the US doesn't think this attempt is credible. *They might be wrong about this* and we could all die, but it seems relatively reasonable thus far.

The numbers are so completely arbitrary as to have no value, even if the underlying premise is correct (dubious). If “David winning” means (presumably) a decisive, conventional Ukrainian victory, there’s a 30% chance of nuclear escalation? So Putin would wait to use the nukes until AFTER he’s lost, at which point he’s, what, using them out of spite? Does he twirl his mustache as he says “If I can’t have it, no one can, nyahahaha!” and villain-exits-stage-left?

But okay, sure, let’s imagine that happens, or let’s imagine this person was simply making a bad graph and what they meant was “DANGER OF David winning” leads to a 30% chance of nukes. So then there’s an 80% chance of a CONVENTIONAL NATO retaliation - literally saying an 80% chance NATO does anything at all, let alone proportional. What the hell? Putin invades the largely Russian portions of Ukraine and the West accepts billions of dollars in debt and a crippling recession in order to stop him, but Putin defiles the nuclear taboo for the first time and there’s a 20% chance that, what, Biden wags his finger and says that if Putin crosses another red line…? THERE IS NO FURTHER RED LINE TO CROSS BEYOND BREAKING THE NUCLEAR TABOO. There’s a 20% chance NATO is just cowed by Putin’s brazenness and backs down? Nothing here makes sense.

I’d argue strongly that Putin may be misinformed, cruel, selfish and treacherous, but he is well aware that if he uses nukes, he becomes Hostis humani generis. “We need to back down or the madman does a mad thing” stops working if he DOES DO THE BAD THING. Then there’s no further leverage - he is a proven threat to the human race and to the survival of many nations, including Russia. “80%” chance he is assassinated, either by any of the many, many intelligence forces who would like the world to keep spinning, or by his own men who can understand that he’s taken an express train to getting Moscow and St. Pete nuked.

Studying political science, especially international relations, is the worst decision one can make, because you understand that the vast, vast majority of people commenting on foreign affairs have less than zero insight and are following some Conan/He-Man playbook where any problem can be solved by merely applying more force.

> Does he twirl his mustache as he says "If I can't have it, no one can, nyahahaha!" and villain-exits-stage-left?

Presumably he would say, "Let him be king over charred bones and cooked meat. Let him be the king of ashes."
> So Putin would wait to use the nukes until AFTER he's lost, at which point he's, what, using them out of spite? Does he twirl his mustache as he says "If I can't have it, no one can, nyahahaha!" and villain-exits-stage-left?

I mean...he's a narcissistic dictator watching his legacy evaporate. People have done dumb spiteful shit for less. I have no idea what the actual probability of that is, but it's not *prima facie* insane to think Putin might not be exactly a rational agent here. He has obviously made some very irrational decisions recently, at the very least, in that he started and is repeatedly doubling down on an obviously terrible war that is ruining his country.

I do agree that if Putin did nuke Ukraine, the far more likely outcome is a targeted assassination than a retaliatory nuke, though. Everyone knows Putin himself is the problem, and such an insane move would make him an enemy even of his own people.
> I mean...he's a narcissistic dictator watching his legacy evaporate. People have done dumb spiteful shit for less. I have no idea what the actual probability of that is, but it's not *prima facie* insane to think Putin might not be exactly a rational agent here.

I agree completely! I don't think Putin is a perfectly rational agent - I think his actions are internally consistent but are based on a worldview that from our perspective is seriously warped and annoyingly biased. What I was mocking was the idea that he would *wait that long* and only then use nukes. You don't ask your boss for a raise after he fires you; if Putin is going to use nukes he will not wait until Ukraine has already won, which is what the graph seemed to imply.

Your second paragraph is part of what I was trying to get at down there in my other reply: there are a lot of moving parts that depend on domestic issues that we have no way of accurately assessing. Even most dictators are beholden at least to a small extent to the people, if only to ensure the people don't hate them enough to overthrow them. We don't know what the social situation is in a state where they're literally forbidden from calling it a war. And more importantly, we don't know *what Putin thinks it is*, which is more important than actual reality in predicting his moves.
I don't think Putin will use nukes at all, basically for the reason you mentioned: it would turn even his own people against him. But I think that if he *were* considering using nukes, presumably because he felt that a loss would be politically even worse for him, he'd still want to put it off for as long as possible (i.e., until he's about to lose), hoping that maybe the situation would change and he'd be able to win without nukes. The difference from the raise example you gave is that asking for a raise isn't normally an extremely dangerous action, and it's more likely to succeed if one does it when one is in good standing rather than when one is about to be fired, but nuking Ukraine risks escalation and domestic unrest, and it pretty much guarantees a win against Ukraine itself (even if the West counterattacks) no matter when it's done. Nuking sooner would result in fewer Russian troops lost (at least in the short run), but it would totally make sense for Putin to see this as less of a benefit than potentially avoiding nukes and the concomitant risks.
Yeah, I heard stories (no way to know if they're true or anti-Putin propaganda) that he looked at the developments with Gaddafi and others with horror and a sense of 'they are coming for me next'. [Good job, western intelligence agencies](https://www.thetimes.co.uk/article/mi6-regrets-helping-vladimir-putin-to-get-elected-says-ex-spy-chief-tbttxxljf)
> Studying political science, especially international relations, is the worst decision one can make, because you understand that the vast, vast majority of people commenting on foreign affairs have less than zero insight and are following some Conan/He-Man playbook where any problem can be solved by merely applying more force.

As somebody who has read some OG Conan the Barbarian, all this talk of political science sounds like womanly wizardry. Taste my steel, fiend!
> If "David winning" (presumably a decisive, conventional Ukrainian victory), there's a 30% chance of nuclear escalation?

If Ukrainian total victory seems imminent, what % do you give to Russia turning to nukes?

> So then there's an 80% chance of a CONVENTIONAL NATO retaliation - literally saying 80% chance NATO does anything at all, let alone proportional.

If Russia does use nukes on Ukraine (not part of NATO), what % chance is there that you think NATO uses nukes on Russia?
Listen, I'll reply even though I'm not sure this wasn't just an attempt at a 'gotcha'. *I don't give percentages, because I fundamentally disagree with that approach.*

Political science has become a lot more quantitative in the past couple of decades, and I had to take a buttload of statistics in grad school, but it doesn't apply everywhere. You want to do a multivariable regression looking at how some factors impact literacy? The efficacy of a government program in reducing poverty? Sure, for most domestic issues you can (and should) quantify your arguments and base them on data. Even in IR, it works sometimes - foreign aid is very quantitative, as are development issues. Anything econ, basically.

But absolutely not for something like this. There is no sample size - this is the first time a nuclear power has declared an expansionist, imperialist war on its neighbor in Europe in this millennium. Until the very end most non-pro-Russian analysts strongly doubted this could even happen. We have no point of comparison to even attempt to accurately model an outcome. All numbers, regardless of whether you agree or disagree with them, will be arbitrary because we are facing a very strong fog of war.

What does this mean? It means that we can't model anything because we have bare-bones, semi-accurate information regarding the situation. Even if we knew exactly how much artillery, armaments, supplies, etc. etc. Putin has, and how much of that is in working order, that's still a tiny fraction of the puzzle. How trained are his troops at actually using them in a combat (i.e. not training, parade or military Olympics) setting? How many **working** nukes does he have? What is the status of his delivery systems? What is the morale situation like? We've gotten highly conflicting reports. What is the situation in the Kremlin? Among the oligarchs? Among the military leadership? The mercenary factions? The local almost-warlords like Kadyrov? In the press, judiciary, police?

With the advanced disinfo campaigns and Putin's fairly successful attempt to stamp out open dissent, we genuinely have no way of accurately gauging how much pushback he's facing privately. Maybe Medvedev and Shoigu and everyone else are breathing down his neck because Russia is on the verge of systemic collapse. Or maybe they're fully aligned and think Putin is doing the right thing (maybe the only course of action left). It's nonsensical to try to give arbitrary numbers with any confidence when the amount we do not know dwarfs what we do know. Hell, the Russians themselves faced this issue early on, with credible reports that there were a number of cases where instead of crates or hangars full of artillery and supplies, the Russian troops found empty space because a local quartermaster had sold it and embezzled the profit.

You ask me what the odds are that Putin will launch nukes when there's no way of telling how many he has, and in what condition they are? Maintaining a nuclear arsenal is incredibly expensive, time-consuming and requires expert care. The US fucks up fairly often considering how much it invests (you can read up on that if you'd like), so I don't think it stretches credibility to assume Russia has a non-zero amount of faulty nukes in its arsenal, and that would certainly impact Putin's calculus. But again, maybe he doesn't know!
A common piece of analysis is that he's surrounded himself with yes-men who actively hide uncomfortable truths from him, so maybe they will keep saying things are working fine until the last moment, and Putin ends up looking like Wile E. Coyote when one blows up in its silo.

This whole need to place numerical values on incredibly hazy and uncertain courses of events strikes me as strongly counterproductive. You know about the anchoring effect - you will literally have a harder time moving away from thinking of that number as a baseline even if it's proven wrong. What's the point of doing a regression if your R-squared is like 0.2, your confidence interval is both incredibly wide and incredibly low, and you have confounding factors up the wazoo? You can p-hack your way to anything if you're determined enough, as any researcher knows. The only way you can give a number and not have it be meaningless is if you have at least most of the pertinent data - which nobody outside of maybe the intelligence agencies does. Otherwise you're just playing; it's Garbage In, Garbage Out, only for nuclear war.

I fundamentally do not believe in overconfident statistical predictions based on minimal relevant information. I do not believe it's useful, and it can actively be harmful. At best it's a way to stroke your ego. You'll see a lot of twitter hacks do that, but if you look at the top echelon of IR analysts, they tend to use words more often, and for good reason.

Ironically, IR/security studies already went through this phase 50+ years ago. Robert McNamara tried to quantify the Vietnam War, and for months and months the numbers said America was winning - they were killing more and dying less, simplistically. And yet the war was no closer to ending. Because he looked at troop and materiel losses and wins, and ignored the fact that the Vietcong weren't perfectly rational androids who would do a cost-benefit calculation and realize they had a slightly higher chance at more consistent long-term GDP per capita (PPP) growth if they surrendered. They were *people*, who placed value on living in that particular bit of land and ruling over it the way they saw fit and on not being a post-colonial pro-US regime, and were willing to die for it a lot more than the Americans were willing to kill for it. One of the key insights in counterinsurgency over the past 20 years has been "never underestimate the willingness of people to fight for their homeland, it's practically a force multiplier" (yes, it took Afghanistan, Iraq, Libya and Syria for the Pentagon to understand this basic insight). So the issue was that he tried to give numbers to the war based on limited information and only looking at a small number of variables, and ran the war that way, and America lost. No algorithm or function can perfectly model the behavior of an individual human, let alone a regime.

McNamara did learn his lesson, eventually. Watch his movie, "The Fog of War". Better insight than all of the twitter and rationalist community combined. Pay particular attention to 'unknown unknowns', or what other disciplines call black swans.

I hope this has been useful in demonstrating why the hyperpositivist rationalist approach to muddy IR security issues is just nonsensical. I think the odds Putin uses nukes are low (and even then, what does that mean? A single small tactical nuke destroying some infrastructure with minimal civilian casualties is a far cry from carpet-nuking Kiev, and yet they both fall under "uses nukes"), but any number would be an asspull meant to simplify analysis for fast clicks and big likes. Don't trust people who do that. These are complex problems with a million variables involved and you can't boil them down to "if x, y% that z".
I'm a rationalist who believes in the value of these kinds of numerical predictions (though I'm pretty bad at them myself!), and I'll try to explain what I see as a problem with your argument.

You argue convincingly that we have tremendous uncertainty about the various factors that will affect Putin's options and decisions, and thus any model of the situation that we come up with should be recognized as shaky and subject to change dramatically (note that Tegmark does say that he intends to update his probabilities regularly, and he welcomes feedback from readers on his numbers, so he is at least trying to avoid becoming wedded to his particular model!). But I don't think you really explain why this means we should use qualitative models instead of quantitative models in this kind of situation. The radical skeptic could say that we shouldn't use either kind of model because there's so much haziness, and we should just cross all our bridges when we come to them (unless of course they're blown up), always focusing on the present. But I think we'd both agree that that's silly: we have a choice to make every day (whether to continue supplying Ukraine) with significant future repercussions, and we have a better chance of things turning out well if we make our choice based on our best (flawed) model of the future than if we throw up our hands and refuse to think about the future at all. If you agree with that, then it seems like you don't really mean that "we can't model anything" in this case because there's "no point of comparison" and any model would be "garbage in, garbage out" or that the only useful model is going to have to account for "a million variables," just that all this is true for *quantitative* models.

So, why are qualitative models different? You cite the anchoring effect, but this intuitively applies to both quantitative and qualitative models, and Wikipedia seems to confirm this. People are loath to give up their preconceived notions, whether quantitative or qualitative, and policy makers should vigorously guard against this, but I think it's possible to learn to largely avoid this bias, by constantly asking oneself questions that poke holes in one's own views and listening to others who disagree with you, which is largely what the rationalist community is about. If quantitative models really are especially risky for anchoring (but I don't see why they would be), then this justifies especial care to avoid anchoring when using them, but it's just fundamentally gross to me to use an inferior model just because we think we'll be bad at reasoning unbiasedly about one that would otherwise be better, rather than using the most plausible model and trying our darndest to reason correctly about it. Similarly, I don't think quantitative models are obviously more susceptible to bullshirting than qualitative models (in which one can often use various kinds of rhetorical tricks and emotional appeal to obfuscate one's core argument). And I certainly agree that Tegmark is no expert at predicting this kind of niche thing and thus shouldn't be given much weight by himself, but I don't think that means nonexperts shouldn't give thoughtful quantitative or qualitative arguments about big, complicated issues.
Laymen largely have no choice but to defer to experts that they deem trustworthy (who may not be the official "experts," depending on the issue), but many supposed experts don't know as much as they think they do or have different values from certain laymen, so there has to be a process for experts to earn and maintain credibility and trust. I think one part of this needs to be laypeople regularly discussing important big issues, necessarily on a naive and shaky level, but with the aim not of totally understanding them but simply of determining which experts (or schools of thought) are able to provide prima facie plausible answers to natural FAQs and seem to be credible and working for their values. For example, if simple models seem to suggest a high probability of nuclear war, and certain politicians and experts agree with this while the others don't seem to have a reasonable answer (incidentally, I don't think all this is actually the case!), then it would make sense for a layperson to trust the former more than he did before and consider whether the latter might be biased or blinded in some relevant way. In addition, there's just a lot of joy in thinking about interesting, important problems, and as long as one doesn't get too overconfident, I don't think there's anything wrong with regular people theorizing publicly about a complicated topic.

You also argue that the American failure in Vietnam was due to McNamara et al. being too quantitative, and I can totally buy that the particular models they were using were bad, but it's hard to see how this reflects on quantitative models in general. From your description, it seems like the issue was maybe more that their model was wrong about specific things: the Vietnamese didn't want a democracy if it was going to be under American influence, and they were willing to fight hard to prevent that, even at the cost of GDP. If they hadn't used numbers, why should we think they would have gotten those things right? Why is it easier to think about unknown unknowns with a qualitative model?

Perhaps you favor qualitative models that don't incorporate uncertainty (quantitative or qualitative) at all, by coming up with a single best guess as to what Putin will do in response to each one of our present options (maybe based on a certain standard IR theory)? But I don't see why it would be better to have a model that considers only one possible action by Putin in each scenario over one that weighs the different actions he could take. And if you want a model that incorporates different possible responses by Putin, then I think you have to either (a) use quantitative probabilities to express how likely (i.e., unsurprising) these different responses are under this model, (b) use qualitative probabilities ("almost surely," "plausibly," "quite likely," "a small chance," etc.) instead, or (c) avoid modeling any response in your model as more likely than any other.

To me it is clear that (b) and (c) are worse than (a). Qualitative probabilities say the same thing as quantitative probabilities, except less clearly. They're still worth using sometimes to avoid unnecessary detail, just like it's worth saying "I was a little late" rather than specifying the exact time, but when one is making a careful, important decision, it's crucial to communicate meaning clearly.
When I say that an event has probability A/B to me (i.e., under my best model of the future, as I don't think there's a single objective probability), I mean that to me the event is about as plausible as one of A specific contestants winning a random lottery with B contestants in all, which is a scenario that is pretty clearly defined. A random person will quite likely know what I mean if I say that there's a 60% chance that such-and-such will happen, but it's less clear if I just say it's "quite likely" (see what I did there?). In both cases, I could be full of feces, but I'm making a clearer, more falsifiable claim in the former case.

It really seems to me that in a context where one is modeling the future, the only reason to use the specific qualitative probability phrases I've been lampooning is when the precision of the model isn't really important (not true when the model is used to determine war strategy) or when one wants (or is part of a culture designed to allow one) to weasel away from having one's predictions checked. Now, you could use more precise phrases that aren't numerical, like "as likely as getting struck by lightning" or "as likely as three heads in a row," but I don't think either of us would prefer that: you'd presumably say that those are still bad because they're ultimately referring to numbers, and I'd say they're bad (in technical contexts) because many people don't really have intuition/knowledge about lightning or betting on coins, but people do talk about numerical probabilities all the time.

And what about option (c), to consider different possible responses Putin could take but not assign them any kind of likelihood? I think this is bad because it seems basically equivalent to assigning all possible Putin-responses equal probability, which seems worse than actually thinking about how relatively plausible the various responses are.

Tl;dr: I think predictive models should generally involve quantitative probabilities. Getting rid of them usually results in vaguer qualitative probabilities in their stead, which have all the same issues of relying on limited information, facilitating bullshirting, and leading to anchoring.
> But absolutely not for something like this. There is no sample size - this is the first time a nuclear power has declared an expansionist, imperialist war on its neighbor in Europe in this millennium. Until the very end most non-pro-Russian analysts strongly doubted this could even happen. We have no point of comparison to even attempt to accurately model an outcome. All numbers, regardless of whether you agree or disagree with them, will be arbitrary because we are facing a very strong fog of war.

The percentages are about making people put their money where their mouth is a little bit, and seeing how accurate people actually are at predicting the course of conflicts. Fox News, CNN, MSNBC, etc. get experts on all the time to talk about the likely outcomes of war, but they don't make firm predictions with percents, so you can't really know which experts are actually the best at predicting outcomes. If Max Tegmark gives percents, 5 years later we can see if 1/10 of his 10% predictions came true, if 2/10 of his 20% predictions came true, etc., and have a very easy way to see if he's full of BS or good at predicting.

Also, you typed a lot of words about how everything is fundamentally unknowable, and while I agree we'll have trouble getting accurate numbers, I think we still have to try. I don't know what the alternative is, besides just implicitly using numbers instead of explicitly using them. We have to be aware of what could provoke Russia to use nukes. Say we keep up the strategy of providing Ukraine with weapons and supplies, and it keeps working, and Russia faces total defeat. What happens next? Does Ukraine only take back post-2014 borders? Pre-2014 borders? Grab any additional land as punishment for the Russian invasion? Demand war reparations? Demand Russia reform its government and depose Putin? If there were a 0% chance of nukes, at the very least I think forcing the Russian government to reform would be good. But there might not be a 0% chance of nukes; that's what we have to think about and do our best to place a number, however inaccurate, on.
Why spend your time coming up with inaccurate numbers based on garbage info? This is an honest inquiry, since I can think of reasons *not* to do that, those being that not only will this lead to you making bad decisions now based on numbers completely detached from reality, it will also lead to you making bad decisions later, as you'll factor the garbage numbers into whatever future calculations you make when potentially useful information does come to light. If you absolutely want to think about the scenario, game out different outcomes a bit, so when things do start happening you can react in a thought-out manner. Just leave the probability function undefined as long as it's not useful.
> If you absolutely want to think about the scenario, game out different outcomes a bit, so when things do start happening you can react in a thought-out manner.

This sounds like it'd just involve thinking about probabilities, just without putting explicit numbers on them. I can game out a scenario like this:

1. Ukraine keeps winning, in the next couple months: a. Putin will not be deposed, b. Putin will be deposed.
2. Ukraine starts losing, in the next couple months: a. Putin will be deposed, b. Putin will not be deposed.

How we act now depends on what we think the probability Putin will be deposed in a couple months is. Say we think there's a decent chance of him being deposed: maybe then the West would want to signal that it'd be willing to make a peace without harsh reparations to Russia but that Putin must be removed, to further incentivize Russians to depose Putin. But if we think there is a low chance Putin will ever be deposed, signaling that would just make Putin dig in further, so maybe we signal that if Russia surrenders now we won't require Putin to step down.

Our numbers on whether Putin would be deposed would have to be inaccurate, but we have to make decisions on what to do today based on something; better to make decisions on inaccurate numbers than just pure vibes, since vibes are really just our intuitions about the numbers, only even more inaccurate.
> This sounds like it'd just involve thinking about probabilities, just without putting explicit numbers on them.

The point I'm making is that you don't have to put numbers on the likelihood of any of those outcomes to consider what your best response is if they come to pass. You, personally, can consider what you would do in each of the following scenarios:

- Ukraine retakes all land previously held, including Crimea; Putin does nothing.
- Ukraine as above; Putin reacts by escalating to nuclear warfare.
- Ukraine is winning the conventional war; Putin escalates before they fully achieve their objectives.
- etc.

You can do this without assigning probabilities to how likely you think they are. You can also consider which of these are acceptable risks to you (and then realize you have no influence over them anyway). If, at some point, you get information that allows you to make an informed judgement of the likelihood of these various possibilities, you can then make decisions informed by that as well. But if you preempt that information by assigning probabilities based on garbage, you are just increasing your risk of bad decisions.
> The point I'm making is that you don't have to put numbers on the likelihood of any of those outcomes to consider what your best response is if they come to pass.

But you can prepare before the actions come to pass. If you think scenario 1) has a 99% likelihood, you keep on the pace we're on now; no need to change anything. If you think scenario 2) has a 99% likelihood, you make sure to prepare doctors for how to treat radiation poisoning and make sure everyone knows what to do in the event of a nuclear strike. There's no point in training people for nukes if it won't happen, it'd just be a waste of time and money, but it's very important if it will happen.

Maybe you decide you can't come up with an accurate enough number on the likelihood of nukes to take action, that your margins of error are too large. But it doesn't hurt to try to come up with a number and then think about what your margins of error are.
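For what it's worth, the way a number plus a margin of error would actually drive that prepare-or-not call can be sketched as a simple expected-loss threshold. The cost and loss figures below are invented placeholders, not real estimates; the point is only to show when the error bars swallow the decision.

```python
# Toy prepare-or-not rule: prepare if p * loss_averted > cost_of_preparation.
# All figures are hypothetical placeholders.

cost_of_preparation = 50       # say, arbitrary units
loss_averted_if_nuked = 5000   # benefit of having prepared, same units

# Break-even probability: 50 / 5000 = 1%
threshold = cost_of_preparation / loss_averted_if_nuked

for p_estimate in (0.001, 0.008, 0.2):        # guessed probabilities
    verdicts = []
    for err in (-0.005, 0.0, 0.005):          # crude margin of error
        p = max(0.0, p_estimate + err)
        verdicts.append("prepare" if p > threshold else "don't")
    robust = "robust" if len(set(verdicts)) == 1 else "flips within error bars"
    print(f"estimate {p_estimate:.1%}: {verdicts} ({robust})")
```

When the verdict flips inside your own error bars, the number isn't yet doing any work, which is roughly the garbage-in, garbage-out objection made elsewhere in the thread.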
The point people are trying to make is that it does hurt to come up with a number based on garbage data. Before you bother trying to come up with an output, assess your inputs. You're also ignoring the other aspects of decision making - at the very least, you should be considering the impact of being wrong. Even if something is very unlikely, it may be worth preparing for.
> The point people are trying to make is that it does hurt to come up with a number based on garbage data. Before you bother trying to come up with an output, assess your inputs.

I think if you're aware of the likelihood that you're wrong, and aware of what you would do differently given different numbers, and those factor into your decision making, it's okay.

> You're also ignoring the other aspects of decision making - at the very least, you should be considering the impact of being wrong. Even if something is very unlikely, it may be worth preparing for.

I'd agree.
Again, you cannot estimate the likelihood of being wrong with garbage data, unless you want to assign it a probability of 1.

I was amused by this response in the replies:

Nuclear Forecasts and How to Judge Them

TL;DR - Ignore forecasts without good track records

Well, let’s see - first, just find a guy who’s been correctly forecasting that there won’t be a nuclear war for every year he’s been alive (surely there must be someone who’s done it!). Perfect track record, so anything else he says must be accurate, right?

(yes, yes, he’s saying “look for people who’ve correctly forecasted other events”. But…I don’t see why someone who accurately predicted e.g. some Covid numbers would therefore suddenly have their finger on the pulse of nuclear geopolitics.)

Some people have the forecasting gene.

[deleted]

Jayed Martin is going to go through life thinking people hate him because he’s a nerd
[Nice model.](https://www.youtube.com/watch?v=0MjpOZT6W9w)
I am biting off my fingers at the first knuckle reading that.
Ow god, that is it. I'm crawling back into the sea.

It’s literally a Gorka chart with made up numbers.

This could pass as modern art before it would make sense as a chart

Get me the UN! I mean LessWrong!

Goddamnit David I literally logged on to post this

Somebody give that physics dickhead a von Neumann award

The whole thing is nonsense, but I especially like the reasoning here:

> view it as highly unlikely (<10%) that Putin would accept “Vietnam” without first going nuclear, because it would almost certainly result in him being overthrown and jailed or killed.

With the dual assumption that a) nuclear blackmail is a viable war-winning strategy, and b) dictators don’t know how to hold on to power.

Or that there is even an alternative to Putin for Russia right now, given that he has killed all would-be successors or opponents.

Sigh, I thought Tegmark was OK when he was writing speculative metaphysics (what if it was all…like…mathematics, man?). Didn’t realize he was in the LW club, but not surprising.

been in it for years - mutual admiration society, then he got to pumping the Bad News about the AI to the Silicon Valley crowd. He has a very bad book called Life 3.0.
How did he get away with the title *Life 3.0* in 2017? That's, like, 20 years too late. A book called *Life 3.0* should be excerpted in *Wired* magazine and bear laudatory quotes from Ray Kurzweil on its hologrammed dust jacket.
https://i.imgur.com/hO93y7B.jpg “Relativity summed up”: Spanish American War -> Worm Hole -> “Cone of irrelevance” -> Tire wear. This is some time cube shit.

Good news for Max: global nuclear war, just like climate change, will not lead to the total extinction of the human race, so 10⁵⁸ simulated humans in the far future are still possible.

Anyway, making clear to dictators that might makes right and that nukes give you the ability to take and abuse non-nuke neighbours surely will not lead to bad incentives.

(I wonder if Max is also a 'this war was caused by NATO expansion' person, because that stance isn't really compatible with 'give the madman with nukes what he wants' (if you believe the latter you should be for more countries in NATO to prevent that)).

I’m just reminded of this Brass Eye bit, except that this is meant to be taken seriously.

Yeah, things make sense now... or not, if I am going to be honest. What a shit show Max has become. Also, from the Swedish article, there is nothing Max can say or do to get out of it; the evidence is there in 400K.