Rationalists spend a lot of time obsessing about prediction markets, even though I haven’t seen much evidence that prediction markets (especially the toy markets they prefer) do a particularly good job of getting at the truth. As a corollary, there is also a certain “betting culture” that often crops up in debate on rationalist websites: any argument can be pre-empted at any moment by the demand that you put down money on your position (typically at some absurd odds ratio that is guaranteed to be a financial disaster for the person accepting the bet).
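To make the “guaranteed financial disaster” concrete: at sufficiently lopsided stakes, even a bettor who is very confident in their position has negative expected value. A minimal sketch (the function name and numbers are illustrative, not from any actual bet):

```python
def bet_ev(p_win: float, stake: float, payout: float) -> float:
    """Expected value for the bettor: win `payout` with probability
    p_win, lose `stake` otherwise."""
    return p_win * payout - (1 - p_win) * stake

# Even at 90% confidence, risking $1000 to win $50 (20:1 against you)
# is a losing proposition: 0.9 * 50 - 0.1 * 1000 = -55.
print(round(bet_ev(0.90, stake=1000, payout=50), 2))
```

The point is that the odds ratio, not just who turns out to be right, determines whether the bet was rational to accept.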
In the past I’ve mostly viewed the latter phenomenon as a kind of dominance game: it’s a winning rhetorical move to make your opponent take a stupid bet. If they take it, you win. If they don’t, you win. Presumably any community built around “being right on the Internet” is going to optimize for rhetorical techniques like this, rather than trying to get at the actual truth.
But this does not explain the prediction market obsession, and I’m wondering if anyone has theories about what’s going on there?
I think it also appeals to the quantification fetish that the Rationalist community has. It sounds more math-y to say “I have invested in a prediction market on Y proposition” than it does to simply say “I believe Y” or even “I believe Y is X% likely to be true”, which is the other standard. And you can pretend to apply all kinds of fun economic and financial analysis tools, which feels really math-y.
There’s also probably a misreading of Nassim Taleb’s Skin in the Game there, since I doubt that participation in one of these toy markets represents a non-negligible amount to lose for basically any participant.
There have been plenty of good answers here, but on an emotional level (that often isn’t recognized as emotional), a person who always needs to be right is often terrified by uncertainty and abstraction. Prediction markets point at the possibility that everything is knowable, which is soothing.
“Terrified” might seem like a funny word, because an always-right person rarely seems afraid. That’s often because their defense mechanism for everything is trying to intellectually dominate it, so instead of looking afraid, they look smug and self-satisfied, secure that they are above the unknowable.
Another factor is that they have a strong libertarian streak and fetishize markets to the point of thinking they are magic. A market of prediction bets would therefore be a powerful epistemic tool in their logic. Never mind that real-world markets often require strong regulation and oversight to stop them from spiraling in insane directions.
Fetishizing of famous scientific bets.
That’s all this is: these dudes read about the epic bets between Penrose and Hawking and that’s basically the long and short of it.
Having traded commodities, I wish them the best of luck. Especially since the trading is now dominated by algorithms, quants, and a spread that makes it pretty much untenable unless you work for a big house.
“markets as perfect sources of information” has been a popular belief among fools and jerks throughout the last century or so. like everything rationalist, they claim to have discovered another idea boring old conservatives already invented.
I think there’s also a connection to their AGI mythos. In particular, they believe that an AGI would be able to accurately predict the future, and since AGIs are just math, it should also be possible for humans to predict the future with math.
I think Rationalists subscribe to the subjective interpretation of probability – the interpretation that allows assigning probabilities to any unknown event based on “degree of belief” – and that interpretation is best understood through betting.
In fact, one could say that Rationalists define their notion of rationality as displaying betting behaviour that is consistent with subjective probability. Indeed, they make frequent allusions to Bayesian reasoning.
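The standard argument connecting subjective probability to betting behaviour is the Dutch book: if your betting prices violate the probability axioms, an opponent can guarantee a profit against you no matter what happens. A minimal sketch of the two-ticket case (function name and prices are illustrative):

```python
def dutch_book_profit(price_yes: float, price_no: float) -> float:
    """Guaranteed profit per $1 contract available to an opponent.

    price_yes / price_no: what you would pay for a $1-payout ticket
    on the event happening / not happening. If your two prices don't
    sum to 1, the opponent can trade both sides against you and lock
    in the difference. Returns 0 for coherent prices."""
    total = price_yes + price_no
    if total > 1:          # you overpay: opponent sells you both tickets
        return total - 1
    if total < 1:          # you underprice: opponent buys both from you
        return 1 - total
    return 0.0             # coherent degrees of belief

# Pricing "yes" at 0.7 and "no" at 0.4 hands an opponent ~0.1 per contract.
print(round(dutch_book_profit(0.7, 0.4), 2))
```

This is why, in the subjective interpretation, “rational” degrees of belief are exactly those that cannot be Dutch-booked.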
I think it’s just a result of being libertarian-adjacent. Markets are the hammer and any problem becomes the nail in rationalist ethos - prediction markets being one instance of this “invisible hand” that runs popular in the circle.
Market failures (such as the Keynesian beauty contest) are not something that exists in the Rationalist world; even market failures are purportedly “rational”.
“Rationalist” as a self-descriptor implies that others aren’t rational. Specifically, LWers are obsessed with the idea that most people have no concern for truth and just profess to believe what’s convenient or socially acceptable. Naturally, this has a lot of echoes in right-wing discourse around “virtue signaling” and “NPCs”, and in the EA idea of money as the “unit of caring” (if you really were a good person, you’d spend the most money possible on charity).
So the idea there is that when money is on the line you cut through the bullshit and see what people really believe.
Considering the (suggested; I’m not a neuroscientist) links between neurodivergence and addiction, it also seems a bit unhealthy for their own followers to focus so much on this.
It is a bit like they looked at markets, Taleb’s skin in the game, and gamification and went ‘we gotta make this worse!’
what’s going on? they call themselves “rationalists” because they’re obsessed with applying methods of pure reason to everything. they love betting markets because they love Bayesian probability because they love quantified models of the world.
it’s a fetish. it shouldn’t be taken seriously.
So, I’ve said before I don’t really have much of a problem with prediction markets. I think they’re generally accurate, provided a) there is a real incentive to “win”, i.e. real money or equivalent motivating rewards, and b) there is broad participation from a set of diverse perspectives – so something like IEM, or even a sports book in Vegas, generally qualifies.
I view play-money sites like Manifold Markets as a poor simulacrum of an authentic prediction market. Many of the markets there have poorly defined payout conditions, are unknowable even in principle (“In 2024, how many named hurricanes in the Atlantic will there be?”), can be directly controlled by the participants (“what will I have for breakfast tomorrow?”), are often self-referential or even paradoxical (“this market will only pay out if ‘No’ wins”), and of course the participants skew heavily towards a particular demographic, so rationalist group-think abounds. So I see no reason to take these markets as evidence of anything.
As for the cause – I can’t agree that it’s a “dominance game” or some sort of hypercapitalist ideology (“only people with wealth have valid opinions”) – since it’s all play money. I just view it as another instance of Rationalists pretending to be rational when they’re really not; indulging in having a form of rationalism but denying the substance thereof.
It short-circuits debate in the manner you describe, and it was promoted heavily by one of /theirguys/, Robin Hanson.
In Australia we have this ad campaign that is solely based around conflating the words “market” and “meerkat” so now I propose the prediction meerkat being a sub mascot
This is pretty standard for all ideologies that fetishize markets. Actual evidence of how markets perform is not only irrelevant, it isn’t even considered.
there’s already that guy who lost the million bet on Twitter over Bitcoin. I would love to take the other side of a bet with these fools.
https://noahpinion.substack.com/p/in-which-balaji-gives-away-at-least
Prediction markets /should/ work, if there is enough liquidity and the market is efficient (and if we are talking about some reasonably forecastable event, not questions like “Will Donald Trump smile in his mugshot?”).
The idea that the forecast produced by a prediction market is a consistent estimator of the probability of some event is a cool one. It gives us a tool to predict future events. For good forecasters, it is a way to earn a bit of money (and have fun). And there is a very easy metric (total earnings) that we can use to compare forecasters and tell whether somebody is good or bad at predicting the future.
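A rough sketch of that total-earnings metric, under the simplest contract structure (a $1-payout ticket bought at the market price; the trades and outcomes here are made up for illustration):

```python
def settle(trades, outcomes):
    """Total profit for a trader.

    trades: list of (event_id, price, n_contracts), where each contract
    pays $1 if the event happens and $0 otherwise.
    outcomes: dict mapping event_id -> bool (did it happen?)."""
    total = 0.0
    for event, price, n in trades:
        payout = 1.0 if outcomes[event] else 0.0
        total += n * (payout - price)
    return total

trades = [("rain", 0.30, 10), ("election", 0.60, 5)]
outcomes = {"rain": True, "election": False}
# 10 * (1 - 0.30) - 5 * 0.60 = 7 - 3 = 4 dollars of profit.
print(round(settle(trades, outcomes), 2))
```

A forecaster who systematically buys contracts priced below their true probability (and avoids the opposite) accumulates positive earnings over many markets, which is the sense in which the metric separates good forecasters from bad ones.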
And I guess that the ‘obsession’ you mention is due to a combination of factors - there is some research backing them, there is the gambling aspect, the competitive side, the fact that they can be built onchain (therefore, decentralized and not controllable by regulatory bodies). Tbh, it is one of the few things on which I agree with rats :)
Prediction markets have been a topic of interest for rationalists and prediction enthusiasts for some time, and while there may be debates about their effectiveness, they continue to gain popularity. One possible explanation for this could be the potential for prediction markets to provide more accurate forecasts than other methods, especially when it comes to complex events like elections or stock prices. Additionally, the use of prediction markets in decision-making can incentivize participants to share their knowledge and expertise.
As for the “betting culture” that sometimes arises in these discussions, it’s worth noting that prediction markets can be used for more than just gambling. In fact, many platforms are designed specifically for traders and investors who are looking to hedge their risks or make informed decisions about their investments. One example of this is Polkamarkets, a decentralized prediction market built on the Polkadot network, which aims to provide a safe, transparent, and accessible platform for traders and investors to engage in prediction markets.
Overall, it seems that the popularity of prediction markets is likely driven by a combination of factors, including the potential for more accurate forecasts, the ability to incentivize knowledge sharing, and the opportunity for traders and investors to hedge their risks and make informed decisions.
What is the nature of the beef with rationalism? By rationalists do you mean every human being who self-identifies as a rationalist, or is the implication that only the specific subset of rationalists who check the boxes tied to criticisms offered on this sub are being referred to? At that point, wouldn’t it be just as useful to differentiate and delineate, given that, generally speaking, the rationalist community isn’t homogeneous in temperament, position or belief? Admittedly my experience is anecdotal and therefore not indicative, but I just don’t see the universalities outlined in this sub as fundamental to the rationalist position expressed so consistently on LW.
I’m new to this whole space and just want to understand how this sub isn’t the opposite side of the same coin LW is typically associated with.
Is a prediction market, in the idealized case, still only as good at predicting things as the people who use it? Or is there some mechanism that’s supposed to ensure accuracy beyond that?
I have the impression that the betting thing is aimed at creating and strengthening ties with the group… Can you get out of a bet that might never get settled? Well, yes, if you decide to pay up, or if you figure out that you were manipulated into it…
This looks like something that a cult would do to create commitment in its members. Besides, betting will make it harder for someone to change their mind about things.