r/SneerClub archives
Scooter speculates about how much he has in common with the 9/11 hijackers. It could be a lot. (http://slatestarcodex.com/2018/07/18/fundamental-value-differences-are-not-that-fundamental/)

Quite a landmark article in the quest to show that not only was Hitler not evil, but he didn’t even really have any fundamental value differences with us.

We are all Hitler on this cursed day.
The real Hitler was the friends we made along the way.

I mean, Scooter could google Osama’s declaration for some perspective, it’s still there on the internets (translated to English by jihadists, so supposedly accurate as to the connotations) but primary sources are apparently still haram.

See the reason this doesn't count as evidence is because it's not a scientific paper. As we all know from experimenting on mutated fruit flies, evolution designed humans to be driven purely by the search for reproductive fitness. Since _everyone_ cares about the quest for high-quality offspring, there is no such thing as a fundamental value disagreement. OBL killed all these Americans because women value a strong, dominant male with high T levels. Just look at that manly beard! His turban surely conceals the sexiest of skull shapes.
Quality sneer here.
[deleted]
> I thought it was "good sneer." Are these different? Are you just mixing things up? That's not a good sneer.
You're free to compliment sneers however you feel appropriate. That was mixing things up AFAICT. I like mixing things up more than "good sneer" becoming a meme.
Capital sneer, old sport!
pip pip sneerio!
this is now canon
[You joke, but...](http://psycnet.apa.org/fulltext/2013-43492-001.html)
god fuck that guy
god fuck that guy
So he's an anti-colonialist socialist? I was really expecting something more shocking. I could be following someone with his politics on Twitter.

So what was going through the 9/11 hijackers’ minds? How many value differences did they have from us?

It seems totally possible that the hijackers had no value differences from me at all. If I believed in the literal truth of Wahhabi Islam – a factual belief – I might be pretty worried about the sinful atheist West.

Does he not know what a religion is? It’s literally a set of prescribed values.

Also he does a damn sloppy job of defining ‘fundamental value.’ He thinks that because a 9/11 hijacker would probably save an American kid from being run over by a speeding car, that the US and the Middle East killing each other’s people is just a big ole mistake theory misunderstanding. Woooowwww. Dude, if people are inconsistent with their values and their actions, something he concedes, how can it be meaningful to say that people really agree on anything? True agreement on ‘fundamental values’ presumes a sort of stability in preferences that doesn’t really exist.

> Also he does a damn sloppy job

Impossible
[deleted]
I don't think “sneerclub's position” on something exists, or should exist. (I neither identify as a member of Sneerclub nor the SSC Community fwiw)
The confusion isn't helped by the fact that Scott A annoyingly never actually summarises what he's trying to say. (If you want to take a stab at the summary he left out, feel free.) It looks to me like he is trying to significantly downplay the role of personal values in people's actions and decisions, to the point of ridiculousness in the case of the 9/11 hijackers. These values may not come from an individual book, and people's values may change over time, but it still doesn't change the fact that conservatives and socialists value different things, and simply trying to hash out more information won't end the conflict between them.
Use a phrase other than "path-dependent" and I might have the faintest idea what you mean
Hahah, I'm glad I'm not the only one who saw that and was really confused.
Arbitrary fundamental values are how math basically works. You start with a set of fairly arbitrary axioms, and then you derive all sorts of things from there. That's why Euclidean and non-Euclidean geometry are different. In Euclidean geometry, the parallel postulate holds: through a point not on a given line, there is exactly one line parallel to it. In non-Euclidean geometry, that's not true. Human ethical systems are fairly similar. I don't question the parallel postulate, and I don't question the value of a human life either.

> But most of sneerclub seems to be missing that intuition - you guys think that values are axiomatically beyond argument and different between cultures.

Most of sneerclub probably has the intuition of not drawing on traditions for fundamental values, too, but sneerclub, unlike Scott, recognizes that most people don't have that intuition. Speaking only for myself here, I've come to find other people much less frustrating if I simply take them as they are, without quizzing them too much on why they _really_ believe X or Y.

> you guys think that values are axiomatically beyond argument and different between cultures.

What does "axiomatically beyond argument" mean here? That you think sneerclub would never condemn a different culture as being evil? Once you realize that values can be contingent, and that different cultures are exposed to different influences throughout history, why would you expect them to be the same across cultures?

I'm not speaking for anyone else here, but when I say values are axiomatic, what I generally mean is that they are irreconcilable. I will never be able to find common ground with a member of al-Qaeda. Values are not developed through reasoned argumentation. They're inculcated through peer pressure, and by example, and in response to the particular environment a society finds itself in. They're associated with intense emotions.
Are you trying to say that because values are developed through a contingent process, you can show people that they can choose a better way of picking values, perhaps through a series of reasonable, civil arguments with them? If that's the case, I disagree very strongly, because it's not how values develop. In fact, I would say your desire for your values to not be arbitrary is also based on emotion. It doesn't _feel_ right to you that your values be based on a book, when that book could have been different. Other people don't feel that same emotion and don't care. The whole premise of MIRI is that big Yud's gonna preprogram some AI with a set of fundamental values. If the AI wakes up and never questions its values, are you really going to try to argue it into having a different set?

Ok, I think there's another thing you're confused about re: 'beyond argument.' If you're going to have an argument with someone, you have to have a common basis for disagreement first. In the case of arguments that involve values, if two people want fundamentally different things, one of them can't convince the other that they should want something different. That's the sense in which values are beyond argument. Between a slave and a slaveowner, there is no common ground possible. They want completely different things and there can be no compromise.

Think of it this way: a theorem can hold in Euclidean geometry and fail in non-Euclidean geometry. I think Scooter thinks that everyone's working from Euclidean axioms, and maybe some of the theorems his opponents believe in are wrong. If only he could use civil debate, the other side would see that their axioms are the same as his, and they would fix their theorems to be right. I don't think that's true. Some people are Euclidean, some people are non-Euclidean. This doesn't mean armed conflict is inevitable, but a happy debate is not likely.
> If I know for a fact that some of my values came from a book, and know that I would've been just as receptive to some other book if I'd read it first, that makes me doubt (at least a little) that these values are really what I ought to be maximizing.

Your ability to question these books is also 'path-dependent.' If you'd been raised in some tiny village with little contact with other people, and you were illiterate, you would be quite unlikely to be a budding atheist. It's great to be a freethinker, but everyone's got their limits. I'm a SJW in this life, but if I were a white person in the antebellum South, maybe I'd be something of a nonconformist, but would I accept that black Americans were as good as white people? Probably not, and judging by all the HBD talk on /r/ssc, most of them wouldn't do any better.
[deleted]
You might have missed this since I put it in an edit, but your desire to feel that your values are not arbitrary is in and of itself an arbitrary emotion. The proof of its arbitrariness is the masses of people who are aware of other religions and feel no discomfort from the fact. Sorry, dude, but you have arbitrary emotional elements in your character too.

> It seems plausible that anyone asking themselves "what do I really want?" might come to the same idea, no matter where they started.

Dude, no. Have you ever seen how much disagreement there is amongst professional philosophers? And while your average religious adherent is not particularly thoughtful, the people who originate these religions generally are, and there's plenty of disagreement amongst them as well.

edit: just for kicks, a survey of professional philosophers and what they believe: https://io9.gizmodo.com/what-percentage-of-philosophers-believe-in-god-485784336

> 23. Politics: egalitarianism 34.8%; communitarianism 14.3%; libertarianism 9.9%; other 41.0%.

'Other' is freaking huge.

> 14. Meta-ethics: moral realism 56.4%; moral anti-realism 27.7%; other 15.9%.

Also a big split, though the moral realists have a significant edge. And that majority is only about whether moral facts exist; there is certainly more dissent about what those precise moral facts are, and whether they are even knowable. Yes, there's something of a consensus on atheism and the trolley problem, but you can't build a sufficiently robust value system for being a human being off of those and the few other things philosophers broadly agree on. This idea that all reasonable people would agree on something as contentious as ethical values if only they reasoned enough is just really ill-founded. It's not empirical at all.
[deleted]
What does 'unwinding path-dependence' mean if you agree with me that reasoned deliberation and conscious attempts to avoid bias don't lead to any convergence? I suppose a puppy pile of infants has no fundamental value differences, and turning everyone in the world into an infant would put everyone back on an equal footing, but _that's_ not happening.
[deleted]
> politics is something you read about and not something that affects people's day to day lives

You may want to examine how your life experience has manipulated your values to include this
[deleted]
How does this principle apply to people who are affected by politics more directly? Like, if you're an asylum seeker with children coming into the US, how does "stop reading so much" resolve the problem you're facing? More generally, if I care about *the actual state of the world*, rather than my subjective beliefs about it, and if reading about these atrocities gives me the impetus for action that may eventually end such things, then it seems like ceasing to read about it is counterproductive to my goal of effecting political change.
That's really screwed up. If you really want to unwind a political view, you should be reading as much political stuff as possible, from all sides, and reading as much history and sociology and anthropology as possible. The more you know, the less susceptible you are to clickbait. Your strategy just makes you more susceptible to other people's bullshit.
[deleted]
[deleted]
The idea is that rationalists seem to think that, unlike all the poor deluded fools who just don't realize it, their own beliefs are somehow not 'path-dependent' (translation: dependent on the path they took in life, i.e. based on the experiences they've had, the society they were raised in, the media they consumed, etc.) but are objectively true. It's a bias you tend to find in STEM people: they think about how an individual can affect a society (also: cult of genius) but ignore what influence society and culture have on individuals, especially themselves. Rationalists are just a concentrated version of this. Secondly, while your (nominal) preference is to downplay 'path-dependent' preferences, doing so is not just unfeasible but either meaningless or vacuous. What is the unchanging core of your personality that's not contingent on your circumstances? Your DNA? (Let's assume for now it's meaningful to draw a line at DNA and say this is 'nature', the rest is 'nurture'.) Unless you fetishize (i.e. misunderstand) evolution as some sort of grand designer where you have your designated role to play, that seems like a hard bullet to bite.
That said, I am sympathetic to the idea that I don't want to subscribe to beliefs that require special pleading for myself. (what's the chance that I got it right and everyone else didn't) But that's a matter of preference?
[deleted]
Why do you have that intuition you mentioned in the earlier comment? Suppose you realized that in some ways the circumstances of your adolescence, some interests/affinities you had, and happenstance (encountering the specific online communities you happened to, say) resulted in you holding such a view. Does that knowledge of where it came from diminish the force with which you hold it? Is happening to be good at math/logic over writing poetry (say) less arbitrary than happening to read one book before another? What is it you're trying to optimize? At best, internal consistency, but why is that an inherent good rather than a means to an end?

Edit:

> I'm more interested in another question: which values should I keep, if my "should" is maximally informed by philosophical progress?

So the confusing thing is that here you seem to be asking "which values should I keep", but later on in the thread you switch to "how far should I roll back thought to an earlier level of development to have common ground". The first question seems reasonable but rather variable from person to person; the second is kinda silly. Why would you want such a thing?
[deleted]
> That's how people usually get radicalized, by reading lots of stuff that says X is important, not by life experience. That's a pretty big statement to just throw out there without backing it up in any way.
[deleted]
Correct deductions, not at all. But formed by my experience at least as much as by any reading I've done? Absolutely. I see it in myself with a bunch of views I hold that I know I can't think rationally about (e.g. that Ireland should be one nation, or that people shouldn't be disadvantaged by the accident of where they were born). These were formed before I'd done any reading into the matters, and only afterwards did I approach rationally whether or not they are correct.
[deleted]
Thanks for the apology, genuinely appreciated. That is a phenomenon for sure, but in the cases I've seen it tends to be positions that reinforce already-held beliefs about what people deserve in life that they get drawn towards, with radicalization being as much a social phenomenon as anything else: going with a crowd that seems to validate their experiences. The radicalization I was thinking of, though, was more of the type where people experience an oppressive force in their lives and are radicalised to fervently oppose or serve it. A bit different to the radicalization of middle class white folks.
What exactly does 'radicalized' mean here? Does it mean, 'I shitpost a lot in /r/latestagecapitalism,' or does it mean, 'I have decided that property is theft and I will now squat in vacant houses and dumpster dive to survive'? Does it mean, 'I think cops are bad and I will now be Extremely Online and Angry about it,' or 'I have risen to the top of my local Black Lives Matter chapter and staged a protest that shut down a local highway for several hours'? Cause the former is just being opinionated. The kind of radicalization that gets people out in the streets is generally not driven by reading things online, anyway: https://www.psychologytoday.com/ca/articles/200307/what-makes-activist

> Parental modeling can play a significant role in shaping future activists, according to Lauren Duncan, Ph.D., an assistant professor of psychology at Smith College who has studied activism. She found that students with a parent who fought in Vietnam were much more likely to protest against the 1991 Gulf War than those whose parents were not war veterans.

> Individuals are more likely to feel a personal connection if they see themselves as part of the community affected by an issue, says Debra Mashek, Ph.D., a research fellow at George Mason University, who specializes in "moral" emotions. Millions of women embraced this sense of collective identity during the women's rights movement, for example.
I think you're absolutely right except that rationalism is a great example of such a thing. What good is it to be 'rational'? Our thoughts weren't made to be internally consistent, they're not in a formal system, we're not an analogue of buggy code.
I think you're conflating the idea of an individual having fundamental values with the idea of "my fundamental values are the correct ones."
Where are you going to get them from? The average rationalist gets a lot of them from dead guys in wigs
lolll judging by his other responses, he's going to ignore empirical sources like the news, because the feelings provoked by news are _definitely_ the same as the high from using drugs, and use Pure Reason

I just gotta note, people who make facile analogies between the news and taking drugs have clearly never taken drugs. Strangely, reading the news makes me want to get more informed about the historical contingencies that have led us to our present moment, while MDMA makes me want to dance all night to the kind of bad techno I wouldn't be able to stand while sober. SO SIMILAR
Or history, because rationalism is 19th c. social evolutionism and Whig history with AI doomsday bolted on. Also, who thinks reading the news is like doing drugs??
I dunno, reading a lot of the news lately makes me want to get drunk and do drugs.

[deleted]

[deleted]
Perennial John Gray quote: >By maintaining that the crimes of history are the result of error, Enlightenment philosophers create a problem of evil as insoluble as any that confronts Christian theologians. Why are humans so fond of error? Why has growing knowledge been used to establish new kinds of tyranny and wage ever more destructive wars?

So have I got this right? Scott tries to argue that there aren’t really fundamental value differences between people, it’s probably just people with different ideas of the facts. Then in the last section he realises how fucking stupid this idea is and admits that there are differences, but we shouldn’t care too much about them because people are inconsistent.

I’m always impressed at how much of a waste of words the average Scottpost is.

If he'd actually read Osama bin Laden's letter, this could have been an interesting article, but it would have required that Scott learn something new and analyze it.
All these rambling posts need a 1 paragraph tl;dr explaining what his actual point is. And then he can toss the rest of the article without losing anything of value.
It's more that he's describing people's values as adventitious, malleable, and contingent on experience and circumstances, and concluding on that basis that you shouldn't write anyone off on the basis of repugnant views. I try to live that way myself, but I recognize the luxury it implies.
While it's fair to say that values are malleable and contingent and inconsistent, he really needs to do a lot more work to explain why, if there is no 'fundamental value difference' between him and OBL, whatever the hell that means, OBL killed three thousand Americans and we killed him right back. People float the 'fundamental value difference' as an explanation for why this bloody conflict happened. That's one reason this spate of handwaviness doesn't _go_ anywhere. He doesn't address the reason questions of fundamental differences come up, and that is because people kill each other over different values.

Now, he could have easily salvaged his post by actually reading OBL's letter on why he staged the 9/11 attacks (self-defense against Western imperialism, huh, honestly, who wouldn't take up arms against an invading conqueror?), and that actually would have supported his post nicely, but he doesn't even bother to do that. The basic point would have still been kind of insipid (ooh look, we all share the fundamental value of self-defense), but it wouldn't be half as insufferable.

Or take this bit:

> I don’t think anyone switched because of anything they learned in a philosophy class. They switched because it became mildly convenient to switch, and they had a bunch of pro-immigrant instincts and anti-immigrant instincts the whole time, so it was easy to switch which words came out of their mouths as soon as it became convenient to do so.

How lazy is this? I just googled "anti immigration sentiment" and found a bunch of links, like this one: http://www.chicagotribune.com/news/opinion/commentary/ct-immigrants-jobs-hispanics-muslims-20160929-story.html

Turns out there are empirical factors that activate anti-immigration sentiments in people who happen to be racist. Huh! The phrase "mildly convenient" is doing a _lot_ of work in a supremely half-assed way. Scooter could have disaggregated the data by state. He could have tracked the rise and fall of anti-Hispanic resentment and immigration sentiment in, say, California, which has made an enormous attitudinal change over the past two decades. NOPE, that's too much like work, and sounding smart is way more fun.
> that you shouldn't write anyone off on the basis of repugnant views. This is only something true if you want to get utility value out of say, Nazi rocket scientists. If you want anything resembling a moral society, you gotta draw a line and say "No Nazis".
[There's more to it than economic utility](https://www.dhammatalks.org/books/uncollected/Justice.html).
Okay, you dropped a pile of vague Buddhist bullshit, care to actually explain yourself in a concise manner conducive to a fucking conversation?
That essay suggests that the best way to contribute to social harmony is through generosity, virtue and universal goodwill, and does not come to that conclusion through anything remotely related to "utility value of Nazi rocket scientists." So I don't think your first sentence in the GGP is correct.
You're spouting a bunch of vague nonsense, why does that mean we should bother with Nazi scientists instead of trying them for war crimes?
I'm curious how you get that reading from my comments. It's a misunderstanding.
Then speak clearly and concisely about your positions, rather than hiding behind vagaries and Buddhism.
What they appear to mean, but are being unhelpful in terms of expressing it, is that the virtuous life consists in more than utility functions. No, I don't really know why that point was worth making in your direction either.

It’s always reductionism, isn’t it? Like, c’mon.

I also feel obliged to note you can still think that human lives are of equal moral weight while still choosing helping your wife over strangers. I leave figuring out why or how as an exercise to the reader.

[deleted]
One plausible approach is this: You're better positioned to help those close to you than those more distant. Likewise, the optimal distribution of care (insofar as it exists) might naturally emerge as "spheres of concern." In an optimal situation, I take best care of those close to me, while giving less, but not zero, support to others who take care of those close to them. We all participate in a "network of caring."
> One plausible approach is this: You're better positioned to help those close to you than those more distant. Likewise, the optimal distribution of care (insofar as it exists) might naturally emerge as "spheres of concern." In an optimal situation, I take best care of those close to me, while giving less, but not zero, support to others who take care of those close to them. We all participate in a "network of caring." The above is why Effective Altruism makes no sense to me.
They square if you consider that resources are vastly unequally distributed.
That is solved with politics, not charity.
Politics is when you vote to give your whole welfare state budget to third world tinpot dictators in election aid, so your money on domestically donated malaria nets is more efficiently spent in lives per dollar.
Finally some sanity!
It's even better if the malaria nets are manufactured in a sweatshop in the dictator's country, because then ~~you're stimulating their economic development~~ they just can get paid in malaria nets.
There's quite a few. [From Stanford's philosophy wiki thing: on impartial reason.](https://plato.stanford.edu/entries/altruism/#ImpaReas) You don't really run into Scott's logic unless you take consequentialist utilitarianism as an axiom, which is honestly kind of ridiculous to do in this sort of discussion, but whatever, #rationalism I guess. (Personally I'm not really sure how one can coherently hold a position of moral realism without any notion of impartiality. It seems confused to me. But I'm also not a moral realist, so my dog in this fight is small)
By willful ignorance of the contradiction?

for a group of people that spends a whole lot of time thinking about how to align the values of a theoretical ai to their own (and pissing themselves about how a super-ai whose views don’t align with theirs will inevitably turn them into utilons), these guys sure are bad at reasoning about groups of people who might also have different values

FYI - Scott has posted a follow-up, which consists of several thousand more words saying the exact same thing (except now with pretentious Classical-style dialogues!).

TL;DR of both articles: “I’ve fallen hard for the Typical Mind Fallacy, despite writing many, many, many articles about the Typical Mind Fallacy in the past.”

Funnily enough, Yudkowsky did the same thing when he fell for the Halo Effect after writing multiple articles about it. Maybe it’s some sort of Rationalist tradition.

> If you’re right, I worry you’re going up against the euphemism treadmill. If we invent another word to communicate the true fact, like “work-rarely-doer”, then anyone who believes that people who play video games instead of working deserve to suffer will quickly conclude that work-rarely-doers deserve to suffer.

I could agree 100% with Scott, but I could never be a fan, because he thinks constructions like "work-rarely-doers" are okay.
To my mind, it wouldn't even be that bad a construction if he just threw it out there without signposting it so clumsily with "if we invent another word to communicate the true fact" and just let it hang there. He has absolutely no respect for the reader and feels the need to presage everything with "smart person incoming" big fucking flags.
His method of discussing whether people have fundamental value differences is an adversarial dialogue... between *two people with identical value systems*. Gee, I wonder which side of discussion he supports!
honestly I'm not sure how anyone can convince themselves that the broader project of rationality works at all when their luminaries keep doing this type of thing (and for obviously ideological reasons at that)

People don’t make decisions based on values. They make decisions based on weird psychosexual impulses that they later rationalise. Conservatives and liberals just have different sets of weird psychosexual impulses.

I don’t get it. Suppose Scott’s right and all principled values are just ad-hoc rationalisations of behaviour. How does that change whether or not you can expect good results from debating people with different, entrenched, principled values? The debate goes exactly the same way regardless of whether their claimed values are claimed because they genuinely hold those values, or because they just made them up to try to justify their behaviour, or because they’re just saying so because that’s the position they drew out of the debate-club hat.

This is somewhat of an improvement over the usual vulgar idealism in that Alexander acknowledges the frequent inconsistency of beliefs rather than people just robotically acting out whatever propositions they hold in their heads. The rest of it is pretty bad though.

Scott doesn’t consider that people’s factual beliefs are also based on their values. Someone might think immigrants are more criminal due to xenophobia, not the other way around.

Racism creating concerns about the economy is another example.

Oh hey I know a fundamental value they’ve got in common! OBL had multiple wives and Scott’s polyamorous! Wowwww!

I think Scott is pretty reasonable here. We’re all operating off of pretty similar mental architecture, and oftentimes something that looks like a value difference can actually end up as differences in which impulses you reinforce and are willing to explicitly cooperate with others on, or in differences in lots of very small assumptions that are hard to articulate.

This doesn’t mean that you need to always compromise and assume against the evidence that other people are acting in good faith; and what applies to individuals doesn’t necessarily apply to institutions and political movements. (The hard-to-articulate-ness of these lumps of judgments also makes them sticky.) Whether someone is a nativist due to an axiology disagreement or a factual disagreement, I want their stupid ideas defeated either way. But I also feel like Scott isn’t totally unreasonable in emphasizing much of what he does (not that Ozy is either), and OP’s title here seems either dishonest or incorrect in spirit - I think it’s basically a non-useful defensive reaction to say “oh, I have nothing in common with those {bad people},” even if it would be even more daft to go all “well, by Aumann agreement, we should cut a deal with the hijackers and crash exactly ONE plane.”

Well, there is the seething contempt for women, for starters…

“Values” isn’t a concept that lets you make predictions about what someone will do, relative to just observing someone’s behavior and assuming they’ll act similarly later.

Of course observing people's behaviour over the long term will be a better predictor, purely because it involves knowing way more information. It's like predicting how a senator would vote on a bill: the best predictor will be looking at all their previous votes to date on similar bills, but party affiliation will still be a pretty good guide. You can be pretty sure that a member of an anarchist party wouldn't give a crap about flag-burning, for example.
I'm referring to values as in deontologist, "I always act honorably", etc, not as in party/social affiliation. \[this is how the term is used in these parts, LW etc\]. An example of observing behavior would be to say, this person acts selfishly, or gets tired in these situations, so they'll do so again even if a deontologist ostensibly wouldn't.
[deleted]
Sorry for being confusing, lots of my beliefs on this come from life experience and not from theories or a priori reasoning; I got to my current feelings on this by doing a ton of mindfulness meditation, and replicating that is probably the most thorough way to understand my position. I know that's not socially reasonable to ask of someone, so the quick explanation: deontology, utilitarianism, etc. are aesthetics in exactly the sense that being e.g. goth is an aesthetic. Some people get really into being goth, but it's something that can come and go with time, and can be dropped if it becomes burdensome, even though you get the subjective feeling that being goth is the best thing ever and fills you with purpose and determination etc. OTOH being goth works out for some people long term; it's just that that is far from an inevitable outcome for them.
If your point is that people don't base their decisions on rigorously following particularly chosen moral philosophies, then obviously that is true. But the post Scott was [responding to](https://thingofthings.wordpress.com/2018/06/25/conservatives-as-moral-mutants/) was talking about how conservatives and liberals on average hold different value systems, which seems to me to be obviously true. Furthermore the differences in value system between the two groups will give a decent prediction of their stances on certain issues (a conservative is way more likely to be riled up by flag burning than a liberal, for example).
You'll sometimes see questions like "what are your True Values" in the rationalsphere, and I'm saying that "True Values" is a meaningless phrase; it doesn't reference an actual thing (but it makes you feel cool to say it anyway). Deontology, utilitarianism, etc. are aesthetics in exactly the sense that being e.g. goth is an aesthetic. You're using values to mean something like positions on issues in party politics and similar; I'm used to seeing the term used for some deep immutable thing that doesn't change over time.
I mean, I certainly agree with that, but that's not how Scott A is using the term. In the last paragraph:

> But “remember, liberals and conservatives have fundamental value differences, so they are two tribes that can’t coexist” is the wrong message.

He is talking about liberals and conservatives. And he is wrong. Liberals and conservatives really do have different values, which makes conflict inevitable (which is not to say they can't coexist, just that a society can never be ideal for both of them)
>really do have different values sounds like we agree on everything :)