r/SneerClub archives
"Patiently awaiting Eliezer Yudkowsky's admonitions" (https://i.redd.it/7olwfenv9d731.png)

[deleted]

One is about accumulation of power to decide resource allocations, the other is about curly wires.
[deleted]
There was a recent thread on r/ssc that talked about just this, and it got downvoted, which is really weird. Somebody even explained why AI risk is (according to them) a bigger threat, which just made it weirder.
[The science fiction author Charles Stross drew this comparison](http://www.antipope.org/charlie/blog-static/2018/01/dudeyoubrokethefuture.html); he called a corporation a "slow AI".
404s for me :/
[Working link](https://www.antipope.org/charlie/blog-static/2018/01/dude-you-broke-the-future.html)
Scott *does* talk negatively about capitalism in this way in his most famous post (Moloch), and never again. The difference is that this sort of optimization isn't really superintelligent in the way the paperclip maximizer would be, and is therefore, while shitty, not an existential threat (except when it is, because climate change).
Capital is (mostly) accumulated by providing goods or services that are more valuable than their inputs. The issue with the paperclip maximizer is that it produces outputs less valuable than its inputs. The "maximization" part isn't nearly as important as the direction the maximization points in.
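To make the direction point concrete, here's a toy sketch (every number and name here is made up for illustration): both loops "maximize" equally hard over the same inputs, but one destroys value and one accumulates it.

```python
# Toy model (all numbers invented): two "maximizers" consume the same
# inputs but optimize in different directions.

INPUT_COST = 10.0  # value of raw materials consumed per production run


def paperclip_maximizer(runs: int) -> float:
    """Maximizes unit count; each run yields output worth less than its inputs."""
    OUTPUT_VALUE = 2.0  # a pile of paperclips nobody wants
    return runs * (OUTPUT_VALUE - INPUT_COST)  # net value created (negative)


def profit_maximizer(runs: int) -> float:
    """Maximizes value added; only keeps running while outputs exceed inputs."""
    OUTPUT_VALUE = 15.0  # goods someone will actually pay for
    margin = OUTPUT_VALUE - INPUT_COST
    return runs * margin if margin > 0 else 0.0  # stops when unprofitable


if __name__ == "__main__":
    # Same "maximizing" loop, opposite directions: one destroys value
    # as hard as it can, the other only accumulates it.
    print(paperclip_maximizer(1_000_000))  # -8000000.0
    print(profit_maximizer(1_000_000))     # 5000000.0
```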

I kinda feel gauche linking SMBCs back-to-back like this, but, well, with an alt-text like that…

(src)