posted on December 06, 2021 11:03 AM by u/Sag0Sag0 · 44 points
u/typell · 31 points
I don’t know anything about AI but this is just inherently very funny
to me
Furthermore, compared to many other things AIs have learned to do, if
you consider the task of running a responsive text dungeon, it seems
relatively possible to ask a (relatively unusually) introspective human
author to write down their thoughts about how and why they would
generate the next prompt from the user’s input.
well, we don’t have any relevant technical skills, but we
are (relatively unusually) introspective. for this reason i
propose doing ai research via dungeons and dragons
The reason for our focus on this particular project of visible
thoughts isn’t because we believe it to be better or more fruitful than
Circuits-style transparency (we have said for years that Circuits-style
research deserves all possible dollars that can be productively spent on
it), but just because it’s a different approach where it might also be
possible to push progress forward.
this other project is much better, but don’t worry, we have no actual
way of helping with that one, so let’s do this instead
Note that proponents of alignment strategies that involve human-esque
thoughts (such as those linked above) do not necessarily endorse this
particular experiment as testing any of their key uncertainties or
confusions.
ok i know I just implied our project was similar to that other one
but actually there’s no reason to believe this will lead to meaningful
results in that area either
In this case, we tried to execute this project in a closed way in
mid-2021, but work was not proceeding fast enough.
imagining eliezer spending months writing reams and reams of terrible
text adventures and eventually giving up because he got bored
I (Nate) don’t know of any plan for achieving a stellar future that I
believe has much hope worth speaking of. I consider this one of our key
bottlenecks.
this sure is a way to begin a new paragraph
Offering prizes for small projects such as these doesn’t address that
bottleneck directly, and I don’t want to imply that any such projects
are going to be world-saving in their own right.
don’t worry guys, he didn’t want to imply that
we need people who can not only lead those projects themselves, but
who can understand the hope-containing heart of the idea with relatively
little Eliezer-interaction, and develop a vision around it that retains
the shred of hope and doesn’t require constant interaction and
course-correction on our part.
god forbid anyone working for you diverges from Eliezer’s ideas even
slightly
That said, whether or not we decide to pay for a run is entirely and
unilaterally up to Eliezer Yudkowsky or his delegates
The great part is that if anyone does try and write all this
crap, MIRI can just turn around and say “oh this doesn’t meet our
standards, so sorry that you wrote 1 million words worth of shit to try
and claim .2 mil.”
They’re so incompetent that they can’t even attempt to hire people to
do this, or to do a request for proposals… just a handwavey “do the
thing and send to us, and maybe we’ll pay you”. Sure, I’m pretty willing
to do 20K worth of work without any idea if I’ll get paid or not.
(Not to mention how stupid the whole thing sounds)
*beep boop*
> I have a skeleton!
[[Beep booping stops]](https://tenor.com/view/terminator-nope-t1000-finger-wag-no-gif-17843471)
E: [alternative clip](https://www.youtube.com/watch?v=xtfw6iP4Sw0)
That was essentially the first thing that came to my mind as well. With an initial budget of $200k they could easily message some well-known DnD content creators, ask if they can use their runs for their research, and hire someone to transcribe what happens in the format they described. They could produce a basic dataset to their liking in a few months with only a fraction of the resources they are offering here.
Heck, for the more popular DnD podcasts (Critical Role, TAZ), there are already fan transcripts available for a ton of episodes, including the episodes where the creators explain a bunch of the decisions they made in constructing and running their worlds. But I suppose if you already have datasets to work with it's a lot harder to explain why you haven't done anything useful with them...
> But I suppose if you already have datasets to work with it's a lot harder to explain why you haven't done anything useful with them...
Even funnier considering that, as pointed out in the comments under the LessWrong post, the samples Eliezer wants are actually too long for GPT-3 to handle. They're literally willing to spend over a million dollars on movie-script-length DnD runs that aren't even usable for an actual experimental run. Imagine if that money actually went to legitimate scientific endeavours instead of Eliezer's barely disguised excuse to spend research money on his fanfic hobby.
Yeah like it's a good idea to plan for the future, getting data that you might not be able to use now but which is on the horizon for what you want to do, and doing so can be expensive, but this is just a little silly.
definitely not a cult btw
Aren’t they worried the AI will convince the dungeon master to let them out?
Jesus fucking Christ
Effectivest altruism to donate to a non-profit for playing Zork.
And this is why the AGI hates skeletons.
Can’t they just like
Copy the captions from Critical Role or something
Holy Shit why do they have so much money for this nonsense …
Head empty