Now that the software is running with (at least for me) a low level of jank, it seems worth considering what we do with the years of accumulated sneer-strata over at the old place. Just speaking for myself, I think it would be nice if we had a static-site backup of the whole shindig. Unfortunately, since I’m a physicist by trade, anything I do with webstuff tends to involve starting from scratch with compass, straightedge and wget. There’s got to be a better method of archiving.
The other, not-mutually-exclusive option I can think of is to manually rerun “SneerClub classics”, the posts that one way or another helped define what sneering is all about.
N.B. Some of the test posts made today involved writing more on the serious-discussion side and have accordingly been marked NSFW.
RationalWiki runs on MediaWiki, which is kind of awful for discussion threads.
I will try to have more thoughts about this later (and do a bit more research into pre-existing scraping tools and such).
https://github.com/toonvandeputte/reddit_archive might be adaptable into something that archives a whole subreddit instead of a single user's posts. There may be a slight complication if we're dealing with more than 1,000 posts, since Reddit's listings only go back roughly that far. But since we've already got an archive with everything up to December, we really only need this year's posts; a rough sketch of that approach is below.
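For what it's worth, here's a minimal sketch of the "this year only" idea, assuming the public JSON listing at reddit.com/r/SneerClub/new.json is still reachable without authentication. The cutoff timestamp, User-Agent string, and output filename are all placeholders I made up, not anything taken from the linked repo.

```python
import json
import time

import requests

SUBREDDIT = "SneerClub"
CUTOFF = 1672531200  # placeholder: 2023-01-01 UTC, adjust to wherever the old archive ends
HEADERS = {"User-Agent": "sneerclub-archiver/0.1 (placeholder contact)"}


def fetch_new_posts(subreddit, cutoff):
    """Page through the subreddit's newest posts until we reach the cutoff.

    Reddit's listings only go back ~1000 posts, which is why this only
    tries to cover the gap since the existing archive, not everything.
    """
    posts, after = [], None
    while True:
        params = {"limit": 100}
        if after:
            params["after"] = after
        resp = requests.get(
            f"https://www.reddit.com/r/{subreddit}/new.json",
            headers=HEADERS,
            params=params,
            timeout=30,
        )
        resp.raise_for_status()
        children = resp.json()["data"]["children"]
        if not children:
            break
        for child in children:
            post = child["data"]
            if post["created_utc"] < cutoff:
                return posts  # everything older should already be archived
            posts.append(post)
        after = children[-1]["data"]["name"]  # fullname used for pagination
        time.sleep(2)  # stay polite / within rate limits
    return posts


if __name__ == "__main__":
    posts = fetch_new_posts(SUBREDDIT, CUTOFF)
    with open("sneerclub_2023.json", "w") as f:
        json.dump(posts, f, indent=2)
    print(f"saved {len(posts)} posts")
```

This only grabs post metadata and selftext, not comment trees, so it's a starting point rather than a full backup; and if Reddit has locked the unauthenticated JSON endpoints down, the same loop would need to go through an authenticated client like PRAW instead.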