Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

  • @YourNetworkIsHaunted
    72 months ago

    The active hostility from outside the tech world is going to make this one interesting. Unlike crypto, this one seems to have a lot of legitimate energy behind it in the industry, even as it becomes increasingly apparent that even if the technical capability were there (i.e. the bullshit problems could be solved by throwing enough compute and data at the existing paradigm, which looks increasingly unlikely), there’s no way to do it profitably given the massive costs of training and running these models.

    I wonder if we’re going to see any attempts to optimize existing models for the orgs that have already integrated them in the same way that caching a web page or indexing a database can increase performance without doing a whole rebuild. Nvidia won’t be happy to see the market for GPUs fall off, but OpenAI might have enough users of their existing models that they can keep operating even while dramatically cutting down on new training runs? Does that even make sense, or am I showing my ignorance here?
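    (For what it’s worth, the caching analogy does make sense at the serving layer: one version of the idea is memoizing repeated prompts so the model only runs once per distinct input, cutting serving cost without any new training. A minimal sketch, where `run_model` is a hypothetical stand-in for an expensive inference call, not any real API:)

    ```python
    from functools import lru_cache

    def run_model(prompt: str) -> str:
        # Stand-in for an expensive inference call (GPU time, per-token cost).
        # In a real deployment this would invoke the model itself.
        return f"response to: {prompt}"

    @lru_cache(maxsize=4096)
    def cached_run_model(prompt: str) -> str:
        # Identical prompts hit the cache instead of re-running inference,
        # analogous to caching a rendered web page instead of rebuilding it.
        return run_model(prompt)

    # Repeated queries reuse the first result instead of recomputing it:
    first = cached_run_model("what is 2+2?")
    second = cached_run_model("what is 2+2?")
    assert first is second  # second call was served from the cache
    ```

    (This only helps with repeated inputs, of course; it does nothing about the cost of training runs themselves.)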