• 0 Posts
  • 32 Comments
Joined 3 years ago
Cake day: July 1st, 2023

  • Actually, since the document is not awesomely formatted, and since mentioning only the thanking part undersells the contradiction:

    Epstein: is there a 501 c3 that i could give the 50k to/?
    Goertzel: Yes: the Singularity Institute for AI (redacted).
    Epstein: please send details i will send 50k monday.
    Goertzel: many many thanks! … You won’t regret it ;) The AI we build will thank you one day! I am driving now and will send details when I get home





  • Risk and limitations: the study is inherently risky. While it is highly likely that STIs that alter female sexual behavior exist in the wider mammalian order, whether or not they currently infect humans remains unclear. Challenges exist in successfully culturing newly identified STIs and adapting microbes to standardized lab models for testing. Finally, any new STIs will be relatively easy to test for efficacy in animals but costly and otherwise challenging to test in humans, and it is possible that success in animal models will not translate into human efficacy. Risks can be mitigated by simultaneously conducting animal and human studies, increasing the probability of identifying at least a single mammalian agent that modifies female sexual behavior.

    Fucking terrifying.


  • I’m sure many of them are somewhat accidental ladder climbers, but looking at some of the names in the list:

    • Herr: At MIT, just had yet another Ted talk in 2018 about his exoskeleton work
    • Shotwell: 2018 Business Insider named her the “Most powerful female engineer”, Musk lackey
    • Li: ex-Google, 2019 became Stanford co-director of their AI hole and “won” a bunch of prizes from various places
    • Zucman: Published book on tax havens 2019, loads of media work to sell the book, award from Le Monde, billed as “No. 1 enemy of billionaires” (paraphrased)
    • de Roode: previous Ted talks, 2018 Time magazine list for top 50 in healthcare (a sharp fall from his listing in 2014 of the 100 most influential people?)
    • Mac: bunch of lectures in 2019 after his 2017 Pulitzer, 2018 TV circuit including Colbert, a bunch more awards, 2019 Broadway play that had seven Tony nominations, etc.
    • Topol: 2019 book on AI and med, bunch of media for that to sell the book

    These were just the ones I looked at out of curiosity for what they were up to around 2018-2019. There’s clearly the TESCREAL theme across the list, but it strikes me that there’s also a lot of very active PR/promotion effort across the board. Award nominations don’t exactly spontaneously generate from impartial awe-struck onlookers, and media book reviews aren’t chosen based on literary merit.

    The inclusion of Mac especially is what made me think this might just be a slightly wider list of candidates a grotesque parasite would want to ingratiate themselves with by inviting them to give private lectures - he doesn’t strike me as a great fit otherwise.


  • That was my take as well. It’s basically anyone in academia/tech who had a PR machine working for them at the time, and a couple of weird extras.

    How Gromov only landed the underwhelming summary of “American” is interesting; I assume the copy-paste of the list was cut short and the next word was “mathematician”.

    If these people did all end up in the same location it’s probably safe to assume it was a private and unpublicized event. Some of them seem to have been in and around silicon valley at the time, so maybe one of the tech fake charity “foundation” events.


  • Who needs pure AI model collapse when you can have journalists give it a more human touch? I caught this snippet from the Australian ABC about the latest Epstein files drop

    screenshot of the ABC result in a Google search, listing the wrong Boris for the search term '23andme Boris nikolic'

    The Google AI summary does indeed highlight Boris Nikolić the fashion designer if you search for only that name. But I’m assuming this journalist was using ChatGPT, because the Google summary very prominently lists his death in 2008. And it’s surprisingly correct! A successful scraping of Wikipedia by Gemini, amazing.

    But the Epstein email was sent in 2016.

    Does the journalist perhaps not consider it more likely to be the Boris Nikolić who is the biotech VC, former advisor to Bill Gates, and named in Epstein’s will as the “successor executor”? That info is literally all in the third Google result, even in the woeful state of modern Google. Pushed past the fold by the AI feature about the wrong guy, but not exactly buried enough for a journalist to have any excuse.






  • Well, just copy-pasted rather than written. I would have hoped that infra read permission, infra write permission, and admin-interface permission were all separate to begin with, even if the person who spun up the instance obviously has all three.

    You do need a level of trust in an admin, of course, but wide open text boxes for putting in code are a questionable system design choice, in my opinion. It adds an extra point of possible entry that then relies on the security of the overall admin interface instead of limiting it to what should require highest level infra admin permissions to access. And if it is something that would be limited to someone who has those, then what is the actual utility of having a textarea for it in the first place?
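    To make the separation concrete: here is a minimal sketch of the kind of permission split described above. All names (`Perm`, `can_run_arbitrary_code`) are hypothetical, not taken from any real admin interface; the point is only that executing pasted code should gate on the infra-write grant rather than on mere admin-UI access.

    ```python
    from enum import Flag, auto

    class Perm(Flag):
        """Hypothetical permission flags: infra read, infra write, and
        admin-interface access are distinct grants."""
        NONE = 0
        INFRA_READ = auto()
        INFRA_WRITE = auto()
        ADMIN_UI = auto()

    def can_run_arbitrary_code(perms: Perm) -> bool:
        # Running code pasted into a textarea is effectively infra write
        # access, so it should require that grant specifically, not just
        # access to the admin interface.
        return Perm.INFRA_WRITE in perms

    operator = Perm.INFRA_READ | Perm.ADMIN_UI   # typical admin-UI user
    root_admin = operator | Perm.INFRA_WRITE     # whoever spun up the box

    assert not can_run_arbitrary_code(operator)
    assert can_run_arbitrary_code(root_admin)
    ```

    Under this split, the wide-open textarea adds nothing for anyone below root admin, which is the argument above: if only the highest grant can use it, why expose it in the UI at all?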



  • Excellent job on taking care of Lester. I can tell he’s in caring hands and I hope you both have many wonderful (and URTI-free, fingers crossed for that) years together.

    I’d say never feel silly about a vet visit. Even if why you booked it is no longer an issue (which is definitely something that can and does happen for any pet owner), you can always use the time to pick their brains, learn new things and build a good relationship with them.





  • Amazon’s latest round of 16k layoffs for AWS was called “Project Dawn” internally, and the public line is that the layoffs are because of increased AI use. AI has become useful, but as a way to conceal business failure. They’re not cutting jobs because their financials are in the shitter, oh no, it’s because they’re just too amazing at being efficient. So efficient they sent the corporate fake condolences email before informing the people they’re firing, referencing a blog post they hadn’t yet published.

    It’s Schrödinger’s Success. You can neither prove nor disprove the effect of AI on the decision, nor whether the layoffs indicate good management or fundamental mismanagement. And the media buys into it with headlines like “Amazon axes 16,000 jobs as it pushes AI and efficiency” that are distinctly noncommittal on how 16k people could possibly have been redundant in a tech company that’s supposed to be a beacon of automation.