• @sc_griffith · 16 · 1 year ago

    this is such a funny grift. hope ceos are torturing themselves over whether the random noise interpreters will like them. imagine an exec staring in the mirror repeating a line over and over to develop the right intonation to fool ai

    • @Send_me_nude_girls@feddit.de · 7 · 1 year ago

      Just train a model with your voice and never speak a real word on your own ever again. Call it voice purists. It’s going to happen.

      • @sc_griffith · 11 · 1 year ago

        I’m sorry, but that won’t help your earnings call. As soon as you give it a few microseconds of voice data, the model will simulate your life from first principles and find out your company is fucked. you think the ai is going to throw that information away? every exquisite subvocal pang of agony will be reproduced. there’s only one thing to do. the only way out is through. show up so blitzed out on coke you don’t even know you’re in an earnings call. you have to do it. it’s called charging the fucking machine gun nest man. our grandparents knew about this before they got all wrapped up in this tech shit. that’s what they taught you in world war two. they didn’t even know what a phone was back then. can you imagine? that’s fucking wild man. and now you have chatgpt and it’s smarter than half the people I know. that’s fucking wild. life! chatgpt. how do I buy a machine gun

        • @self (A) · 4 · 1 year ago

          ah, you’ve known some of the same type of idiot executives I have

    • @carlitoscohones · 6 · 1 year ago

      I’m imagining that last Tesla earnings call with Musk holding 2 soup cans in a flop sweat.

  • @bitofhope · 8 · 1 year ago

    LLMs are so notoriously terrible at telling truth from lies that “AI hallucination” is a household phrase at this point, for better or for worse. But surely they work even better when asked to rate the truthfulness of things that are not in their corpus to begin with.

  • @sue_me_please · 8 · edited · 1 year ago

    What about an AI that can tell if that cute candidate our startup hired will sleep with me or if she’ll just lie and say yes and then tell HR?

    And while we’re at it can we make an LLM that will force my kids to call me?

  • @self (A) · 7 · 1 year ago

    is there a pseudoscience that VCs and promptfans aren’t trying to turn into a startup? we’ve got medical woo everywhere, AI startups are essentially repackaging everything from race science to mediums into a bullshit product, and now we’ve got this superstitious crap. there’s a drinking game somewhere in all this where you pull a random RW page and take a shot if there isn’t a startup trying to monetize the article’s subject

    • David Gerard (OP, M, A) · 5 · 1 year ago

      > there’s a drinking game somewhere in all this where you pull a random RW page and take a shot if there isn’t a startup trying to monetize the article’s subject

      perfect

  • Leigh Garland · 6 · 1 year ago

    @dgerard I mean, god forbid VCs did even the most basic due diligence. No, we’ll use AI to tell if this good news is true or not!

  • @gerikson · 6 · 1 year ago

    The “solution” is simple - just LLM a virtual actor that can impersonate the CEO, read a script and answer questions with flawless confidence.

  • @swlabr · 6 · 1 year ago

    Random thought: earnings calls are like streams. Buying/selling stock is subbing/unsubbing. Asking questions is superchatting/donating with a message. AI sentiment analysis is crazed fans hyperanalysing the stream to confirm whatever conspiracy they have about the streamer.

    NB: i don’t partake in stream culture

    • David Gerard (OP, M, A) · 5 · 1 year ago

      an earnings call is very like a stream, yes

      • @swlabr · 5 · 1 year ago

        Just as cringe, for sure

  • @locallynonlinear · 6 · 1 year ago

    We need to filter people who exhibit voice stress, because no one likes a person with the humility of taking uncertainty seriously.

    • David Gerard (OP, M, A) · 4 · 1 year ago

      we need to filter anyone who uses earnings calls for anything other than comedy

  • @froztbyte · 5 · 1 year ago

    I’d register fuckedcompany.ai but I happened to discover some years back that .ai didn’t allow saying fuck in the domain name. goddamn tyranny

    but there’s some real revivalist potential for fuckedcompany in all this dogshit