• RedFox
    73 • 7 months ago

    Sorry, but I love the double-sided hypocrisy here.

    Here’s a chatbot instead of a person, listen to it since we won’t take your calls. But we don’t honor what it says!

    Thanks Canadian court for giving us a rare middle finger to the business.

    • 520
      33 • 7 months ago

      Not only that, they set a precedent that will hugely discourage the use of LLM chatbots too. Great for us humans though

    • RedFox
      16 • 7 months ago

      I bet they make so much money too…

      Overpaid lawyer 1: Fight this or settle?

      Overpaid lawyer 2: Let’s fight this, I have a good feeling about it…

      Overpaid lawyer 1: This won’t set a precedent or anything right…right…

    • @BearOfaTime@lemm.ee
      15 • 7 months ago

      Right?

      And think of the customer service benefit they would’ve gotten from just eating a few hundred dollars.

      But they were being extra greedy, and thinking they could establish precedent… Well they did, just not how they wanted.

  • @MeatsOfRage@lemmy.world
    27 • 7 months ago

    According to Air Canada, Moffatt never should have trusted the chatbot and the airline should not be liable for the chatbot’s misleading information because Air Canada essentially argued that “the chatbot is a separate legal entity that is responsible for its own actions,” a court order said.

    That’s some business class horse shit right there, glad they got taken to task over this

  • @jballs@sh.itjust.works
    25 • 7 months ago

    Good on the guy for taking screenshots. I’m sure if he hadn’t and had just claimed the AI chatbot told him something, the company would have mysteriously lost the logs.