“But how about I just summarize it for you instead… poorly and with a few lies added in?”
My wife works at a library. People constantly come in asking to use the library fax machine, because Google’s AI says they have one.
They don’t have one. Their website says they don’t have one. But LLMs have determined it’s plausible for libraries to have a fax machine, so Google tells people that they have one.
You’d be surprised at the number of people who can’t accept that people working at the library know more about the library than Google.
I’ve had this experience myself; I’m an American living in the Netherlands and sometimes I just don’t know the name of the thing I need, nor where to buy one. LLM bots are fine for the translation part, but they make wild assumptions, like telling me I can buy a kitchen strainer at the hardware store, or food spices at a place called Kruidvat, which translates to roughly “spice barrel” but is actually most like a CVS without the pharmacy, and doesn’t sell any food besides some candy and chips.
It’s hilarious how quickly these bots can swing from super useful to genuinely harmful to trust.
They should get one! That’s a very normal thing for a library to have.
She never told them to use AI to locate and operate the fax machine?
Lol I’m sure that would end badly. Remember that QAnon crazy guy that fired shots at that pizza restaurant demanding to see the basement where they kept the kids tied up?
Google is basically a conspiracy-theory peddler at this point.
You’re right. Some might end up typing phone numbers into the alarm system and cram papers into the heating vents.
Well of course. Telling me I’m wrong is an attack on me. I’m good, not bad, so I don’t deserve to be attacked. So anyone attacking me is bad and a liar and wrong.
Also very lengthy all the time. It can’t really summarise without rambling. It’s in no way succinct and too chatty. Verbose, you might say. Garrulous even. Loquacious, if you prefer. Needs to pipe down. Can’t rein it in. Doesn’t shut the fuck up.
Let’s dig into that.
They think people use A Lot of words, because websites use A Lot of words, because if they don’t, Google deems the total page content lesser and drops its ranking (the ol’ “personal blog entry on a recipe” trope).
The machine wharrgarbles to please the humans, based on the humans who wharrgarbled to please the machine.
_Oh wow, I was reading some other websites to summarize this one, and apparently their competitor (who happened to pay for ad placement on this website’s name plus URL) says this website throws puppies into volcanos!
Oh, and you need to eat 6 rocks a day for your nutrition.
Was this summary helpful, or do you also throw puppies into volcanos?_