this post was submitted on 15 Feb 2024
622 points (99.1% liked)

Not The Onion

[–] RobotToaster@mander.xyz 83 points 9 months ago (1 children)

That seems like a stupid argument?

Even if a human employee did that, aren't organisations normally vicariously liable?

[–] atx_aquarian@lemmy.world 74 points 9 months ago (1 children)

That's what I thought of at first. Interestingly, the judge went with the angle that the chatbot is part of the company's website, so they're responsible for the info it gives. When the company tried to argue that the bot had linked to a page with contradicting info, the judge said users can't be expected to check one part of a site against another to figure out which is more accurate. Still works in favor of the common person, just through a different approach than the one I had in mind.

[–] Carighan@lemmy.world 25 points 9 months ago (1 children)

I like this. LLMs are powerful tools, but rebranding them as "AI" and cramming them into ~everything is just bullshit.

The more rulings like this, where the deploying entity is held responsible for the accuracy (or lack thereof) of the information, the better. At some point companies will notice that they cannot guarantee the correct information is the only thing provided, because that's not how LLMs work: as stochastic parrots, they sample plausible-sounding text rather than retrieve verified facts. Then they'll stop using them for a lot of things. Hopefully sooner rather than later.
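
To make the "stochastic parrot" point concrete, here's a toy sketch of temperature sampling over a made-up next-token distribution (the answers and numbers are purely illustrative, not any real model or API): asking the same question repeatedly can yield different answers, which is exactly why an operator can't guarantee the correct one is the only one provided.

```python
import math
import random

# Made-up logits a model might assign to candidate answers for a
# customer-service question like "Can I get a refund after booking?"
logits = {
    "Yes, within 90 days.": 2.0,   # the policy-correct answer
    "Yes, retroactively.":  1.5,   # a plausible-sounding wrong answer
    "No refunds.":          0.5,
}

def sample_answer(logits, temperature=1.0):
    """Softmax over logits, then sample. Higher temperature flattens the
    distribution, so less-likely (possibly wrong) answers come up more often."""
    scaled = {tok: val / temperature for tok, val in logits.items()}
    z = sum(math.exp(v) for v in scaled.values())
    r = random.random()
    cumulative = 0.0
    for tok, v in scaled.items():
        cumulative += math.exp(v) / z
        if r < cumulative:
            return tok
    return tok  # fallback for floating-point rounding at the tail

# Ask the same "question" five times: the answer varies from run to run.
for _ in range(5):
    print(sample_answer(logits))
```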

[–] sukhmel@programming.dev 2 points 9 months ago

This is actually a very good outcome, if achievable: leave LLMs to be used where nothing important is on the line, or have humans review their output.