this post was submitted on 28 Apr 2024
231 points (93.3% liked)
Technology
There are already stories about companies being sued because their AI gave advice that caused the customer to act in a manner detrimental to themselves (something about plane flight refunds being available, if I remember correctly).
Then, when they contacted the company to complain (and perhaps get the promised refund), they were told that there was no such policy at the company. The customer had screenshots. The AI had wholesale hallucinated a policy.
We all know how this is going to go: AI left, right, and centre until it costs companies more in AI hallucination lawsuits than it does to employ people to do the job.
And all the while they'll be ~~bribing~~ lobbying government representatives to make AI hallucination lawsuits not a thing. Or less of a thing.
On the other hand, are you implying that human call centre workers are accurate in what they tell customers, and that when they make mistakes the companies will own up to those mistakes and honour them?
I mean, that's generally how it is now. If a rep lied to me, then you'd better believe I'm talking to the manager and going to extract some concession. The difference is that you can hold a rep accountable; dunno how you do that with an AI.