this post was submitted on 27 May 2024
1101 points (98.1% liked)

Technology


You know how Google's new feature called AI Overviews is prone to spitting out wildly incorrect answers to search queries? In one instance, AI Overviews told a user to use glue on pizza to make sure the cheese won't slide off (pssst...please don't do this.)

Well, according to an interview with Google CEO Sundar Pichai published at The Verge earlier this week, just before criticism of the outputs really took off, these "hallucinations" are an "inherent feature" of the large language models (LLMs) that drive AI Overviews, and this feature "is still an unsolved problem."

(page 5) 50 comments
[–] dog_@lemmy.world 5 points 5 months ago

I have a solution! It's called "getting rid of it" :D

[–] shotgun_crab@lemmy.world 5 points 5 months ago (1 children)

Yeah no shit, that's what LLMs do

[–] pineapple_pizza@lemmy.dexlit.xyz 4 points 5 months ago (2 children)

I mean, it did say non-toxic glue. Which is technically edible. /s

[–] Andromxda@lemmy.dbzer0.com 4 points 5 months ago

You mean hallucinations like this one?

[–] jaybone@lemmy.world 4 points 5 months ago

But this week’s debacle shows the risk that adding AI – which has a tendency to confidently state false information – could undermine Google’s reputation as the trusted source to search for information online.

🤣🤣🤣

[–] tonytins@pawb.social 3 points 5 months ago

The problem with all these chat AIs is that they're just glorified autocorrect. They never knew what they were saying in the first place. That's why they "hallucinate".
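
For anyone wondering what "glorified autocorrect" means concretely, here is a minimal sketch of next-token generation, using the Hugging Face transformers library with GPT-2 purely as an illustrative stand-in (the models behind AI Overviews are not public). The model only ever predicts a probability distribution over the next token; a reply is just that step repeated in a loop, and nothing in the loop checks whether the output is true.

```python
# Minimal sketch of "autocomplete on repeat" (assumes torch + transformers installed;
# GPT-2 is a stand-in, not Google's model).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

prompt = "To keep the cheese from sliding off a pizza, you should"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

for _ in range(20):  # generate 20 tokens, one at a time
    with torch.no_grad():
        logits = model(input_ids).logits              # a score for every vocabulary token
    next_token_probs = logits[0, -1].softmax(dim=-1)  # distribution over the *next* token only
    next_token = torch.multinomial(next_token_probs, num_samples=1)  # sample one, like autocomplete
    input_ids = torch.cat([input_ids, next_token.unsqueeze(0)], dim=1)

print(tokenizer.decode(input_ids[0]))
# Whatever comes out is just a plausible-looking continuation; no step verifies it,
# which is one way to read "hallucinations are an inherent feature".
```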
