this post was submitted on 17 Apr 2024
[–] polygon6121@lemmy.world 1 points 7 months ago (1 children)

The model makes decisions believing it is right, but for whatever reason it can't see a firetruck or stop sign, or it misidentifies the object. You know, almost like how a hallucinating human perceives something from external sensory input that is not there.

I don't mind giving it another term, but "being wrong" is misleading. You are correct, though, in the sense that it depends on the given case.

[–] Syntha@sh.itjust.works 1 points 7 months ago (1 children)

No, the model isn't "thinking"; no model in use today has anything resembling an internal cognitive process. It is making a prediction. A COVID test predicts whether the COVID-19 virus is inside you or not. If its prediction contradicts your biological state, it is wrong. If an object-recognition algorithm does not predict that there is a firetruck, how is that not being wrong in the same way?
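The point about prediction versus cognition can be sketched in a few lines. This is a minimal toy example with hypothetical labels and confidence scores, not any real detector: the model just emits its highest-scoring label, and "right" or "wrong" is nothing more than a comparison against ground truth.

```python
# Toy sketch (hypothetical labels and scores): a classifier's output is a
# prediction, and correctness is just a comparison with ground truth --
# no internal cognitive process is involved.

def predict(scores: dict[str, float]) -> str:
    """Return the label with the highest confidence score."""
    return max(scores, key=scores.get)

# Hypothetical per-class confidence scores for one image.
scores = {"firetruck": 0.12, "bus": 0.71, "car": 0.17}

prediction = predict(scores)   # "bus"
ground_truth = "firetruck"

print(prediction == ground_truth)  # False: the model is simply wrong
```

Whether we call that failure a "hallucination" or just an incorrect prediction, the mechanism is the same comparison.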

[–] polygon6121@lemmy.world 1 points 7 months ago

Predicting? Ok, if you say so.