this post was submitted on 12 Jun 2024
392 points (95.6% liked)
Technology
They could make Siri change its voice and Genmoji based on the degree of certainty of the response.
They could sell different voice packages. Revive the ringtone market.
The AI is confidently wrong; that's the whole problem. If there were an easy way to know when it might be wrong, we wouldn't be having this discussion.
This paper tries to do that: arxiv.org/pdf/2404.04689
There are also several other techniques, I think.
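One commonly discussed technique (a sketch, not the method from the linked paper) is self-consistency: sample the model several times on the same question and use the level of agreement as a rough confidence signal. The `sample_answer` callable below is a hypothetical stand-in for an actual LLM API call.

```python
from collections import Counter
from itertools import cycle

def self_consistency_confidence(sample_answer, n_samples=10):
    """Estimate confidence by sampling the model repeatedly and
    measuring how often the majority answer recurs."""
    answers = [sample_answer() for _ in range(n_samples)]
    top_answer, count = Counter(answers).most_common(1)[0]
    return top_answer, count / n_samples

# Stub standing in for an LLM call: answers "Paris" 4 times out of every 5.
_responses = cycle(["Paris"] * 4 + ["Lyon"])
answer, conf = self_consistency_confidence(lambda: next(_responses), n_samples=10)
# With this stub, "Paris" appears 8 of 10 times, so conf == 0.8
```

A low agreement score doesn't prove the answer is wrong, but it is a cheap signal that could drive something like the voice-change idea above.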
While it can’t “know” its own confidence level, it can distinguish between general knowledge (12” in 1’) and specialized knowledge that requires supporting sources.
At one point, I had a ChatGPT memory entry instructing it to automatically provide sources for specialized knowledge, and it did a pretty good job.