this post was submitted on 27 May 2025
501 points (90.2% liked)
It's not a big deal if you aren't completely stupid. I don't use LLMs to learn topics I know nothing about, but I do use them to help figure out solutions to things I'm somewhat familiar with. In my case I find it easy to catch incorrect info, and even when I don't, most of the time if you just occasionally tell it to double-check what it said, it self-corrects.
It is a big deal. There's a whole set of ways humans gauge the validity of info, and they're orthogonal to the way we interact with fancy autocomplete.
Every single word might be false, with no pattern to it. So if you can check it and actually do, you're just wasting your time and humanity's resources instead of finding the info yourself in the first place. If you don't check, or only think you do, it's even worse: you're being fed lies and believing them extra hard.