this post was submitted on 26 Jul 2023
663 points (99.3% liked)

[–] trimmerfrost@lemm.ee 2 points 1 year ago (3 children)

They must have proofreaders

[–] MstrDialUp@lemm.ee 10 points 1 year ago

That's optimistic.

[–] ToastyWaffles@lemmygrad.ml 6 points 1 year ago (1 child)

You know that's not how this works.

[–] itchy_lizard@feddit.it 1 point 1 year ago

No, that's exactly how this stuff works. Lay off 80% of writers and keep all your fact checkers and editors.

[–] salient_one@lemmy.villa-straylight.social 4 points 1 year ago* (last edited 1 year ago)

Probably, though that may be an optimistic assumption. Even so, I believe it will still result in more mistakes, simply because it's harder to spot errors in an existing text than it is to avoid introducing them in the first place by fact-checking beforehand and then having another person proofread.

One reason is that LLMs feel no guilt when they hallucinate, whereas most humans don't like to lie or to be too lazy to fact-check; and even those who don't care still have to worry about getting caught and damaging their reputation, which LLMs also don't have. And stating something false as fact in an article can't be called an honest mistake (it's negligence at best), unlike an editor missing something (due to a looming deadline, perhaps), especially when the workflow assumes there won't be too many hallucinations, which is far from certain.