this post was submitted on 06 Nov 2023
World News
Plenty of actual photographs exist of Palestinian children wielding rifles and wearing Hamas headbands. Perhaps the AI was simply trained on those images as well?
Why does it matter what the excuse is?
You shouldn't get a stereotype (or, in this case, propaganda?) from a neutral prompt.
To me, it should only “matter” for technical reasons - to help find the root of the problem and fix it at the source. If your roof is leaking, then fix the roof. Don’t become an expert on where to place the buckets.
You’re right, though. It doesn’t matter in terms of excusing or justifying anything. It shouldn’t have been allowed to happen in the first place.
I do agree that technical mistakes are interesting, but with AI the answer always seems to be creator bias. Whether it's incomplete training sets or (one-sidedly) moderated results doesn't really matter: it pushes the narrative in a certain direction, and people trust AIs to be impartial because they presume it's just a machine interpreting reality, when it never is.
...as seen by the machine.
It's amazing how easily people seem to forget that last part; they wouldn't trust a person to be perfectly impartial, but somehow they expect an AI to be.
It's amazing how easily people seem to forget that a machine uses the tools its creators provide. You can't trust AI to be impartial, because it never is: it's the product of countless choices made by people.
This is such a bore, having the same conversation over and over. The same thing happened with NFTs and whatever else was at the height of its hype cycle at the time. Don't buy into the hype; recognize both AI's potential and its shortcomings.