this post was submitted on 02 Jul 2023
133 points (95.2% liked)
Technology
AI is not human. It doesn’t learn like a human. It uses statistics over what it’s seen before to predict what comes next.
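The “statistically find what comes next” claim can be made concrete with a toy bigram model. This is a minimal sketch, not how modern LLMs actually work internally (they use neural networks over tokens, not word counts), but the predict-the-next-step framing is the same. The training sentences are invented for the example.

```python
import random
from collections import defaultdict

# Tiny made-up "training corpus" for illustration only.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
]

# Count which word follows which in the training text.
counts = defaultdict(lambda: defaultdict(int))
for sentence in corpus:
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1

def next_word(prev):
    # Sample the next word in proportion to how often it
    # followed `prev` in the training data.
    options = counts[prev]
    words = list(options)
    weights = list(options.values())
    return random.choices(words, weights=weights)[0]

print(next_word("sat"))  # always "on" -- that's all it ever saw after "sat"
```

Everything the model can ever emit is drawn from frequencies in its training data; there is no mechanism here other than counting and sampling.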
AI isn’t learning, it’s just regurgitating the content it was fed in different ways
But is the output original? That’s the real question here. If humans are allowed to learn from information publicly available, why can’t AI?
No, it isn’t original. Output of AI is just reorganized content that it already has seen.
AI doesn’t learn, it doesn’t create derivative works. It’s nothing more than reshuffling what it’s already seen, to the point that it will frequently use phrases pulled directly from training data.
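The point about phrases pulled directly from training data can be shown with the same toy setup: when a context appeared only once in training, a purely statistical model has exactly one continuation available, so generation replays the training text verbatim. The training sentence here is invented for the example; real models memorize in subtler ways, but the mechanism is analogous.

```python
# One made-up training sentence, seen exactly once.
training = "be excellent to each other".split()

# Build a next-word lookup; each context has a single continuation.
follows = {prev: nxt for prev, nxt in zip(training, training[1:])}

# "Generate" from the first word: with one option per context,
# generation just regurgitates the training data word for word.
word, output = training[0], [training[0]]
while word in follows:
    word = follows[word]
    output.append(word)

print(" ".join(output))  # identical to the training sentence
```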
You are saying that it isn’t original content because AI can’t be original. I’m saying if the content isn’t distinguishable from original content, and can’t be directly traced to the source, in what way is it not original?
Because it’s still not creating anything. AI can’t create, it just reorganizes.
I think you hear a lot of college students say the same thing about their original work.
What I need to see is output from an AI next to the original content, side by side, so I can say “yeah, the AI ripped this off”. If you can’t show that, then the AI is effectively emulating human learning.
No it isn’t
AI is math. That’s it. This over-humanization is scary, because people can’t see the difference. It does not learn like a human.
https://www.vice.com/en/article/m7gznn/ai-spits-out-exact-copies-of-training-images-real-people-logos-researchers-find
https://techcrunch.com/2022/12/13/image-generating-ai-can-copy-and-paste-from-training-data-raising-ip-concerns/amp/
https://www.technologyreview.com/2023/02/03/1067786/ai-models-spit-out-photos-of-real-people-and-copyrighted-images/amp/
It’s a well-established problem. Tech companies have explicitly told employees not to use these services on company hardware or servers. The data is not abstracted away from the user, and these models have been shown to output data that was fed into them.