To be honest, I hope they win. While my passion is technology, I am not a fan of artificial intelligence at all. Decision-making is best left to human beings. I can see where AI has its place, like in gaming and some other areas, but mainstreaming it and using it to decide whose resume gets viewed and/or who gets hired? Hell no.
I'm not against artificial intelligence; it could be a very valuable tool. But that's nowhere near a valid reason to break the law the way OpenAI has, which is why I too hope the authors win.
Can’t reply directly to @OldGreyTroll@kbin.social because of that “language” bug, as well. This is an interesting argument. I would imagine the AI does not have the ability to follow plagiarism rules. Does it even credit sources? I've seen plenty of complaints from students getting in trouble because anti-cheating software flags their original work as plagiarism. More importantly, I really believe we need to take a firm stance on what is ethical to feed into ChatGPT. Right now it's the Wild West.
Good, hope they win.
I don't really understand why people are so upset by this. Except for people who train networks on someone's stolen art style, people shouldn't be getting mad at this. OpenAI has practically the entire internet as its source, so GPT is going to have so much information that any specific author barely affects the output. OpenAI isn't stealing people's art, because it isn't copying the artwork; it's using it to train models. Imagine getting sued for looking at reference artwork before creating your own.
Unless you grant personhood to those statistical inference models, the analogy falls flat.
We're talking about a corporation using copyrighted data to feed their database to create a product.
If you were ever in a copyright negotiation, you'd see that everything is relevant: intended use, audience size, sample size, projected income, length of usage, mode of transmission, quality, etc.
They've negotiated none of it, and worst of all, they commercialised it. I'd consider that real trouble.
Not to mention, if we're going to judge them based on personhood, then companies need to treat it like a person. They can't have it both ways: either pay it a fair human wage for its work, or it isn't a person.
Frankly, the fact that the follow-up question would be "well what's it going to do with the money?" tells us it isn't a person.