Technology — this post was submitted on 05 Nov 2023 · 216 points (94.3% liked)
AI legally can't create its own copyrightable content. Indeed, it cannot learn. It can only produce models that we tune on datasets, and those datasets are copyrighted content. I'm a little tired of the anthropomorphizing of AIs. They are statistical models, not children.
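To make that concrete, here's a toy sketch (Python, with a made-up training sentence standing in for copyrighted text; nothing here comes from any real system) of what such a statistical model amounts to: it tallies its training data and samples from those tallies.

```python
import random
from collections import defaultdict

# Toy illustration: a "language model" as pure statistics over its training data.
# Hypothetical training text standing in for copyrighted material.
training_text = "the cat sat on the mat and the cat slept on the mat"

# The "tuning" step: just tally which word follows which in the dataset.
following = defaultdict(list)
words = training_text.split()
for current, nxt in zip(words, words[1:]):
    following[current].append(nxt)

# "Generation" is sampling from those tallies; everything comes from the dataset.
word = "the"
output = [word]
for _ in range(8):
    word = random.choice(following[word]) if following[word] else words[0]
    output.append(word)

print(" ".join(output))
```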
No sir, I didn't copy this book. I trained ten thousand ants to eat cereal, but only after running through an inkwell and then a maze, which I got them to move through in a way that deposits the ink exactly where I need it to be in order to copy this book.
The AI isn't being accused of copyright infringement. Nothing is being anthropomorphized.
Whether you write a copy of a book with a pen, or type it on a keyboard, or photograph every page, or scan it with a machine learning model is completely irrelevant. The question is: did you (the human using the pen/keyboard/camera/AI model) break the law?
I'd argue no, but other people disagree. It'll be interesting to see where the courts come down on it. And perhaps more importantly, whether new legislation is written to change copyright law.
That's called learning. You learn by taking in information, then you use that information to produce something new.
It isn't. Statistical models do not learn; that's just how we anthropomorphize them. Their parameters get biased.
You could say the same about humans.
No, you literally cannot. Maybe if you were a techbro who doesn't really understand how the underlying systems work, but has seen sci-fi and wants to use it to describe the current state of technology.
but you're still wrong if you try.
Yes, you literally can. At the very deepest level, neural networks work in essentially the same way actual neurons do. All "learning," artificial or not, is adjusting the interconnections and firing rates between nodes, biasing them toward desired outputs.
Humans are a lot more complicated in terms of size and architecture. Our cognition has many more layers of abstraction and processing (understanding, emotion, and who knows what else). But fundamentally the same process is occurring: inputs + rewards = biases. Inputs + biases = outputs.
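As a rough sketch of what I mean (Python, a toy single-neuron example; the numbers and setup are made up, and no real library is involved): the reward/error signal biases the connection weights, and the biased weights then produce the outputs.

```python
import math
import random

# Toy single "neuron": output = sigmoid(w1*x1 + w2*x2 + bias).
# "Learning" here is nothing but nudging w and bias from an error signal.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Teach the neuron the OR function from examples (inputs + rewards -> biases).
examples = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]

w = [random.uniform(-1, 1), random.uniform(-1, 1)]
bias = 0.0
lr = 0.5

for _ in range(5000):
    x, target = random.choice(examples)
    out = sigmoid(w[0] * x[0] + w[1] * x[1] + bias)  # inputs + biases = outputs
    err = target - out                               # the "reward"/error signal
    grad = err * out * (1 - out)
    w[0] += lr * grad * x[0]                         # inputs + rewards = biases
    w[1] += lr * grad * x[1]
    bias += lr * grad

for x, _ in examples:
    print(x, round(sigmoid(w[0] * x[0] + w[1] * x[1] + bias), 2))
```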
They do not. Neural networks were inspired by neurons, but it's a wild oversimplification of both neural networks and neurons to claim that they work the same way. This is the kind of thing sci-fi-watching tech bros will say, but it's incorrect.