this post was submitted on 10 May 2025
900 points (97.8% liked)
you are viewing a single comment's thread
The slippery slope here is that you, as an artist, hear music on the radio, in movies and TV, and in commercials. All of that listening is training your brain. If an AI company just plugged in an FM radio and learned from that music, I'm sure a lawsuit could follow that would effectively mean no one could listen to anyone's music without being tainted.
That feels categorically different unless AI has legal standing as a person. We're talking about training LLMs; at bottom, this is nothing more than people using computers.
So then would anyone who uses a computer to make music be in violation?
Or is it some amount of computer-generated content? How many notes? If it's not a sample of a song, how does one know how many of those notes are attributable to which artist being stolen from?
What if I have someone else listen to a song and then generate a few bars of a song for me? Is it different when a computer listens and then generates the output?
To me it sounds like artists were open to some types of violations but not others. If an AI model listened to the radio, most of these issues go away, unless we are saying that humans who listen to music and write similar songs are OK, but people who write music using computers that calculate the statistically most common song are breaking the law.
Potentially, yes. If you use existing IP to make music, doing it with a computer isn't going to change anything about how the law works. It does get super complicated, and there's ambiguity depending on the specifics, but mostly: if you do it in a non-obvious way and no one knows how you did it, you're going to be fine. Anything other than that and you will potentially get sued, even if whatever you did was a legally permissible use of the IP. Rightsholders generally hate it when anyone who isn't them makes money off their IP, regardless of how they do it or whether they have a right to do it, unless they paid for a license.
That sounds like a setup for only going after those you can make money from, not actually protecting IP.
By definition, if your song is a hit, it is heard by everyone. How do we show that my new song is a direct consequence of hearing song X while your new song isn't due to you hearing song X?
I can see an easy lawsuit: put out a song, then claim that anyone who heard it "learned" how to play their new album from it. The fact that AI can output something that sounds different from any individual song it learned from means we could claim nearly all works are derivative.