this post was submitted on 27 May 2025
544 points (97.9% liked)

Technology

  • Nick Clegg, former Meta executive and UK Deputy Prime Minister, has reiterated a familiar line when it comes to AI and artist consent.
  • He said that any push for consent would “basically kill” the AI industry.
  • Clegg added that the sheer volume of data that AI is trained on makes it “implausible” to ask for consent.
(page 3) 44 comments
[–] CosmoNova@lemmy.world 188 points 1 week ago (5 children)

If abiding by the law destroys your business, then you are a criminal. Simple as.

[–] Even_Adder@lemmy.dbzer0.com 8 points 1 week ago* (last edited 1 week ago) (3 children)

But the law largely works the other way around: it only denies the use of copyrighted works in certain ways. Using things "without permission" forms the bedrock on which artistic expression and free speech are built.

AI training isn’t only for mega-corporations. Setting up barriers like these only benefits the ultra-wealthy and will end with corporations gaining a monopoly on a public technology by making it prohibitively expensive and cumbersome for regular folks. What the people writing this article want would mean the end of open access to competitive, corporate-independent tools and would jeopardize research, reviews, reverse engineering, and even indexing information. They want you to believe that analyzing things without permission somehow goes against copyright, when in reality, fair use is a part of copyright law, and the reason our discourse isn’t wholly controlled by mega-corporations and the rich.

I recommend reading this article by Kit Walsh and this one by Tory Noble, staff attorneys at the EFF; this one by Katherine Klosek, the director of information policy and federal relations at the Association of Research Libraries; and these two by Cory Doctorow.

[–] ICastFist@programming.dev 15 points 1 week ago (1 children)

They want you to believe that analyzing things without permission somehow goes against copyright, when in reality, fair use is a part of copyright law, and the reason our discourse isn’t wholly controlled by mega-corporations and the rich.

Ok, but is training an AI so it can plagiarize, often verbatim or with extreme visual accuracy, fair use? I see the first two articles argue that it is, but they don't mention the many cases where the crawlers and scrapers ignored rules set up to tell them to piss off. That would certainly invalidate several cases of fair use.

Instead of charging for everything they scrape, the law should force them to release all their data and training sets for free. "But they spent money and time and resources!" So did everyone who created the stuff they're using for their training, so they can fuck off.

The article by Tory also says these things:

This facilitates the creation of art that simply would not have existed and allows people to express themselves in ways they couldn’t without AI. (...) Generative AI has the power to democratize speech and content creation, much like the internet has.

I'd wager 99.9% of the art and content created by AI could go straight to the trashcan and nobody would miss it. Comparing AI to the internet is like comparing writing to doing drugs.

[–] Even_Adder@lemmy.dbzer0.com -2 points 1 week ago (6 children)

Ok, but is training an AI so it can plagiarize, often verbatim or with extreme visual accuracy, fair use? I see the first two articles argue that it is, but they don’t mention the many cases where the crawlers and scrapers ignored rules set up to tell them to piss off. That would certainly invalidate several cases of fair use.

You can plagiarize with a computer with copy & paste too. That doesn't change the fact that computers have legitimate non-infringing use cases.

Instead of charging for everything they scrape, the law should force them to release all their data and training sets for free.

I agree

I’d wager 99.9% of the art and content created by AI could go straight to the trashcan and nobody would miss it. Comparing AI to the internet is like comparing writing to doing drugs.

But 99.9% of the internet is stuff no one would miss. Things don't have to have value to you to be worth having around. That trash could serve as inspiration for the 0.1%, or garner feedback that helps people improve.

[–] tane6@lemm.ee 1 points 1 week ago

The fact that this is upvoted is so funny but unsurprising given the types who visit this site

[–] technomad@slrpnk.net 85 points 1 week ago

Good, then it should die

[–] ProfessorScience@lemmy.world 61 points 1 week ago (1 children)

If I ran the zoo, then any AI that trained on intellectual property as if it were public domain would automatically become public domain itself.

[–] db0@lemmy.dbzer0.com 27 points 1 week ago

That's the only correct take

[–] mustbe3to20signs@feddit.org 55 points 1 week ago (3 children)

The audacity... If our technology isn't allowed to break the law, it will fail. Therefore we should change the law.

[–] rottingleaf@lemmy.world 9 points 1 week ago

The issue is that they want to change the law only for themselves. Distributing, say, a partially reverse-engineered, cleaned-up, modded version of Windows NT that restored the feel of W2K and kept compatibility with Windows device drivers? That they don't want to make legal.

Generally yes, laws are subject to common sense and are changed when common sense dictates so.

[–] kipo@lemm.ee 7 points 1 week ago (1 children)

I think the law should be changed; copyright law is kind of a mess, but I don't know how to make it better. It would also need to be changed in a way that's fair, which these companies absolutely do not want; fairness would mean the end of laws like the DMCA.

[–] mustbe3to20signs@feddit.org 4 points 1 week ago

The current system is far from perfect and needs an update but any change in their spirit would make things worse.

[–] pastermil@sh.itjust.works 50 points 1 week ago (2 children)
[–] Treczoks@lemmy.world 27 points 1 week ago

Indeed. Simply that. If a business is not sustainable without breaking the law, it is not a business, it's a criminal organisation.

[–] GreenKnight23@lemmy.world 37 points 1 week ago

if something so simple can kill an entire industry, that industry should not exist.

[–] Brotha_Jaufrey@lemmy.world 37 points 1 week ago (3 children)

Good, I think it should be killed.

[–] nickwitha_k@lemmy.sdf.org 30 points 1 week ago

Pure entitlement mindset.

If your business is not able to stay afloat while providing fair compensation to those whose labor is used, whether employee, co-owner, or third-party, you are not entitled to keep running it. Society doesn't have a duty to prop up wealthy thieves.

[–] palordrolap@fedia.io 28 points 1 week ago (1 children)

Using the same logic, it is "implausible" that we would not take money from those who have it and give it to the sheer volume of people who need it.

Oh. Suddenly it doesn't work that way. Huh. Funny how that is.

[–] pelya@lemmy.world 2 points 1 week ago

It depends on how rich you are. CEOs have their own, reduced edition of the law.

[–] Stern@lemmy.world 28 points 1 week ago

Bank robbers say laws against bank robbery will kill bank robbery.

[–] DJDarren@sopuli.xyz 25 points 1 week ago

Fuck Nick Clegg. Fuck that guy into the fucking sun.

Back in 2010 he managed to surf a wave of genuine optimism from young British voters who wanted something less shit, and found himself in a position where he could have brought about some genuine change for the better.

Instead that cunt hitched his wagon to the fucking Tories, who straight away announced an increase to university tuition fees. And who then went on to spend 15 years raping and pillaging the country like only fucking Tories can.

So yeah, fuck Nick Clegg.

[–] misterdoctor@lemmy.world 24 points 1 week ago
[–] DarkDarkHouse@lemmy.sdf.org 17 points 1 week ago

AI is not just limited to these overhyped plagiarism machines. Will consent laws kill vision systems? Will they kill classifiers? Will they kill gradient descent? No, they won't.

[–] frezik@midwest.social 16 points 1 week ago

There's a thread of thought that pops up in pro-AI posters from time to time: technology can't go backwards. The implication being that the current state of AI can only improve, and is here to stay.

This is wrong. Companies are spending multitudes of piles of cash to make AI work, and they could easily take their ball and go home. Extending copyright over the training data would likely trigger that, by the industry's own admission.

No, self-hosted models are not going to change this. A bunch of people running around with their own little agents aren't going to sustain a mass market phenomenon. You're not going to have integration in Windows or VisualStudio or the top of Google search results. You're not going to have people posting many pics on Facebook of Godzilla doing silly things.

The tech can go backwards, and we're likely to see it.

[–] yesman@lemmy.world 16 points 1 week ago

Also Clegg

asking women for permission would ruin my sexlife.

probably.

[–] surph_ninja@lemmy.world 13 points 1 week ago* (last edited 1 week ago) (1 children)

I’m not a fan of intellectual property law. I’m down to abandon it, once we establish an artist stipend to pay a regular salary for artists to live a life of dignity.

Maybe introduce a tax on AI to pay for it.

[–] mPony@kbin.earth 5 points 1 week ago

yeah a 110% tax

[–] MonkderVierte@lemmy.ml 12 points 1 week ago (4 children)

What did the media industry call pirates? Parasites?

[–] ryper@lemmy.ca 12 points 1 week ago* (last edited 1 week ago)

Well, the AI companies and investors should have understood that building an industry on doing something legally questionable was risky, and risks don't always work out.

[–] ZombieMantis@lemmy.world 12 points 1 week ago
[–] HakunaHafada@lemm.ee 9 points 1 week ago
[–] nyan@lemmy.cafe 8 points 1 week ago

[tardigrade_violin.jpg]

[–] dependencyinjection@discuss.tchncs.de 7 points 1 week ago (1 children)

It’s implausible that I would pay for Netflix, Disney+, Paramount+, etc.

When I can just build a server and buy a VPN connection.

[–] IllNess@infosec.pub 4 points 1 week ago (2 children)

Makes sense. Paying for all those services would kill my ability to support other industries. Fair game.

[–] flandish@lemmy.world 7 points 1 week ago

file under: “unsurprising capitalist takes”

Come on Disney! Use your god tier copyright lawyers and stop this AI shit for good.

[–] rottingleaf@lemmy.world 4 points 1 week ago

Yes please.

[–] hamsterkill@lemmy.sdf.org 1 points 1 week ago

Terminator 7: Robot Pirates
