this post was submitted on 28 Jul 2023
436 points (93.8% liked)
Technology
Probably because LLMs threaten to (and have already started to) shittify a truly incredible number of things: journalism, customer service, books, scriptwriting, etc., all in the name of increased profits for a tiny few.
again, the issue isn't the technology, but the system that forces every technological development into functioning "in the name of increased profits for a tiny few."
that has been an issue for the fifty years prior to LLMs, and will continue to be the main issue after.
removing LLMs or other AI will not fix the issue. why is it constantly framed as if it would?
we should be demanding the system adjust for the productivity increases we've already seen, as well as for those we expect in the near future. the system should make every advancement a boon for the general populace, not the obscenely wealthy few.
even the fears of propaganda fit this pattern. the wealthy can already afford to manipulate public discourse beyond the general public's ability to keep up. the bigger issue is hiding in plain sight, but is still being largely ignored in favor of the slant that "AI is the problem."
Yep, the problem was never LLMs, but billionaires and the rich. The problem has always been the rich, for thousands of years, and yet they have been immensely successful at deflecting attacks onto other groups for those thousands of years. They will claim it's Chinese immigrants, or blacks, or Mexicans, or gays, or trans people. Now LLMs and AI are the new bogeyman.
We should be talking about UBI, not LLMs.
It’s a capitalism problem not an AI or copyright problem.
Sure, but let's say you try to solve this problem. What's the first thing you think a coordinated group could actually do: get sensible regulations about AI, or overthrow global capitalism? It's framed the way it is because, unless you want to revolt, that's the framework we're going to have to use to deal with it. I suppose we could always do nothing about AI specifically and focus on just overthrowing capitalism, but during that time lots of harm will come to lots of workers because of AI use. I don't think anticapitalism has reached a critical mass (we need this for any real system-wide attacks on and alternatives to capitalism), so I think dealing with this AI problem, while trying to let everyone else know it's really a capitalism thing, would do more to build support and avert harm to workers. I hate that it's like that too, but those choices are basically the real options we have moving forward, from my POV.
You tell me what "sensible regulations about AI" are that don't hurt small artists and creators more than they centralize the major players and enrich copyright-hoarding, copyright-maximalist corporations. (Seriously, this isn't bait. I've been racking my brain on the issue for months, because the only serious proposals so far expand already far-too-broad copyright to cover things like training, or grant artists more rights to their work during their lifetime, something that will only hurt small artists.) We desperately need more fair use, not less. The only "sensible regulation" we should and could be talking about is some form of UBI. That's it.
UBI is a bandaid that doesn't solve the core issues of production under capitalism: the people with capital still control production, still make more money than everyone else, and still have more money and power to use influencing the politicians who write the laws surrounding UBI. And expecting me to solve the AI problem in a comment section is like me asking you to implement UBI in a way that landlords don't just jack up rent, or businesses don't inflate prices with more cash and demand floating around. Also, what's your plan for when the level of UBI legislated, or the planned increases in UBI, is no longer sufficient to pay for housing, food, and other necessities? What do you do to counter the fact that the capitalists still have more access to politicians and media empires they can use to discredit and remove UBI?
UBI is a bandaid, sure. But bandaids actually help; "sensible AI regulations" (a nothing phrase that will most likely materialize as yet another expansion of copyright) will actively make things worse. UBI is achievable, and can be expanded on once it's enacted. You establish protections and regulations that actually help people, and dare the opposition to ever try to take them away, instead of carrying water for copyright maximalists along the way.
Exactly. We need to break apart copyright with a crowbar. It's a broken system that only benefits the rich, and AI has the opportunity to turn the entire system into a pile of unenforceable garbage.
Why does legislation or regulation surrounding AI necessarily have to be copyright maximalism, but UBI regulations are somehow, in some undescribed way, going to be strong enough to prevent lobbying from the people who still control the means of production? Your argument gets to use the magic regulations that don't get challenged or changed, but my argument is stuck with the one mainstream idea that has people worried?
Because those are the only "sensible AI regulations" seriously being talked about. Tell me any other actual regulatory schemes being proposed that aren't, and I'll be happy to talk about those, and likely support them. I'm not getting the hostility, btw. FWIW, this (getting stronger consumer protection laws passed) is literally my job; I'm going to go out on a limb here and say we probably agree on more than we disagree, based on your comment history. Obviously UBI won't be enough (will never be enough) to oust capitalists from having an outsized influence on policy, but what I don't support at all are regulations that would further entrench the corporate IP holders and tech companies that would actually benefit from the copyright-maximalist proposals currently being bandied about by the fear-mongering anti-generative-AI discourse.
Fundamentally we’re not going to copyright our way out of the externalities AI brings with it.
My argument is no more limited to the regulations currently being talked about than your argument is limited by the types of UBI currently being talked about, because I'm not hearing any talk on UBI either.
Friend, you haven't proposed ANYTHING except "sensible" regulations. I've asked you to elaborate on what those might be, but I'm cautioning against the regulatory schemes currently being proposed and considered. Maybe I'm too in the weeds on this issue (again, literally my job), but I can tell you the only proposals actually being discussed right now are copyright maximalist.
Because the Western world has no framework for intellectual property other than copyright, and the people who write the laws and discuss the solutions are the people already at the top, who have property to protect. If copyright in its entirety disappeared tomorrow with nothing in its stead, the very first people to lose, and to lose the most, would be the small artists, because there would be nothing to stop people or companies with massive resource pools from just steamrolling over them. In fact, I can't see that move (eliminating all copyright) doing anything but accelerating a monopoly forming around whichever company can steal and distribute art the best. My best guess is that if you try to solve that problem, you'll find solutions that at least alleviate the AI problem for smaller creators.
But again, I don't know what you're expecting of me. If I had the regulation all figured out, I probably wouldn't just be jawing about it on Lemmy, and you yourself said you cannot figure it out, and it's related to your field. Right now, like you said, more copyright will slowly strangle creators, and, like I just said, no copyright will let them be steamrolled by anyone who can sell or distribute their work for a lower price. UBI makes sure the artist doesn't die. Great, honestly; better than just letting people drop. But if all the artist's work can simply be stolen for profit by those who didn't make it, and they have to subsist on the payments everyone gets, then there's really no such thing as an artist as a vocation anymore, nor would there be enough opportunities to profit from your own artistic work. I don't know the solution any more than you do, but to me UBI isn't good enough, because I believe being an artist is labor worth being paid for.
I think we need something targeted at the AI problem, not something reactionary, and not necessarily something targeting the AI itself. Perhaps something targeting the relationship between the people who create art, the people who distribute it, and the people who profit from it, since often those aren't all the same person.
You're making a lot of jumps here. Copyright need not be eliminated entirely; that won't happen anytime soon. But fair use exists for a reason, and generative art is, if anything, a fair use machine more than a plagiarism one. Everything is a remix, and artists making art using AI tools deserve to be able to monetize that work. I, too, believe artists deserve to get paid. I've worked personally helping small artists and creators navigate those waters, writing license agreements and helping them argue fair use to create and monetize their work. Expanding copyright terms further won't help those artists get paid. It will further centralize the major corporations and big tech.
It's a siren song, a distraction from the real issue: AI represents a far bigger threat than JUST a threat to artists and writers. There is major economic upheaval right around the corner. There will be massive job displacement. Rather than further enriching corporations for misplaced short-term gain, we must consider how we can allow artists, creators, and everyone else with a creative spark to continue to exist and make art without further building up these legacy structures, which are already fraying at the seams and no longer fulfilling their original purpose, having long since been co-opted by the Disneys of the world. We need more fair use, not less. UBI of some form is the answer, as far as I can tell. We need to start giving people the freedom to work on what they want without the crushing pressure of imminent destitution if we hope to have artists making art. I say again: copyright will fundamentally not be the solution to the externalities of AI. We need more creative, more radical solutions. https://abovethelaw.com/legal-innovation-center/2023/07/21/stop-rushing-to-copyright-as-a-tool-to-solve-the-problems-of-ai/
I still have yet to argue in favor of more copyright; I don't know why you're still arguing against something we both agreed was bad. But this is the first time in a while I've heard UBI called radical in the face of capitalism. I don't think it gets at the root of the problem of capitalism, or of the smaller AI-vs-artist problem, as I've explained previously; the relations of production and profit don't budge an inch. I've also never said I oppose UBI. In fact, I said if it helps keep people fed and sheltered, that's great. I just see it as a stopgap 'solution' that buys time and comfort until the market adapts to recuperate that money, since it can expect every citizen to receive it every month; I don't trust a free market not to ruin that. Furthermore, specifically on AI versus artists, the consequences are already happening, and the push at a political level for UBI is lower than when Andrew Yang was running. Since the political level is where UBI would have to come from, I don't like the idea of not trying anything else until then.
Andrew Yang is a neoliberal hack, and I promise you UBI is about as radical as we could hope for that's still attainable in the current policy landscape. Anything more radical and we're talking actual get-out-your-guillotine revolution. Fair enough, though. I would suggest that the way you talk about addressing AI as it relates to artists dances around copyright-land more than directly and broadly focusing on labor, which IMO is the best way to tackle AI externalities. What we should be doing is building bridges between artists and the rest of labor, finding ways out of the cyberpunk dystopia not by relying on intellectual property but by building stronger social programs that lift us all up.
Simple: make UBI inflation-adjusted. Problem solved.
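For what it's worth, "inflation-adjusted" just means indexing the payment to a price index like CPI. A minimal sketch of that arithmetic (all numbers and names here are made up for illustration, not any real proposal):

```python
# Hypothetical sketch: index a UBI payment to a consumer price index (CPI).
# The payment scales by the ratio of the current index to the base-year index.

def adjust_ubi(base_payment: float, cpi_base: float, cpi_current: float) -> float:
    """Return the UBI payment scaled for inflation since the base year."""
    return base_payment * (cpi_current / cpi_base)

# Example: a $1,000/month payment set when CPI was 280; CPI has since risen to 294.
adjusted = adjust_ubi(1000.0, 280.0, 294.0)
print(round(adjusted, 2))  # 1050.0, i.e. a 5% cost-of-living increase
```

Of course, as the thread points out, the harder problem is political: who sets the index, and what stops it from being lobbied downward.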
There's no such thing as "sensible regulations" for AI. AI is a technological advantage. Any time you regulate that advantage, other groups that don't have those regulations will fuck you over. Even if you start talking about regulations, the corpos will take over and fuck you over with regulations that only hurt the little guy.
Hell, even without regulations, we're already seeing this on the open-source vs. capitalism front. Google admitted that it lost some advantages because of open-source AI tools, and now these fucking cunts are trying to hold on to their technology as close as possible. This is technology that needs to be free and open-source, and we're going to see a fierce battle with multi-billion-dollar capitalistic corporations clawing back whatever technological gains OSS acquired, until you're forced to spend hundreds or thousands of dollars to use a goddamn chess bot.
GPLv3 is key here, and we need to force these fuckers into permanent copyleft licenses that they can't revoke. OpenAI is not open, StabilityAI is not the future, and Google is not your friend.
Isn't forcing a copyleft license exactly a regulation that would be sensible, though? So why wouldn't regulation and legislation work, if that's your solution too?
There's never been a bill with the words "copyleft" or "GNU General Public License" in it at all, and thanks to corpo lobbyists, there probably never will be. We have to be realistic here, and the only realistic option is to encourage as much protected open-source software in this space as possible.
This isn’t a technological issue, it’s a human one
I totally agree with everything you said, and I know that it will never ever happen. Power is used to get more power. Those in power will never give it up, only seek more. They intentionally frame the narrative to make the more ignorant among us believe that the tech is the issue rather than the people that own the tech.
The only way out of this loop is for the working class to rise up and murder these cunts en masse
Viva la revolucion!
I completely agree with you; AI should be seen as a great thing, but we all know that the society we live in will not pass those benefits on to the average person. In fact, it'll probably be used to make life worse. From a leftist perspective it's very easy to see this, but from the normie position, at least in the US, people aren't thinking about how our society slants AI toward being evil and scary; they just think AI is evil and scary. Again, I completely agree with what you've said. It's just important to remember how reactionary the average person is.
It is a completely understandable stance in the face of the economic model, though. Your argument could be fitted to explain why firearms shouldn’t be regulated at all. It isn’t the technology, so we should allow the sale of actual machine guns (outside of weird loopholes) and grenade launchers.
The reality is that the technology is targeted by the people affected by it because we are hopeless in changing the broader system which exists to serve a handful of parasitic non-working vampires at the top of our societies.
Edit: not to suggest that I’m against AI and LLM. I want my fully automated luxury communism and I want it now. However, I get why people are turning against this stuff. They’ve been fucked six ways from Sunday and they know how this is going to end for them.
Plus, a huge amount of AI doomerism is being pushed by the entrenched, monied AI players, like OpenAI and Meta, in order to use a captured government to regulate potential competition out of existence.
Exactly. I work in AI (although not the LLM kind, just applying smaller computer vision models), and my belief is that AI can be a great liberator for humanity if we have the right political and economic apparatus. The question is what that apparatus is. Some will say it's an inherent feature of capitalism, but that's not terribly specific, nor does it explain the relatively high wealth equality that existed briefly during the middle of the 20th century in America. I think some historical context is important here.
Historical Precedent
During the Industrial Revolution, we had an unprecedented growth in average labor productivity due to automation. From a naïve perspective, we might expect increasing labor productivity to result in improved quality of life and less working hours. I.e., the spoils of that productivity being felt by all.
But what we saw instead was the workers lived in squalor and abject poverty, while the mega-rich captured those productivity gains and became stupidly wealthy.
Many people at the time took note of this and sought to answer the question: why, in an era of greater-than-ever labor productivity, is there still so much poverty? Clearly all that extra wealth is going somewhere, and if it's not going to the working class, then it's evidently going to the top.
One economist and philosopher, Henry George, wrote a book exploring this very question, Progress and Poverty. His answer, in short, was rent-seeking:
Rent-seeking takes many forms. To list a few examples:
George's argument, essentially, was that the privatization of the economic rents borne of god-given things (be it land, minerals, or ideas) allowed the rich and powerful to extract all that new wealth and funnel it into their own portfolios. George was not the only one to blame these factors as primary drivers of sky-high inequality; Nobel Prize-winning economist Joseph Stiglitz has stated:
George's proposed remedies were a series of taxes and reforms to return the economic rents of those god-given things to society at large. These include:
such as in the Norwegian model:
Present Day
Okay, so that's enough about the past. What about now?
Well, monopolization of land and housing via the housing crisis has done tremendous harm:
And that is just one form of rent-seeking. Imagine the collective toll of externalities (e.g., the climate crisis), monopolistic/oligopolistic markets such as energy and communications, monopolization of valuable intellectual property, etc.
So I would tend to say that — unless we change our policies to eliminate the housing crisis, properly price in externalities, eliminate monopolies, encourage the growth of free and open IP (e.g., free and open-source software, open research, etc.), and provide critical public goods/services such as healthcare and education and public transit — we are on a trajectory for AI to be Gilded Age 2: Electric Boogaloo. AI merely represents yet another source of productivity growth, and its economic spoils will continue to be captured by the already-wealthy.
I say this as someone who works as an AI and machine learning research engineer: AI alone will not fix our problems; it must be paired with major policy reform so that the economic spoils of progress are felt by all, not just the rich.
Joseph Stiglitz, in the same essay I referred to earlier, has this to say:
Technology is but a tool. It cannot tell you how to use it. In the hands of a writer, it's a helpful sounding board. In the hands of a Netflix producer, it's an anti-labor tool. We need to protect people's livelihoods.
Journalism and customer service can't possibly get worse than they already are.
Books and movies are not at risk - there will always be lots of people willing to write good content for both, and the best content will be published. And "the best" will be a hybrid of humans and AI working together - which is what has some people in that industry so scared. Just like factory workers were scared when machines entered that industry.
It's an irrational fear - there are still factory workers today. Probably more than ever. And there will still be human writers - it's an industry that will never go away.
If, however, you refuse to work with AI... then yeah, you're fucked. Pretty soon you'll be unemployable and nobody will publish your work, which is why the movie publishers aren't going to budge. They recognize a day is coming when they can't sell movies and TV shows made exclusively by humans, and they are never going to sign a contract locking them into a dead-end path.
You make it sound like quality is what will increase with a human/AI partnership. What will realistically increase is the expected rate of output. Why can't you deliver a book every year with an AI ghostwriting? People who work slowly or meticulously will be phased out by those who can quickly throw together a collage of their own words and AI-guided filler. It's amazing and futuristic, and it can be very useful and inspirational, but I do not share your optimism that it will make creative industries better. It will allow a single person to put together a script that would've taken a team... but the better content will now be drowning in this sea. Unfortunately, I expect an equivalent of the media explosion that happened when the reality TV format became acceptable, eventually leading to shows half filmed on phones. The end result will be double the Marvel movies every year.