sc_griffith

joined 1 year ago
 

the symbolism is 😗👌

archive link: https://archive.ph/jFO6f

 

archive link

https://archive.ph/n3Ffq

Judge Mehta's Google decision is likely to be appealed. "Regardless of who wins or loses, this case probably has a date with the Supreme Court," Mr. Kovacic said.

ah well

 

"subreddit rules. Speak pro-ai thoughts freely."

DefendingAIArt is a subreddit run by mod "Trippy-Worlds," who also runs the debate sister subreddit AIWars. Some poking around made clear that AIWars is perfectly fine with having overt Nazis around, for example a guy with heil hitler in his name who accuses others of lying because they are "spiritually jewish." So we're off to a great start.

the first thing that drew my eye was this post from a would-be employer:

My hobby is making games. Every artist I have spoken to regarding my current project has rejected currency in exchange for referencing AI-made images.

not really clear what the title means, but this person seems to have had a string of encounters with the most based artists of all time.

Has anyone experienced this? They see AI work and lose their mind, some even have the nads to expect to get a pay multiplier to "compensate" for the "theft" like my surname is fucking Altman. Like, bro, I can barely afford your highly-accomplished and talented ass and would be doing it for myself if I had your skillset, yet you reject my money with prejudice because I pushed my shitty programmer art a bit further with a piece of software which I can't even use to a fraction of its full potential? That's a greeeeeeeeaaat way to convince me to keep your artstation username out of my prompts to public models, even if I believe that particular spirit of behavior should be illegal

also claims to have been called "racial and gender slurs" for using ai art and that he was "kicked out of 20 groups" and some other things. idk what to tell this guy, it legitimately does suck that wealthy people have the money to pay for lots of art and the rest of us don't

Could we Ban the "No AI" Symbol? Someone proposed an idea to me: why not gather evidence and present it to the authorities who prohibited the display of the Swastika and other hate symbols? I was impressed by this suggestion. After researching, I found out that there are organizations that can categorize it as illegal if we can show evidence of the harm it has caused. I believe we can unite people, including artists who have suffered due to false accusations by anti-AI rioters, to support this cause. If we all sign a petition, we can ban the symbol, which would prevent its misuse on platforms like DeviantArt and stop the spread of misinformation. Would you support this initiative? Would you sign to end ignorance and compel them to advocate for fair regulations for AI, ensuring that nobody has to encounter this symbol and that those who use it for malicious purposes find no refuge? Or is it just not possible? Let's discuss.

I really enjoyed browsing around this subreddit, and a big part of that was seeing how much the stigma around AI gets to people who want to use it. pouring contempt on this stuff is good for the world

the above guy would like to know what combination of buttons to press to counter the "that just sounds like stealing from artists" attack. a commenter leaps in to help and immediately impales himself:

'just block and move on'
'these are my real life friends'
'oh...'

you hate to see it. another commenter points out that well ... maybe these people just aren't your friends

'antis will always just stab you in the back'

to close out, an example of fearmongering:

So I made a post on a sub with a rule against AI art and the Auto-mod does this... I'm assuming its fearmongering right? automod: Your comments and posts are being sold by Reddit to Google to train AI. You cannot opt out.

 

This is an essay about 'village' vs 'control' techno-optimism I wrote for a class final in 2016. I was in undergrad etc etc but for 2016 I feel it had a lot of foresight, and there's still some bits here and there I haven't seen anyone else explicate. Thought some of you might be interested

[–] sc_griffith@awful.systems 2 points 8 months ago* (last edited 8 months ago)

I don't see any reason being trained on writing informed by correct knowledge would cause it to be correct frequently. unless you're expecting it to just verbatim lift sentences from training data

the site you are imagining, the supposed free speech site? it converges to gab. this dynamic is basic and I can't take you seriously if you don't get this.

  • nazis are encouraged to be equal voices on a platform
  • they use the platform's reach to radicalize fence sitters
  • other users, realizing their digital roommates are Nazis, are alarmed and leave
  • now it's a nazi site

what exactly do you think substack will consist of in two years if they don't do a 180? the entire reason we're having this conversation right now is that a bunch of substack writers said they would rather leave than hang out with nazis

[–] sc_griffith@awful.systems -1 points 1 year ago (1 children)

I cited obvious examples where extremist ideology got supercharged and organized through the wide reach mainstream platforms provide and you're like 'uh what difference does it make. disagree.' you are not a serious person

[–] sc_griffith@awful.systems 1 points 1 year ago (2 children)

nobody but nazis wants to be on those lol. go post on gab or whatever if you want that. it's free. you can do it. you just don't actually want to

[–] sc_griffith@awful.systems -1 points 1 year ago (3 children)

if you don't ban them that just happens on the mainstream platforms. big chunks of j6 were organized on twitter and facebook. qanon mostly spread off the chans, on mainstream platforms. giving extremists access to fence sitters isn't like throwing water on a fire, it's like throwing fuel on it

[–] sc_griffith@awful.systems 1 points 1 year ago* (last edited 1 year ago) (4 children)

by causal link, I mean how does banning nazis cause support groups for non-offending pedophiles to get banned. like how does that actually happen. please be as specific as you can be

[–] sc_griffith@awful.systems 2 points 1 year ago* (last edited 1 year ago)

fascinated that you think it would somehow be harder for you to go out and find nazis if substack weren't hosting and paying them. it will always be easy to find and read Nazi content. the reason substack matters is that the platform helps THEM find YOU, or a suggestible journalist, or a suggestible politician, etc. you are not the protagonist here

[–] sc_griffith@awful.systems 1 points 1 year ago (6 children)

sorry what exactly about banning nazis causes one to ban non-offender pedophile support groups. like what is the actual causal link you're suggesting? if you just mean "I noticed random people endorse this thing I have no opinion on, and also this similar sounding thing I think is bad," that's not super compelling

[–] sc_griffith@awful.systems 1 points 1 year ago (5 children)

wow that's terrible, that they'd circlejerk each other instead of having a mass audience to post propaganda to. I can't imagine a worse outcome

to be clear, I meant that the reasons you're wrong are discussed in the article. I did not mean that the content of the article is more correct than that of the headline - the headline and the article are both correct. I suggest you read the article

[–] sc_griffith@awful.systems 7 points 1 year ago (2 children)

this is discussed in the article

But one of those stoppages is for the purpose of improving the lives of working class people, and in particular involves humans who can communicate with first responders. It's constructive. Those people's children won't live in poverty

The other is a side effect of a shoddy product, one which only operates because it corruptly evaded regulatory consequences for its shoddiness. The stoppage was only intended in the sense that cutting corners is the reason the product is on the market; otherwise it serves no specific purpose

It's true that the robotaxi fuckup is bad and the protest is less bad (or even good), but fundamentally they're not even the same type of thing
