this post was submitted on 22 Dec 2023
851 points (96.4% liked)

More than 200 Substack authors asked the platform to explain why it’s “platforming and monetizing Nazis,” and now they have an answer straight from co-founder Hamish McKenzie:

I just want to make it clear that we don’t like Nazis either—we wish no-one held those views. But some people do hold those and other extreme views. Given that, we don’t think that censorship (including through demonetizing publications) makes the problem go away—in fact, it makes it worse.

While McKenzie offers no evidence to back these claims, they track with the company's established hands-off approach to moderation. In April, Substack CEO Chris Best appeared on the Decoder podcast and refused to answer questions about moderation. "We're not going to get into specific 'would you or won't you' content moderation questions" over the issue of overt racism being published on the platform, Best said. McKenzie later followed up with a statement similar to today's, saying "we don't like or condone bigotry in any form."

you are viewing a single comment's thread
[–] helenslunch@feddit.nl -3 points 11 months ago (1 children)

Kicking them off the platform just sends them to other echo chambers like False Social, where they circlejerk each other all day unchallenged.

[–] sc_griffith@awful.systems 1 points 11 months ago (1 children)

wow that's terrible, that they'd circlejerk each other instead of having a mass audience to post propaganda to. I can't imagine a worse outcome

[–] helenslunch@feddit.nl 2 points 11 months ago (1 children)

You can't see how that might further radicalize a group of people susceptible to being easily manipulated?

[–] sc_griffith@awful.systems -1 points 11 months ago (1 children)

if you don't ban them that just happens on the mainstream platforms. big chunks of j6 were organized on twitter and facebook. qanon mostly spread off the chans, on mainstream platforms. giving extremists access to fence sitters isn't like throwing water on a fire, it's like throwing fuel on it

[–] helenslunch@feddit.nl 2 points 11 months ago (1 children)

if you don't ban them that just happens on the mainstream platforms

What difference does it make where it happens? At least on mainstream platforms they're easier to track and they are regularly challenged.

giving extremists access to fence sitters isn't like throwing water on a fire, it's like throwing fuel on it

Disagree.

[–] sc_griffith@awful.systems -1 points 11 months ago (1 children)

I cited obvious examples where extremist ideology got supercharged and organized through the wide reach mainstream platforms provide, and you're like 'uh, what difference does it make. disagree.' you are not a serious person

[–] helenslunch@feddit.nl 1 points 11 months ago* (last edited 11 months ago)

The fallacy you're making is assuming that those movements wouldn't have been, or would no longer be, "supercharged" on a radicals-only platform. You are a disingenuous person.