this post was submitted on 29 Aug 2023
785 points (96.9% liked)

Technology

[–] BeautifulMind@lemmy.world -2 points 1 year ago* (last edited 1 year ago) (8 children)

I'll agree that ISPs should not be in the business of policing speech, buuuut

I really think it's about time platforms and publishers were held responsible for content on their platforms, particularly if, in their quest to monetize that content, they promote antisocial outcomes like the promulgation of conspiracy theories, hate, and straight-up crime.

For example, Meta is not modding down outright advertising and sales of stolen credit cards at the moment. Meta is also selling information with which to target voters... to foreign entities.

[–] CeeBee@lemmy.world 1 points 1 year ago (1 children)

The problem is that your definitions are incredibly vague.

What is a "platform" and what is a "host"?

A host, in the technical sense, could mean a hosting company that you "host" a website with. If it's a private website, how would the hosting company moderate that content?

And that's putting aside the legality and ethics of one private company policing not only another private company, but also one that's a client.

[–] BeautifulMind@lemmy.world 1 points 1 year ago

Fair point about hosts. I'm talking about platforms as if we held them to the standards we hold publishers to. Publishing is protected speech so long as it's not libelous or slanderous, and the only reason we don't hold social media platforms to that kind of standard is that they demanded (and received) complete unaccountability for what their users put on them. That seemed okay as a choice to let social media survive as a new form of online media, but the result is that for-profit social media, being the de facto public square, have all the influence they want over speech and no responsibility to use that influence in ways that aren't corrosive to democracy or to the public interest.

Big social media platforms already censor content they don't like, so I'm not calling for censorship in an environment that has none. What I'm calling for is some sort of accountability to nudge them in the direction of not looking the other way when offshore troll farms and botnets spread division and disinformation.
