this post was submitted on 17 May 2024
88 points (100.0% liked)


The Commission is concerned that the systems of both Facebook and Instagram, including their algorithms, may stimulate behavioural addictions in children, as well as create so-called 'rabbit-hole effects'. The Commission is also concerned about the age-assurance and verification methods put in place by Meta.

Today's opening of proceedings is based on a preliminary analysis of the risk assessment report sent by Meta in September 2023, Meta's replies to the Commission's formal requests for information (on the protection of minors and the methodology of the risk assessment), publicly available reports as well as the Commission's own analysis.

The current proceedings address the following areas:

  • Meta's compliance with DSA obligations on assessment and mitigation of risks caused by the design of Facebook's and Instagram's online interfaces, which may exploit the weaknesses and inexperience of minors and cause addictive behaviour, and/or reinforce the so-called 'rabbit hole' effect. Such an assessment is required to counter potential risks to the physical and mental well-being of children as well as to the respect of their rights.
  • Meta's compliance with DSA requirements in relation to the mitigation measures to prevent access by minors to inappropriate content, notably age-verification tools used by Meta, which may not be reasonable, proportionate and effective.
  • Meta's compliance with DSA obligations to put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors, particularly with regard to default privacy settings for minors as part of the design and functioning of their recommender systems.
[–] Lemongrab@lemmy.one 5 points 6 months ago (1 children)

The EU is not a hero. It does better than the bare minimum and looks great when compared to the US and other fascist countries. The EU keeps talking about banning encrypted messaging and installing backdoors into encryption. It also isn't uniform in its application of privacy laws. Not that US companies aren't terrible (they are), but so are all companies. When money is the goal, all that results is an insatiable hunger to leech every possible source of capital and destroy our inalienable rights to reach their impossible goals.

[–] Lemongrab@lemmy.one 2 points 6 months ago (1 children)

I hope it isn't just a fine. A fine is just the cost of doing business. We can't let them get away with using capital to do crimes.

[–] themurphy@lemmy.ml 1 points 6 months ago (1 children)

Did you forget to change profiles when answering yourself?

[–] Lemongrab@lemmy.one 1 points 5 months ago* (last edited 5 months ago)

No, I was just adding some stuff and was too lazy to edit my comment