this post was submitted on 16 Feb 2024
59 points (91.5% liked)

Men's Liberation

all 25 comments
[–] Ashelyn@lemmy.blahaj.zone 42 points 10 months ago* (last edited 10 months ago) (4 children)

My boyfriend occasionally watches YouTube shorts, mostly for the occasional good joke or cat video. He's told me that the shorts algorithm seemingly goes out of its way to show him Andrew Tate type content as well as general Daily Wire/Shapiro/conservative 'libs owned' clips. More or less, if he doesn't immediately close out the app or swipe to the next short when one of these videos comes up, his shorts feed is quickly dominated by them.

I think the big thing is that these algorithms are often trained on maximizing watch time/app usage, and there's something uniquely attention-catching to a lot of men and boys about the way viral manosphere content is constructed. A random poor setup to a skit is likely to get swiped past, but if the next clip comes swinging out of the gate with "here's how women are destroying the West" there's a certain morbid curiosity that gets some to watch the whole thing (even out of amusement/credulousness), or at least stay on the clip slightly longer than they would otherwise. If one lingers on that content to any degree, the algorithm sees that as a sign that the user wants more of it—or rather, that it would achieve its "more engagement" goals by serving up more of it.
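That feedback loop can be sketched as a toy model. Everything here is hypothetical (the function name, the dwell threshold, the scoring rule); YouTube's actual system is not public. The point is just that lingering counts as interest, regardless of why the viewer lingered:

```python
# Toy dwell-time recommender update (hypothetical, for illustration only).
def update_topic_scores(scores, topic, dwell_seconds, threshold=2.0):
    """Boost a topic whenever the user lingers past a short threshold.

    A moment of morbid curiosity counts as positive signal; only an
    instant swipe-away registers as a mild negative one.
    """
    if dwell_seconds >= threshold:
        scores[topic] = scores.get(topic, 0.0) + dwell_seconds
    else:
        scores[topic] = scores.get(topic, 0.0) - 1.0
    return scores

scores = {}
update_topic_scores(scores, "manosphere", dwell_seconds=5.0)  # hesitated, then swiped
update_topic_scores(scores, "skit", dwell_seconds=0.5)        # swiped immediately
# "manosphere" now outranks "skit" even though the user liked neither.
```

Under a model like this, the only way to register disinterest is to leave faster than the threshold, which matches the "immediately close out the app or swipe" behavior described above.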

Plus, it's grabbing ideas on what to recommend based on user data and clustered associations. It's very likely to test the waters with stuff it knows worked for others with similar profiles, even if it's a bit of a reach.
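That cluster-based guessing can be sketched like this (all names and data are made up; real systems use far richer profiles), recommending whatever engaged other users with similar histories:

```python
# Hypothetical sketch of "test the waters" recommendations drawn from
# the watch histories of users in the same cluster.
def candidate_videos(user_history, cluster_histories):
    """Rank videos the user hasn't seen by popularity among similar users."""
    seen = set(user_history)
    pool = {}
    for history in cluster_histories:
        for video in history:
            if video not in seen:
                pool[video] = pool.get(video, 0) + 1
    return sorted(pool, key=pool.get, reverse=True)

me = ["cat_video", "game_clip"]
similar_users = [["cat_video", "tate_clip"],
                 ["game_clip", "tate_clip"],
                 ["cat_video"]]
print(candidate_videos(me, similar_users))  # ['tate_clip']
```

Nothing in the user's own history mentions that content; it surfaces purely because it "worked" on overlapping profiles, which is the reach being described.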

Edit: minor sentence structure stuff

[–] azertyfun@sh.itjust.works 22 points 10 months ago (2 children)

YouTube is really bad with this. My algorithm's "vibe" is leftist/queer video essays, urbanist content, and best-ofs/clips from video game streamers.

Yet YouTube unironically and regularly puts Andrew Tate's ugly-ass mug next to Alexander Avilla and I'm just like what the actual fuck. Of the young male population I'm probably in the top 1 % least likely to click on that shit.

Hanlon's razor says this is just the shorts algorithm being insanely stupid (which it also is in other ways). But this is beyond a simple mishap: it is an incredibly dangerous and damaging ideology being (accidentally or not) sanctioned and pushed by YouTube onto impressionable youths. How is no one freaking out about this?? (It's a rhetorical question. The answer is that it serves the interests of the patriarchy.)

[–] Kichae@lemmy.ca 8 points 10 months ago (1 children)

It's not the algorithm being stupid. It's the algorithm only personalizing a limited number of things. The optimal solution is to get people stuck on things that are proven sticky, so after a few videos, it basically always starts offering up bullshit that aggros people.

It just happens faster on Shorts, because the videos are like a minute long, not 15.

[–] bouh@lemmy.world 1 points 10 months ago

I'm the same. At this point I can hardly believe it's a mishap. IMO the fascists are manipulating the algorithms, or paying to promote their content, or the platforms favor them.

I can see a propaganda structure working to manipulate the algorithm, or paying to get their content ranked higher. But I can also see the platforms themselves, Facebook, Google, etc., favoring them intentionally.

I'd argue that a lot of content on YouTube is much more interesting and click-worthy than the misogynistic crap they force-feed us.

[–] Kit@lemmy.blahaj.zone 8 points 10 months ago (1 children)

I use YouTube Shorts quite a bit and have never seen that kind of content. It's all gaming and cooking stuff for me, which matches my search history.

[–] Ashelyn@lemmy.blahaj.zone 9 points 10 months ago (1 children)

I'm inclined to believe it's one of those things where, once it enters the recommendation sphere on your account, it's really hard to make it go away without manually removing profiling info via your Google account settings. I just remembered that at one point he ran a personal experiment to see how extreme the content would get if he let it play after it started showing up. After getting kind of disturbed by how bad it got, and bored of laughing at it for amusement, he tried training the algorithm to only show him cat videos and kind of settled on where it currently is. I'm guessing his account still has a lot of those older associations attached, and the algorithm tries to rekindle them from time to time.

[–] ChristianWS 5 points 10 months ago (1 children)

YouTube has a setting somewhere to only hold your view history for 3 months. I can't recommend that enough.

I've been using that setting for a couple of years, and it changed a bit how the algorithm recommends things to me, because if I somehow get into the cesspool sphere, it will stop in about 3 months.
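A toy model of why that retention window helps (hypothetical; this is not YouTube's actual implementation): once an entry ages out, the profile simply loses that signal.

```python
from datetime import datetime, timedelta

def prune_history(history, now, retention_days=90):
    """Drop watch-history entries older than the retention window,
    mimicking an auto-delete setting (illustrative model only)."""
    cutoff = now - timedelta(days=retention_days)
    return [(video, ts) for video, ts in history if ts >= cutoff]

now = datetime(2024, 2, 16)
history = [
    ("cesspool_clip", now - timedelta(days=120)),  # old association, expires
    ("cat_video", now - timedelta(days=10)),       # recent, kept
]
kept = prune_history(history, now)
# Only cat_video survives, so a recommender reading this history
# loses the old cesspool association after ~3 months.
```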

[–] Rodeo@lemmy.ca 2 points 10 months ago (1 children)

I can't imagine being so beholden to a for profit company that I spend three whole months putting up with shit I don't want to see in the hopes that eventually it will get better.

It's just crazy to me how important YouTube is to some people.

[–] ChristianWS 2 points 10 months ago (1 children)

Meh, can't really disagree with that.

I tried using NewPipe for a while on mobile, but the issue is that, surprisingly, no algorithm at all is almost as maddening as an algorithm that's gone batshit insane.

[–] Rodeo@lemmy.ca 2 points 10 months ago (1 children)

I love NewPipe for exactly that reason. I don't like being suggested videos; I want to search for something specific and watch videos about the specific knowledge I'm seeking at the moment. If I want to waste time, I take a moment to think about something I was wondering earlier that day and look that up.

If I'm going to waste time I might as well learn something about whatever.

[–] ChristianWS 1 points 10 months ago

I like being suggested stuff based on my subscriptions; otherwise I would not find anything worth watching. It also helps when someone you're subscribed to releases a new video but, for one reason or another, you didn't notice it.

There's also the issue of not having an alternative on desktop, so you're going to use YouTube anyway.

[–] Donkter@lemmy.world 7 points 10 months ago

Using YouTube Shorts myself, I think the reason is that the algorithm is a bit janky. I notice that something it's fond of is continually pushing me clips from podcasts hosted by random comedians (I'm sure you know the type of content I'm talking about), probably because lots of that type of content gets generated every week. Sometimes it's funny, usually it's not, but it's not something I swipe away from immediately. I think these types of videos are pretty strongly linked to the right-wing talk shows that follow similar formats; the algorithm just assumes it's all similar content, i.e. white men talking. I guess it's just a speedrun of the famous "alt-right pipeline".

[–] Lumelore@lemmy.blahaj.zone 2 points 10 months ago (2 children)

I tried YouTube Shorts a few times and it kept recommending me very right-wing content, including Andrew Tate. I always swipe away as fast as I possibly can, and it's still very insistent on showing me that type of content. YouTube even knows I'm a woman, and I don't watch anything even remotely similar to what it wants to show me, so I don't really understand why it is so insistent on showing me misogynistic content. I guess they just don't want me to use their platform anymore?

[–] Ashelyn@lemmy.blahaj.zone 4 points 10 months ago

I'm pretty certain the shorts algorithm is kind of "its own thing" in a lot of ways. It's a prime "your mileage may vary" system, and because so many right wing creators upload to it, it's basically a numbers game unless you get lucky with the algorithm when it's first getting a handle on your preferences.

While I don't know this for certain, the only really effective way I've found to get the algorithm to stop showing you something is to literally close the app for a while when it puts one in front of you. Combined with searching for the Shorts you do want, I think it's possible, but the algorithm is really persistent if it thinks it should show you specific kinds of content.

At the end of the day, though, these are machine learning models, and while we can gesture at trends, nobody knows the full ins and outs of how a specific model makes its decisions. Kind of scary that we trust them with the role of curation at all in the current environment, to be honest.

[–] fidodo@lemmy.world 1 points 10 months ago (1 children)

I just checked shorts on my phone and they have like and dislike buttons. Does it still show you that stuff if you press dislike? In the top right they have a ... with options for "not interested" and "block channel" which should fully remove it if it's the same channel being recommended. I don't have any of that stuff on my feed probably because I blocked them the first time I saw them. If you didn't know those options existed then they clearly have a UI problem.

[–] Lumelore@lemmy.blahaj.zone 1 points 10 months ago (1 children)

If you don't like a video, you should just swipe away, because the algorithm is looking for engagement, and even though a dislike is negative, it's still engagement, and it will cause YouTube to show that content to more people. It can use your dislike to determine whether that video is good for people similar to you, so your dislike might change who it's suggested to, but it won't cause it to be suggested less. At least that's what I've heard; I don't know if it's actually true.
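If that's true, the mechanism would look something like this toy model (purely illustrative; YouTube's real ranking function is not public): any interaction feeds reach, and the sign of the feedback only affects targeting.

```python
# Illustrative ranking model where every interaction counts toward reach,
# so a dislike still boosts distribution. Hypothetical, not YouTube's code.
def engagement_score(interactions):
    """Total interactions of any kind drive how widely a video spreads."""
    return sum(1 for kind in interactions
               if kind in {"like", "dislike", "comment", "share"})

def preference_signal(interactions):
    """The like/dislike balance only steers *who* gets the recommendation."""
    return interactions.count("like") - interactions.count("dislike")

clip = ["dislike", "dislike", "like", "comment"]
reach = engagement_score(clip)   # 4: promoted widely despite the dislikes
steer = preference_signal(clip)  # -1: routed away from similar users
```

Swiping away without interacting is the only action in this model that adds nothing to `reach`, which is the argument for ignoring rather than disliking.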

[–] fidodo@lemmy.world 1 points 10 months ago (1 children)

Of course they'll try to figure out which demographics like which videos; that's not always a bad thing. Some people aren't interested in video game content; that doesn't mean it's bad, it's just a different demographic. They're going to use whatever signals they can find to figure out those clusters, so if you don't engage but another demographic engages heavily, they'll keep recommending it to that demographic. Giving them less signal won't make it less popular; it will just give you worse recommendations.

[–] Lumelore@lemmy.blahaj.zone 1 points 10 months ago (1 children)

If the goal is to prevent this harmful content from being recommended at all then I think it is important to just ignore it. I don't need to use youtube shorts and if it's going to be terrible then I'm just never going to interact with it anymore.

[–] fidodo@lemmy.world 1 points 10 months ago

I don't think that will achieve the goal.

[–] gapbetweenus@feddit.de 7 points 10 months ago

On one hand, a very nice short film. On the other hand, an internet provider demanding a safer internet for children feels inherently sketchy to me.

[–] JoYo@lemmy.ml 6 points 10 months ago

my feed is just cute animals and dance choreo.

[–] stevedidwhat_infosec@infosec.pub 5 points 10 months ago (1 children)

Some platforms are worse than others. YouTube shorts, looking at you, you fucking hypocrites.

[–] JoYo@lemmy.ml 1 points 10 months ago

yt music samples are a much better short video feed.

[–] fidodo@lemmy.world 0 points 10 months ago

I need to know, do you people not use either the thumbs up, thumbs down, not interested, or block channel buttons?