this post was submitted on 10 Jul 2023
350 points (92.9% liked)

Asklemmy

Hi, I am a guy in my early thirties with a wife and two kids, and whenever I go on YouTube it always suggests conservative things like guys dunking on women, Ben Shapiro reacting to some bullshit, Joe Rogan, all the works. I almost never allow it to go to that type of video, and when I do it is either by accident or out of curiosity. My interests are gaming, standup comedy, memes, react videos, metallurgy, machining, blacksmithing, with occasional songs and videos about funny bullshit. I am not from America, and I consider myself pretty liberal if I had to put it into terms used in America; but European liberal, so by American standards a socialist. Why does it recommend this shit to me? Is this some kind of vector for radicalization of guys in my category? Do you have similar experiences?

top 50 comments
[–] josephsh98@lemmy.kde.social 69 points 1 year ago* (last edited 1 year ago) (4 children)

You most probably viewed these types of videos a few times and the algorithm started recommending them to you. It only takes a couple of videos for the algorithm to start recommending videos on the same topic. You can easily solve this by clicking the three dots next to the video and selecting "Not interested"; do it enough times and they'll be gone from your feed.

[–] Max_P@lemmy.max-p.me 54 points 1 year ago* (last edited 1 year ago) (1 children)

This, the algorithm doesn't care whether you ~~like~~ enjoy it or not, it cares whether you engage with it or not. Even dislikes are engagement.

[–] josephsh98@lemmy.kde.social 15 points 1 year ago (3 children)

I'm not talking about liking the video, I'm talking about viewing the video. Viewing the video, especially a large part of it, will contribute to the watch time, therefore affecting the algorithm's recommendations.

[–] Max_P@lemmy.max-p.me 11 points 1 year ago* (last edited 1 year ago) (1 children)

I meant "like" as in enjoy, not as in clicking the thumbs-up button, sorry for the confusion. Even a video that pisses you off, that you thumb down along with every comment, is considered engagement, and it not only feeds you more of the same but also boosts the video as a whole.

All it cares about is whether it keeps you on YouTube. It's like news: outrage is profitable even if everyone hates it.
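To make the point concrete, here's a toy sketch of sign-blind engagement scoring. Nothing here is YouTube's actual model; the event types and weights are all made up, the only real claim is that a dislike scores the same as a like:

```python
# Toy engagement score: the sign of the reaction doesn't matter,
# only that you interacted at all. All weights are hypothetical.
EVENT_WEIGHTS = {
    "watch_seconds": 0.01,  # credit per second watched
    "like": 1.0,
    "dislike": 1.0,   # a dislike is worth exactly as much as a like
    "comment": 2.0,   # angry comments count double
    "share": 3.0,
}

def engagement(events):
    """Sum weighted interaction counts; hate-watching scores just as well."""
    return sum(EVENT_WEIGHTS[kind] * count for kind, count in events.items())

# Hate-watching: 10 minutes, a dislike, and two furious comments.
print(engagement({"watch_seconds": 600, "dislike": 1, "comment": 2}))  # 11.0
```

A ranker built on a score like this has no way to distinguish "loved it" from "disliked it and argued in the comments for ten minutes".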

[–] Viper_NZ@lemmy.nz 25 points 1 year ago (5 children)

I am constantly bombarded with Jordan Peterson videos despite disliking them and telling the algorithm to show less like this.

I’m not sure how it profiles people, but it sucks.

[–] can@sh.itjust.works 23 points 1 year ago (1 children)

Disliking is engagement. Instead tap the three dots and select "not interested"

[–] Poob@lemmy.ca 53 points 1 year ago* (last edited 1 year ago) (4 children)

If I accidentally watch a Linus Tech Tips video, that will be all it recommends me for the next month.

I watched a Some More News video criticizing Jordan Peterson, and Google thought "did I hear Jordan Peterson? Well in that case, here's 5 of his videos!"

Almost all content algorithms are hot garbage, not interested in serving you what you want, just what makes money. They always end up serving right-wing nutjobs because conspiracy theorists watch a lot of scam videos.

Edit: my little jab at Linus has nothing to do with politics. I have no idea what his views are. I only mentioned it to point out how YouTube will annihilate my recommendations if I watch a single one of his videos.

[–] const_void@lemmy.ml 32 points 1 year ago

Because 'conservative' content gets a lot of engagement (i.e., ad money). The more they recommend it, the bigger the audience, the bigger the ad payout. They're literally monetizing hate.

[–] Cyder@lemm.ee 29 points 1 year ago (6 children)

Maybe you have the same problem I have: my wife is still a republican. When that kind of stuff shows up, I know she has been watching it on the family PC. She's not that tech savvy, so I usually go in later and block or limit some of it. It's a pain to fight the algorithms.

[–] Machinist@lemmy.world 29 points 1 year ago (1 children)

Machining and blacksmithing are highly correlated with right wing BS in the US. Check my uname and ask me how I know. 😁

[–] bobthened@feddit.uk 25 points 1 year ago* (last edited 1 year ago) (1 children)

My interests are gaming, standup comedy, memes, react videos, metallurgy, machining, blacksmithing,

Because a lot of the guys who like those stupid conservative videos also like many of the same things as you. Gaming^(TM) is well known to have a problem with the alt-right, react videos have a very similar structure to conservative "libs destroyed with facts and logic" videos, and a lot of conservatives like to think of themselves as an old-fashioned man's man, so they enjoy things like metalwork and other typically "manly" trades.

[–] RocksForBrains@lemm.ee 25 points 1 year ago (1 children)

Conservative propaganda is a highly coordinated and well-funded network. They pay for preference.

[–] SeaJ@lemm.ee 24 points 1 year ago (1 children)

Your interests have a strong correlation with people on the right, aside from maybe react videos.

But even if your interests were not so strongly correlated with the right, you would probably still get right-wing ads or videos suggested. They garner the highest engagement because they are often outrage porn. Google gets its money that way. My subscriptions are to left-wing political channels, science, and solar channels, but I still get a decent amount of PragerU and Matt Walsh ads. Reporting them does not stop them from popping up either.

[–] cuppaconcrete@aussie.zone 6 points 1 year ago

Yeah big tech loves to throw dumb stuff your way to piss you off and keep you engaged, even if you've never shown an interest before.

[–] MiloSquirrel@lemmy.ml 24 points 1 year ago

For me, if I ever look up Warhammer 40k it immediately starts sending me losers like the Quartering or other channels like him.

Like, no YouTube, I don't want to hear about how feminism and "wokes" are ruining Warhammer. I also don't want to be sent Sargon of Akkad videos. Ugh. Lmao

[–] NoMoreCocaine@lemm.ee 21 points 1 year ago (2 children)

Not to be that guy, but you can tell YouTube "don't recommend this channel again". I haven't seen the Quartering, Asmongold, etc. for years now, unless I search for them.

[–] Tartas1995@discuss.tchncs.de 11 points 1 year ago (1 children)

I had the issue that I got Andrew Tate shit recommended. I said "don't recommend that" and blocked the uploader. YouTube still suggested that video to me. Exactly that video.

[–] NightOwl@lemmy.one 20 points 1 year ago

Algorithms. It's why I use NewPipe on mobile and FreeTube on desktop. Both let me import and export my subscription list, block ads, and have SponsorBlock support.

But most importantly, they don't use the Google algorithm to recommend videos that might be of the radical sort based on a misinterpretation of past viewing history.

[–] masquenox@lemmy.ml 18 points 1 year ago

Right-wing media is well-funded.

[–] wtvr@sh.itjust.works 16 points 1 year ago (1 children)

It's because you are a guy in your 30s. You (and I) are in a shitty demographic, so the algorithm peddles us shit. It doesn't matter how many times I mark shitheel videos by the people you and I mentioned as "do not recommend channel", I will always have Rogan or Peterson popping up in my feed to ruin my day. It drives me crazy.

[–] GentlemenPreferBongs@lemm.ee 15 points 1 year ago

I've been vegetarian since 2009 and in the past 2-3 years I keep getting all kinds of bow hunting videos.

It's particularly amusing when my plan was to watch an old episode of "The Golden Girls."

They REALLY misread my demographic.

[–] stiephel@feddit.de 15 points 1 year ago (1 children)

I'm 30 and have a small family, too. When I watch Shorts on YouTube I get the exact same content you're describing. None of the long videos I watch are political, yet the algo keeps throwing them at me. I get a lot of Jordan Peterson crap, or Lil Wayne explaining how there's no racism. I hate it.

The Lil Wayne stuff is so strange. Usually when some specific celebrity pops up in my feed I assume their publicist is rehabbing their image after some public incident or recently exposed private conflict. YouTube Shorts isn't like that.

YouTube Shorts seems to be exclusively a pipeline to right-wing talking points and the unfunniest parts of stand-up comedy, framed to serve the same right-wing pipeline.

[–] HubbleST@lemm.ee 15 points 1 year ago

One thing I noticed about browsing the youtube homepage on PC is if your mouse hovers over a video, it starts playing, and that puts it in your watch history. So you might be accidentally adding a trash video it recommended to your watch history while looking at the other offerings. You can disable the "mouse hover auto play" by clicking your profile pic in the top right > settings > playback and performance > inline playback.

[–] donut4ever@sh.itjust.works 14 points 1 year ago (1 children)

You've probably watched the other side quite often, so YouTube is recommending this to enrage you and keep you engaged. Just good old rage baiting.

[–] buckykat@lemmy.fmhy.ml 13 points 1 year ago (2 children)

three dots -> don't recommend channel

use it on anything even a little sus

[–] nekat_emanresu@lemmy.ml 12 points 1 year ago (7 children)

Conservatives and fascists are the same group, so I'll refer to them as fascists.

You are talking about one of the core criticisms of corporate secret algorithms that determine what to influence you with. Fascism creeps into everyone's world view when you use standard social media, and the average person wouldn't have the slightest idea. Certain topics are more adjacent to fascist content, like philosophy, psychology, guns, and comedy. If you think about what fascists enjoy, or what they need to slander, then what I said makes more sense.

Jordan Peterson does a lot of videos around psych/philosophy to redirect curious people to false answers that are close to true but more agreeable to fascists. An example of a psychological co-option is "mass psychosis" being co-opted into "mass formation psychosis" by fascists. Mass psychosis explains too many true things, whereas mass formation psychosis redirects people in a more palatable direction for them.

This is why I want to be nowhere near corporate media if possible. If you delete your cookies (or private-browse for the same effect), YouTube will promote the things most adjacent to what you watch, like old YouTube used to, although it'll still promote fascism when it's directly adjacent. With cookies, though, they have an excuse to let questionable content linger statistically too often.


A while back the Conservative Party of Canada was caught inserting MGTOW / Ben Shapiro tags on all their YouTube uploads. In other words, they were poisoning people's social graph to cause exactly what you're talking about.

This is why I use a Google account that is only for Youtube entertainment. I keep it on a separate chromium profile. I turn on all the privacy toggles in the Google account. Only Youtube history is turned on. I curate the watch history.

You cannot tell what content might have breadcrumbs that eventually open the floodgates of far right echo chambers. They do this intentionally. So it requires active measures on your part to counter them. You've got to manage your account with intention. I do not use that account at all for random browsing. I usually do that in incognito on a different browser.

[–] TwoGems@lemmy.world 10 points 1 year ago* (last edited 1 year ago) (2 children)

Because the algorithms favor alt-right garbage heaps and the companies will never bother to fix them. Which is why we need some regulation.

[–] AlexWIWA@lemmy.ml 10 points 1 year ago

If you watch any kind of gaming videos, and haven't trained your algorithm, then you'll get flooded with this shit

[–] PerogiBoi@lemmy.ca 9 points 1 year ago

Those types of videos have the most engagement. YouTube is trying to show you whatever it thinks will keep you there longer.

Turns out conservative radicalization keeps people there longer. I've never googled or watched any Andrew Tate videos, but my recommendations have at least 3 of his videos front and center.

[–] bownt@lemmy.ml 9 points 1 year ago

you are a closet conservative. the algorithm has spoken.

[–] EtnaAtsume@lemmy.ml 8 points 1 year ago

"Hi, I'm a guy in early thirties--"

Well there's your answer.

[–] WndyLady@lemm.ee 8 points 1 year ago (1 children)

I very rarely use YouTube, but I've noticed that when I start listening to older top 40 or contemporary bluegrass, those videos drop in

[–] RickyRigatoni@lemmy.ml 23 points 1 year ago

Youtube really said "you're old, you must be racist" lmao

[–] macwinux@lemmy.ml 8 points 1 year ago

I bought a new phone a few months back, and just for shits and giggles I tried looking at YouTube Shorts without any account logged in. I clicked on a food Short, and within 5 Shorts I got a "manosphere" video, and within 10 Shorts I got Ben Shapiro.

Unfortunately, fear and rage drive engagement, and these conservative grifters are more than happy to supply it and then some. That's on top of corporations being capitalist, and thus right-wing by default.

[–] ChaoticEntropy@feddit.uk 7 points 1 year ago* (last edited 1 year ago)

I almost never allow it

The times you do allow it are all the algorithm cares about, sadly. Any kind of engagement is great for companies.

"Hate Rogan? Cool, watch some Rogan as hateporn, hate watching is still watching."

[–] nom_nom@lemmy.ml 7 points 1 year ago* (last edited 1 year ago)

A few years ago I downloaded a browser extension that stops showing me recommended videos, both on the homepage and beside videos. I can only watch what I'm subscribed to and what I search for. You'd think it's a big sacrifice because you can't discover as many videos, but in reality I've gained back so much of my time and control over what I actually want to watch.

[–] ezmack@lemmy.ml 6 points 1 year ago

Hi, I am a guy in my early thirties with a wife and two kids, and whenever I go on YouTube it always suggests conservative things like guys dunking on women

Yeah, it really wants me to watch Tim Pool. It's like c'mon guys, I'm too old for that guy's tastes. TikTok will at least take the hint.

metallurgy, machining, blacksmithing

That could be it. I don't know YouTube's algorithm, but typically they work by finding which other users watch the videos you watch and recommending the other videos those people also watched. I wouldn't be surprised if the guys watching blacksmithing videos also tend to watch Joe Rogan and the like.
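That co-watching idea is simple enough to sketch. This is a toy item-based recommender, not YouTube's system; the usernames and video IDs are all made up:

```python
from collections import Counter

# Hypothetical watch histories: who watched which videos.
histories = {
    "u1": {"blacksmithing_101", "forge_tour", "rogan_ep"},
    "u2": {"blacksmithing_101", "rogan_ep"},
    "u3": {"blacksmithing_101", "speedrun"},
}

def recommend(watched, histories, k=2):
    """Score unseen videos by how often they co-occur with ones you watched."""
    scores = Counter()
    for other in histories.values():
        if other & watched:  # this user overlaps with you
            for video in other - watched:
                scores[video] += 1
    return [video for video, _ in scores.most_common(k)]

# Watch one blacksmithing video and the co-occurrence counts
# immediately surface "rogan_ep", because two of the three
# blacksmithing watchers also watched it.
print(recommend({"blacksmithing_101"}, histories))
```

The point of the toy: nothing about the content matters, only who else watched it, so one demographic's side interests leak into everyone else's feed.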

[–] PowerCrazy@lemmy.ml 6 points 1 year ago

The algorithm wants engagement first and foremost (positive vs negative is irrelevant), after that it wants to push view points that preserve the status quo since change is scary to shareholders. So of course capitalist/fascist propaganda is preferred especially if the host is wrong about basic facts (being wrong drives engagement.)

[–] foggy@lemmy.world 5 points 1 year ago* (last edited 1 year ago) (1 children)

It sees that you like standup comedy (POTENTIALLY ANTI-WOKE) and gaming (POTENTIALLY INCEL) and are in your 30s (LIKELY HAS SOME DISPOSABLE INCOME).

So it's pipelining you to the annals of YouTube that check those boxes, have very good viewer retention and have good user engagement.

The ones with good user engagement are the ones that raise your blood pressure. They make you angry. So, politics, or dirty cops, interrogations, murder stories, stuff like that.

You can resist them all you want, but stuff that makes people angry and chatty and commenty and re-watchy is like cocaine to the algorithm. It makes them money, so they spam it everywhere it even remotely makes sense to try to get you stuck in their quicksand.
