You most likely viewed these types of videos a few times and the algorithm started recommending them to you. It only takes a couple of videos for the algorithm to start recommending videos on the same topic. You can easily fix this by clicking the three dots next to a video and selecting "Not interested"; do it enough times and they'll be gone from your feed.
Asklemmy
A loosely moderated place to ask open-ended questions
This, the algorithm doesn't care whether you ~~like~~ enjoy it or not, it cares whether you engage with it or not. Even dislikes are engagement.
I'm not talking about liking the video, I'm talking about viewing the video. Viewing the video, especially a large part of it, will contribute to the watch time, therefore affecting the algorithm's recommendations.
I meant "like" as in "enjoy", not as in clicking the thumbs-up button, sorry for the confusion. Even a video that pisses you off, that you thumb down along with every comment, is considered engagement; it not only feeds you more but also boosts the video as a whole.
All it cares about is whether it keeps you on YouTube. It's like news: outrage is profitable even if everyone hates it.
I am constantly bombarded with Jordan Peterson videos despite disliking them and telling the algorithm to show less like this.
I'm not sure how it profiles people, but it sucks.
Disliking is engagement. Instead tap the three dots and select "not interested"
If I accidentally watch a Linus Tech Tips video, that will be all it recommends me for the next month.
I watched a Some More News video criticizing Jordan Peterson, and Google thought "did I hear Jordan Peterson? Well in that case, here's 5 of his videos!"
Almost all content algorithms are hot garbage that are not interested in serving you what you want, just what makes money. It always ends up serving right wing nut jobs because conspiracy theorists watch a lot of scam videos.
Edit: my little jab at Linus has nothing to do with politics. I have no idea what his views are. I only mentioned it to point out how YouTube will annihilate my recommendations if I watch a single one of his videos.
Because 'conservative' content gets a lot of engagement (i.e. ad money). The more they recommend it, the bigger the audience, the bigger the ad payout. They're literally monetizing hate.
Maybe you have the same problem I have: my wife is still a republican. When that kind of stuff shows up, I know she has been watching it on the family PC. She's not that tech savvy, so I usually go in later and block or limit some of it. It's a pain to fight the algorithms.
Machining and blacksmithing are highly correlated with right wing BS in the US. Check my uname and ask me how I know.
My interests are gaming, standup comedy, memes, react videos, metallurgy, machining, blacksmithing.
Because a lot of the type of guys who like seeing those stupid conservative videos also like many of the same things as you. Gaming^(TM) is well known to have a problem with the alt-right, react videos have a very similar structure to conservative "libs destroyed with facts and logic" types of videos, and finally a lot of conservatives like to think of themselves as an old-fashioned man's man, so they enjoy things like metalwork and other typical "manly" careers.
Conservative propaganda is a highly coordinated and well-funded network. They pay for preference.
Your interests have a strong correlation with people on the right aside from maybe react videos.
But even if your interests were not so strongly correlated with the right, you would probably still get right wing ads or videos suggested. They garner the highest engagement because they are often outrage porn. Google gets their money that way. My subscriptions are to left wing political channels, science, and solar channels, but I still get a decent amount of PragerU and Matt Walsh ads. Reporting them does not stop them from popping up either.
Yeah big tech loves to throw dumb stuff your way to piss you off and keep you engaged, even if you've never shown an interest before.
For me, if I ever look up Warhammer 40k it immediately starts sending me losers like The Quartering or other channels like him.
Like, no YouTube, I don't want to hear about how feminism and "wokes" are ruining Warhammer. I also don't want to be sent Sargon of Carl videos. Ugh. Lmao
Not to be that guy, but you can tell YouTube "don't recommend this channel again." I haven't seen The Quartering, Asmongold, etc. for years now, unless I search for them.
I had the issue that I got Andrew Tate shit recommended. I said don't recommend that, and blocked the uploader. YouTube still suggested that video to me. Exactly that video.
Algorithms. It's why I use NewPipe on mobile and FreeTube on desktop. Both let me import and export my subscription list, block ads, and support SponsorBlock.
But most importantly, neither uses Google's algorithm to recommend me videos that might be radical in nature based on a misinterpretation of my past viewing history.
Right-wing media is well-funded.
It's because you are a guy in your 30s. You (and I) are in a shitty demographic, so the algorithm peddles us shit. It doesn't matter how many times I mark shitheel vids by the people you mentioned as "do not recommend channel", I will always have Rogan or Peterson popping up in my feed to ruin my day. It drives me crazy.
I've been vegetarian since 2009 and in the past 2-3 years I keep getting all kinds of bow hunting videos.
It's particularly amusing when my plan was to watch an old episode of "The Golden Girls."
They REALLY misread my demographic.
I'm 30 and have a small family, too. When I watch shorts on YouTube I get the exact same content you're describing. None of the long videos I'm watching are political, yet the Algo keeps throwing them at me. I get a lot of Jordan Peterson crap or lil Wayne explaining how there's no racism. I hate it.
The lil Wayne stuff is so strange. Usually when some specific celebrity pops up in my feed I assume their publicist is rehabbing their image after some public incident or recently exposed private conflict. YouTube shorts isn't like that.
YouTube shorts seems to be exclusively just a pipeline to rightwing talking points and the unfunniest parts of stand up comedy framed to serve the same rightwing pipeline.
One thing I noticed about browsing the youtube homepage on PC is if your mouse hovers over a video, it starts playing, and that puts it in your watch history. So you might be accidentally adding a trash video it recommended to your watch history while looking at the other offerings. You can disable the "mouse hover auto play" by clicking your profile pic in the top right > settings > playback and performance > inline playback.
You've probably watched the other side quite often, so YouTube is recommending this for ya to enrage you and keep you engaged. Just good old rage baiting.
three dots -> don't recommend channel
use it on anything even a little sus
Conservatives and fascists are the same group, so I'll refer to them as fascists.
You are describing one of the core criticisms of secret corporate algorithms that determine what to influence you with. Fascism creeps into everyone's world view when you use standard social media, and the average person wouldn't have the slightest idea. Certain topics are more closely tied to fascist content, like philosophy, psychology, guns, and comedy. If you think about what fascists enjoy, or what they need to slander, then what I said makes more sense.
Jordan Peterson does a lot of vids around psych/philosophy to redirect curious people to false answers that are close to true but more agreeable for fascists. An example of a psychological co-option is "mass psychosis" being co-opted into "mass formation psychosis" by fascists. Mass psychosis explains too many true things, whereas mass formation psychosis redirects people in a more palatable direction for them.
This is why I want to be nowhere near corporate media if possible. If you delete your cookies (or private browse for the same effect), YouTube will promote the things most adjacent to what you watch, like old YouTube used to, although it'll still promote fascism when it's directly adjacent. With cookies, though, they have an excuse to let questionable content linger statistically too often.
A while back the conservative party of Canada was caught inserting MGTOW / Ben Shapiro tags on all their Youtube uploads. In other words they were poisoning peoples social graph in order to cause exactly what you're talking about.
This is why I use a Google account that is only for Youtube entertainment. I keep it on a separate chromium profile. I turn on all the privacy toggles in the Google account. Only Youtube history is turned on. I curate the watch history.
You cannot tell what content might have breadcrumbs that eventually open the floodgates of far right echo chambers. They do this intentionally. So it requires active measures on your part to counter them. You've got to manage your account with intention. I do not use that account at all for random browsing. I usually do that in incognito on a different browser.
Because the algorithms favor alt right garbage heaps and the companies will never bother to fix them. Hence why we need some regulations.
If you watch any kind of gaming videos, and haven't trained your algorithm, then you'll get flooded with this shit
Those types of videos have the most engagement. YouTube is trying to show you whatever it thinks will keep you there longer.
Turns out conservative radicalization keeps people there longer. I've never googled or watched any Andrew Tate videos, but my recommendations have at least 3 of his videos front and center.
you are a closet conservative. the algorithm has spoken.
"Hi, I'm a guy in early thirties--"
Well there's your answer.
I very rarely use YouTube, but I've noticed that when I start listening to older top 40 or contemporary bluegrass, those videos drop in
Youtube really said "you're old, you must be racist" lmao
I bought a new phone a few months back, and just for shits and giggles I tried looking at YouTube Shorts without any account logged in. I clicked on a food Shorts, and within 5 Shorts, I got a "manosphere" video, and within 10 Shorts I got Ben Shapiro.
Unfortunately, fear and rage drive engagement, and these Conservative grifters are more than happy to give it to them and more. That on top of corporations being Capitalists and thus right-wing by default.
I almost never allow it
The times you do allow it are all the algorithm cares about, sadly. Any kind of engagement is great for companies.
"Hate Rogan? Cool, watch some Rogan as hateporn, hate watching is still watching."
A few years ago I installed a browser extension that stops showing me recommended videos, both on the homepage and on the side of videos. I can only watch what I'm subscribed to and what I search for. You'd think it's a big sacrifice because you can't discover as many videos, but in reality I've gained so much of my time back, and control over what I actually want to watch.
Hi, I am a guy in early thirties with a wife and two kids and whenever I go on youtube it always suggests conservative things like guys dunking on women
Yeah, it really wants me to watch Tim Pool. It's like, c'mon guys, I'm too old for that guy's tastes. TikTok will at least take the hint.
metallurgy, machining, blacksmithing
That could be it. I don't know YouTube's algorithm, but typically they work by finding what other users watch the videos you watch and recommending you other videos those people also watched. I wouldn't be surprised if the guys watching blacksmithing videos also tend to watch Joe Rogan and the like.
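That "people who watched what you watched also watched X" idea can be sketched as a toy collaborative filter. To be clear, this is a made-up illustration of the general technique, not YouTube's actual system; the user names, video titles, and scoring rule are all invented:

```python
# Toy user-based collaborative filtering: score unseen videos by how
# much the users who watched them overlap with your own watch history.
# All data below is hypothetical.
from collections import Counter

histories = {
    "you":    {"blacksmithing_101", "forge_build"},
    "user_a": {"blacksmithing_101", "forge_build", "talk_show_rant"},
    "user_b": {"forge_build", "talk_show_rant"},
    "user_c": {"cooking_basics"},
}

def recommend(user, histories, k=3):
    """Recommend videos watched by users whose history overlaps yours."""
    seen = histories[user]
    scores = Counter()
    for other, vids in histories.items():
        if other == user:
            continue
        overlap = len(seen & vids)  # crude similarity: shared watches
        if overlap == 0:
            continue
        for v in vids - seen:
            scores[v] += overlap    # weight candidates by similarity
    return [v for v, _ in scores.most_common(k)]

print(recommend("you", histories))  # → ['talk_show_rant']
```

Notice the blacksmithing fan gets the rant video recommended purely because other blacksmithing watchers also watched it; nothing in the model asks whether you'd actually want it.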
The algorithm wants engagement first and foremost (positive vs negative is irrelevant); after that it wants to push viewpoints that preserve the status quo, since change is scary to shareholders. So of course capitalist/fascist propaganda is preferred, especially if the host is wrong about basic facts (being wrong drives engagement).
It sees you like standup comedy (POTENTIALLY ANTI-WOKE) and gaming (POTENTIALLY INCEL) and in your 30s (LIKELY HAS SOME DISPOSABLE INCOME)
So it's pipelining you to the annals of YouTube that check those boxes, have very good viewer retention and have good user engagement.
The ones with good user engagement are the ones that raise your blood pressure. They make you angry. So, politics, or dirty cops, interrogations, murder stories, stuff like that.
You can resist them all you want, but stuff that makes people angry and chatty and commenty and re-watchy is like cocaine to the algorithm. It makes them money, so they spam it everywhere it even remotely makes sense to try to get you stuck in their quicksand.
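The engagement-first ranking this thread keeps describing boils down to something like the sketch below. Again, this is a toy model under invented assumptions (the titles, the predicted numbers, and the scoring weights are all made up), not the real recommender:

```python
# Toy "engagement-first" ranking: candidates are ordered purely by
# predicted watch time and comment activity. Nothing in the score
# distinguishes enjoying a video from hate-watching it.
candidates = [
    # (title, predicted_watch_minutes, predicted_comments)
    ("calm woodworking tutorial", 4.0, 0.1),
    ("outrage political rant",    9.5, 3.2),
    ("standup clip",              3.0, 0.4),
]

def engagement_score(watch_minutes, comments, comment_weight=2.0):
    """Anger-watching and angry commenting count the same as enjoying."""
    return watch_minutes + comment_weight * comments

ranked = sorted(candidates,
                key=lambda c: engagement_score(c[1], c[2]),
                reverse=True)
print([title for title, *_ in ranked])
# the rant ranks first: it keeps people watching and commenting
```

Under any scoring rule like this, outrage content that people watch to the end and argue about in the comments beats content they merely enjoy, which is exactly the quicksand effect described above.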