this post was submitted on 12 May 2024
158 points (100.0% liked)

Technology


Archived link.

On Jan. 6, 2021, QAnon conspiracy theorists played a significant role in inciting Donald Trump supporters to storm the Capitol building in D.C., hoping to overturn the 2020 election in favor of Trump.

Days later, Twitter suspended tens of thousands of QAnon accounts, effectively banning most users who promote the far-right conspiracy theory.

Now, a new study from NewsGuard has found that since Elon Musk acquired the platform, QAnon has had a resurgence on X, formerly Twitter, over the past year.

QAnon grows on X

Tracking commonly used QAnon phrases like "QSentMe," "TheGreatAwakening," and "WWG1WGA" (which stands for "Where We Go One, We Go All"), NewsGuard found that mentions of these QAnon-related slogans and hashtags have increased by a whopping 1,283 percent on X under Musk.

From May 1, 2023, to May 1, 2024, there were 1.12 million mentions of these QAnon supporter phrases on X. That is a huge uptick from the 81,100 mentions recorded one year earlier, from May 1, 2022, to May 1, 2023.
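As a rough sanity check (assuming the rounded counts quoted above are close to the exact figures NewsGuard used, which is an assumption), the year-over-year change works out to roughly the percentage reported:

```python
# Rough check of the reported year-over-year increase in QAnon-phrase mentions.
# The counts below are the rounded figures quoted in the article, not NewsGuard's raw data.
mentions_2023_2024 = 1_120_000  # May 1, 2023 - May 1, 2024
mentions_2022_2023 = 81_100     # May 1, 2022 - May 1, 2023

increase_pct = (mentions_2023_2024 - mentions_2022_2023) / mentions_2022_2023 * 100
print(f"Increase: {increase_pct:.0f}%")  # roughly 1281%, in line with the cited ~1,283%
```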

One of the most viral QAnon-related posts of the year, promoting the "Frazzledrip" conspiracy theory, has received more than 21.8 million views, according to the report. Most concerning, however, is that it was posted by a right-wing influencer who has received direct support from Musk.

The Jan. 2024 tweet was posted by @dom_lucre, a user with more than 1.2 million followers who commonly posts far-right conspiracy theories. In July 2023, @dom_lucre was suspended on then-Twitter. Responding to @dom_lucre's supporters, Musk shared at the time that @dom_lucre was "suspended for posting child exploitation pictures."

Sharing child sexual abuse material (CSAM) would result in a permanent ban on most platforms. However, Musk personally intervened in favor of @dom_lucre and reinstated his account.

Since then, @dom_lucre has posted about how he earns thousands of dollars directly from X. The company allows him to monetize his conspiratorial posts via the platform's official creator monetization program.

Musk has also previously voiced his support for Jacob Chansley, a QAnon follower known as the "QAnon Shaman," who served prison time for his role in the Jan. 6 riot at the Capitol.

The dangers of QAnon

QAnon's adherents follow a number of far-right conspiracy theories, but broadly (and falsely) believe that former President Trump has been secretly battling a global cabal of Satanic, baby-eating traffickers made up primarily of Democratic Party politicians and Hollywood elites.

Unfortunately, these beliefs have too often turned deadly. Numerous QAnon followers have been involved in killings fueled by their beliefs. In 2022, one Michigan man killed his wife before being fatally shot in a standoff with police; his daughter said her father spiraled out of control as he fell into QAnon conspiracies. In 2021, another QAnon conspiracy theorist killed his two young children, claiming that his wife had "Serpent DNA" and his children were monsters.

Of course, QAnon never completely disappeared from social media platforms. Its followers still espoused their beliefs over the past few years, albeit in a more coded manner, to circumvent social media platforms' policies. Now, though, QAnon believers are once again being more open about their radical theories.

The looming November 2024 presidential election likely plays a role in the sudden resurgence of QAnon on X, as QAnon-believing Trump supporters look to help their chosen candidate. However, Musk and X have actively welcomed these users to their social media service, eagerly providing them with a platform to spread their dangerous falsehoods.

top 34 comments
[–] mozz@mbin.grits.dev 36 points 6 months ago (2 children)

I think a large part of this is that X is the only major social media which has no dedicated team for detecting and banning the propaganda bots / troll farms.

I have no idea how much of the Q / antivax / conspiracy material on social media is deliberate campaigns to destabilize American politics in general (as opposed to perfectly organic homegrown nuttiness which the US has always had plenty of anyway), but I know it's not 0.

[–] jarfil@beehaw.org 22 points 6 months ago (1 children)

It's not just to destabilize "American" politics; it's a series of worldwide campaigns to destabilize all information flow, to sow doubt and confusion among everyone, and then out of the blue present an aligned front to push a certain narrative.

If people are kept in a "flux state of distrust", they're easier to convince when suddenly a bunch of their sources agree on some point, "it must be true if conflicting sources suddenly say the same".

[–] technocrit@lemmy.dbzer0.com 5 points 6 months ago* (last edited 6 months ago) (1 children)

then out of the blue present an aligned front to push a certain narrative.

This is a good point. I see this a lot with Ukraine. There are many famous shills (e.g. Max Blumenthal) who have been promoting the fascist invasion of Ukraine. Now these same shills are supporting Palestine. This would be good, except they are just using the issue to lure people in. Then, once people are hooked on all these shady accounts, they start talking about how Ukrainians are Nazis, how Stalin was awesome, etc. It's so transparent but so dangerous. I imagine this happens on many fronts.

edit: Just remembered these podcasts about this: Part 2 and Part 3

[–] jarfil@beehaw.org 1 points 6 months ago* (last edited 6 months ago)

The biggest problem with Ukraine... is that they aren't fully detached from Nazis:

  • During WW2, Ukraine was allied with the Nazis and fascists, helping them exterminate Poles
  • 21st-century Ukraine still uses Nazi symbology and the fascist salute, has a fascist hymn, has established national support for WW2 Nazi combatants, and even its national shield is a fascist remnant.

All of that has nothing to do with the Russian invasion... but it does give Russia's propaganda machine an awesome excuse. It's just too easy to get people hooked up with some actual facts, then get them to do a leap of faith and fall straight into full propaganda... and Russia knows it.

Israel and Palestine is a particularly juicy case, where there are really shitty groups coming from both sides, ending up like an "all you can eat" buffet for every propaganda machine out there. No matter what narrative one wants to spin, chances are they'll find a latch point in the Israel vs. Palestine conflict, even contradictory ones for different audiences.

[–] cupcakezealot@lemmy.blahaj.zone 2 points 6 months ago

also the fact that the ceo of twitter is responsible for spreading a lot of the misinformation, antivaxx, and conspiracy theory content on twitter.

[–] orca@orcas.enjoying.yachts 18 points 6 months ago (3 children)

Musk is a useful idiot. He ruined Twitter and the US government quietly thanks him for it because it no longer serves as a tool to see unfiltered events happening on the ground (like Israel murdering Palestinians). So mission accomplished there, and now the new target is TikTok.

There’s also this quote from 1981 by the CIA Director at the time, William Casey:

We'll know our disinformation program is complete when everything the American public believes is false

Source

Meta, which covers Instagram and Facebook, is already in the government’s pocket; Elon has basically erased old Twitter; and now TikTok is on the brink of being banned in the US. Pretty fun pattern of events to try and control the narrative.

[–] mozz@mbin.grits.dev 16 points 6 months ago* (last edited 6 months ago) (3 children)

Musk is a useful idiot. He ruined Twitter and the US government quietly thanks him for it because it no longer serves as a tool to see unfiltered events happening on the ground (like Israel murdering Palestinians). So mission accomplished there

Yeah. Twitter back in the day actually used to be a usable substitute for print journalism, without the editorial bias and selective coverage. If you paid attention to who to follow, you could actually get a lot better picture of the world from Twitter than from almost anywhere else.

I don't think the US government is alone in wanting that gone so they can control the narrative instead, but they're definitely one party that was happy about it.

and now the new target is TikTok

And this is where you went straight off the fuckin deep end.

I do not know a single person who gets their picture of the world from Tiktok whose viewpoint isn't reliably dogshit takes on literally every single issue. (Specific e.g. antivax, "BLM protestors are just running around beating people up, they have to be stopped," "everyone's moving out of California to Texas and Florida because Republican politics are better") Maybe there's an accidental alignment of pro-Palestine-protestors news from Tiktok right now, but it's not like that narrative is un-heard-of in any MSM news or other social media. The whole landscape at this point is Palestine flags as far as I see, and the other platforms are usually a lot more nuanced and informative.

I don't know why you'd object to an algorithm controlled by Elon Musk or the US government or just a lawful-evil alignment to sell advertising and hook people to dopamine loops and nothing else (all very good things to be suspicious of, yes), but all of a sudden when the Chinese government's involved, you're like "finally someone trustworthy to put in charge of public opinion, no way this can go wrong."

[–] jarfil@beehaw.org 7 points 6 months ago

TikTok is a weird beast. It can at the same time show the destruction in Gaza, Israeli soldiers poking fun at it, ASMR videos, mindless looping footage, fake AI idols, underage girls asking for payment from strangers, and siphon engagement data to train Chinese propaganda bots.

It's the closest thing to "shove everything into a bowl, then shake"...

[–] kbal@fedia.io 3 points 6 months ago (2 children)

Calling Tiktok "the next target" does not strike me as the ringing endorsement of its noble pursuit of accurate news reporting you seem to be taking it for.

[–] mozz@mbin.grits.dev 6 points 6 months ago (1 children)

The person I was responding to, if I've read them right, was trying to argue that Tiktok was the next target because people could get unfiltered information about the world through it. My point was that Tiktok is about the worst possible tool for getting useful unfiltered information about the world that one could possibly imagine, and then adding some context and detail to that.

[–] kbal@fedia.io 0 points 6 months ago (1 children)

I don't know, it just seemed to me they might have had in mind that whoever is trying to "control the narrative" would find competing disinformation campaigns just as unwelcome.

[–] mozz@mbin.grits.dev 6 points 6 months ago (1 children)

I have this surreal experience sometimes where I'll say something like "I don't think the US government is alone in wanting that gone so they can control the narrative instead" and then find someone lecturing me about how exactly what I just got done saying might be true.

That said, the person I was talking to was clearly implying that banning Tiktok would be a bad thing because people can get unfiltered information through it. You can try to say they were saying something else that's more sensible, if you want. I won't stop you. They don't seem to want to clarify it themselves, so it's hard to say.

[–] orca@orcas.enjoying.yachts 1 points 6 months ago

The TikTok ban is a weird gray area. I don’t like any of these tech companies or the fact that they are essentially monetizing terror, and I will never have allegiance to them or the entities that influence them to deceive and profit. But it’s also a pipeline for information either way, which is something we are lacking more and more of every day with the constant deaths of journalists and journalism outlets. People need to remember that no one is immune to propaganda, and we should take all of the information with a grain of salt.

But if we’re talking about the things happening in Gaza, any eyes we can get add value. It’s not like this is some new event that we don’t have documented history on. We’re witnessing the next stage of the genocide, and TikTok (hate it or love it) has helped expose a lot of the atrocities (in the same way that Twitter used to).

It’s really not a stretch to say that anti-imperialist governments and individuals benefit from this because it exposes the atrocities the US and Britain have been perpetuating for years. Of course that’s a benefit to them. It’s also a benefit to the people forced to live under the governments committing those atrocities.

[–] orca@orcas.enjoying.yachts 1 points 6 months ago

The thing is, it doesn’t have to lie or twist the truth. All it has to do is give people a window into what the US has been doing around the world for years. The US has done this to itself. The younger generations are onto the ruling class and their playbook is no longer working. They want to ban TikTok now because of the massive amount of unfiltered Gaza content it provides. I should note, TikTok simply promoting this stuff in its algorithm is enough (look up “Heating” in regards to TikTok).

[–] orca@orcas.enjoying.yachts 3 points 6 months ago (1 children)

It’s not about trusting TikTok. It’s about understanding that we have a terrible tech company vs. a terrible government. But at the same time, it has helped radicalize and inform so many in the ranks of Gen Z, amongst other generations. Even if you skip past the 24/7 stream of Gaza coverage (which we absolutely need, since Israel has murdered something like 121 journalists in Gaza), you’ll find Gen Z and other folks making videos that lay out documented history, protest methods, agitprop, and their own amateur journalism for others. That has value, even if the source is just in it for the notoriety and money. It’s an awkward position where it becomes a tool for the revolution, because it’s in opposition (for its own gain) but is never trustworthy.

[–] mozz@mbin.grits.dev 5 points 6 months ago (1 children)

But at the same time, it has helped radicalize and inform so many in the ranks of Gen Z, amongst other generations.

I know when I think of people I know who get most of their news from TikTok, I'm like "damn that person is super well informed and I'm always happy when I talk to them about politics and world events"

Out of all the platforms, every single other one of which including the one you're on right now and the elephant one and Usenet and ZMag and Hackernews and all the rest

I do not know which reality you inhabit where TikTok invented people knowing about Gaza, but I promise you that there are better platforms, where you're allowed to talk about drugs or alcohol or use the word "blood", or "Uyghur"

[–] orca@orcas.enjoying.yachts 3 points 6 months ago* (last edited 6 months ago) (1 children)

It didn’t invent it. It just caught on at the right time and amplified that knowledge. It’s also a network. I don’t get TikTok but I’ve seen how popular it is in the new generations. This isn’t new knowledge; it’s just new packaging for a plugged-in-since-birth generation.

I’m too old for TikTok. Old Twitter, Reddit before it went to shit, niche message boards. That’s where I used to hang. Now it’s gathering communities I like on my own Lemmy server.

[–] mozz@mbin.grits.dev 2 points 6 months ago (1 children)

Random question, what's your opinion on the Uyghur re-education camps? Or the treatment of the Hong Kong protestors and how it compares with the treatment of US protestors of aid to Israel?

[–] orca@orcas.enjoying.yachts 1 points 6 months ago* (last edited 6 months ago) (1 children)

I have a feeling these questions are trying to feel out whether or not I’m a fan of the authoritarian flavor of communism, and that’s a nope lol. I’m not on Lemmygrad or Hexbear. I don’t like police regardless of the continent or any other factor. We are seeing power vacuums in two very different types of authoritarian structures, and neither of them pans out to anything good for the working class.

I keep it simple: it’s always the working class vs the ruling class.

That said, refreshing myself on the Uyghur internment camps and then comparing them to the recently uncovered concentration camps that Israel has in the desert, and seeing the similarities, is unnerving. One exists to allegedly indoctrinate a population, while the other exists to exterminate it. There is a Venn diagram here somewhere.

[–] mozz@mbin.grits.dev 1 points 6 months ago (1 children)

I didn't ask about how you felt about Israel's genocide. I'm assuming, based on what you already said, that you're against it. So am I.

If you had to narrow down your feelings on the Uyghur internment camps to one of three responses, would it be:

  • I'm against them
  • I'm for them
  • It's more complicated than that

And, the same question for the police response in Hong Kong.

[–] orca@orcas.enjoying.yachts 1 points 6 months ago (1 children)

I’m against both of them. It’s imprisonment, torture and forced assimilation. It should be an easy decision for anyone that is also against what the Nazis did and Israel is doing.

[–] mozz@mbin.grits.dev 1 points 6 months ago* (last edited 6 months ago) (1 children)

Okay, just curious.

I mean, yes, it should be a shockingly easy question to answer and I'm happy that you're against them. I've just gotten in the habit of asking people who display one view that's surprising to me if they hold other surprising views which might not appear initially to be correlated. Most of the time, they do, which is a very interesting result to me.

[–] orca@orcas.enjoying.yachts 1 points 6 months ago (1 children)

As soon as I smell authoritarianism, it’s a no from me. Even as someone that reads communist and socialist theory. I think people take too much of the past and try to apply it as-is to the world of today. But it doesn’t work. Modern times require modernized ideas, and I sometimes wish people had more imagination.

[–] mozz@mbin.grits.dev 1 points 6 months ago (1 children)

Every government in the world has good and bad in it, because every government is made of people. Different ones have different amounts; it's not like every country's government is the same or has equal good/evil levels. But sometimes people take it to the point of classifying "good ones" and "bad ones" and handwaving away the bad things that the "good ones" are doing. To me, that's not really a safe or sensible way to look at things. It's just not how things work. In my opinion.

[–] orca@orcas.enjoying.yachts 1 points 6 months ago

The “bad” ones just float to the top when a government is around long enough. It’s human nature. The ones seeking power and money see others seeking the same, and they look at them as a rung on a ladder. They help each other because it benefits them. They actively work to shut out the “good” ones because they know they’ll lose their spot to them if they aren’t careful. Suddenly the “good” ones are lost in the noise. The incorruptible become powerless.

It’s similar to the police force in the US, for example. People go in with good intentions, but the force has been around long enough that the mad dogs have already permeated all of the top ranks. So the “good” ones either wash out, or they assimilate. One of my relatives washed out, while the one that had been there much longer turned into a massive racist. Which one do you think climbed the ranks quicker?

Power structures automatically attract selfish, self-serving people that just become worse offenders the more wealthy and powerful they become. Power is a vacuum after all, and there are plenty of bad people standing in line to fill it. This is why I don’t trust a single politician. Whether they want to admit it or not, under that facade of wanting to change their city or “do good,” there is some underlying desire for power and attention. That doesn’t mean none of them do anything good in their time though; it just means we should be cautious.

[–] PaddleMaster@beehaw.org 5 points 6 months ago* (last edited 6 months ago)

And most newspapers were acquired by the same handful of media companies. In turn, these companies ravaged local markets, and now there’s just no coverage of the actual truth, even on local happenings.

There’s an article about my hometown by the NY Times or something (I forget, it’s been a few years). We had a flourishing newspaper that employed a decent number of people in the community; by the time that article came out (2010-ish), the same company had 3 reporters and 5 staff. The newspaper would cover legitimate issues locally and nationally. They had amazing journalists who promoted great things happening too (local studies, nonprofits doing the hard work to benefit the community, etc.). Basically, the boring stuff that isn’t flashy enough for social media. And now it’s all gone.

I legitimately have a difficult time finding news stories on any platform that I can trust.

Edit: I just read this, different angle to the same problem https://web.archive.org/web/20240512160438mp_/https://www.theatlantic.com/magazine/archive/2024/06/china-russia-republican-party-relations/678271/

[–] ASaltPepper@lemmy.one 3 points 6 months ago* (last edited 6 months ago) (1 children)

I don't know about not seeing IDF forces killing Palestinians. I'd argue the two places to see that are Twitter and Telegram. At least in terms of lack of censorship and ease of locating.

So much so that the Washington Post wrote about the two platforms carrying the most gruesome images here.

[–] orca@orcas.enjoying.yachts 2 points 6 months ago

Oh shit, I forgot all about Telegram. Bluesky is in the ring now too, but I don’t really know much about it.

[–] Introversion@kbin.social 10 points 6 months ago
[–] intensely_human@lemm.ee 1 points 6 months ago (2 children)

Given I’ve been described as a right-wing conspiracy theorist for saying that capitalist countries experience less starvation than socialist ones, I’m going to have to take this assessment with a grain of salt.

Maybe it’s more like right wingers in general are coming back in droves after finding an online community that won’t ban them for their political affiliation.

[–] BarryZuckerkorn@beehaw.org 3 points 6 months ago (1 children)

Given I’ve been described as a right-wing conspiracy theorist for saying that capitalist countries experience less starvation than socialist ones, I’m going to have to take this assessment with a grain of salt.

That's not the methodology used, unless your description of starvation literally includes QAnon hashtags:

Tracking commonly used QAnon phrases like "QSentMe," "TheGreatAwakening," and "WWG1WGA" (which stands for "Where We Go One, We Go All"), NewsGuard found that mentions of these QAnon-related slogans and hashtags have increased by a whopping 1,283 percent on X under Musk.

And if not, then I'm not sure what your observations add to the discussion.

[–] ebu@awful.systems 3 points 6 months ago* (last edited 6 months ago)

"i reflexively identify with the openly-fascist right-wing base that has found its home on elon's twitter, and since i'm a reasonable person, the evidence that they're flagrantly conspiracy-minded and/or are CSAM posters simply must be fabricated"

[–] darkphotonstudio@beehaw.org 3 points 6 months ago

Yeah, it couldn’t possibly be because they’re a bunch of bigots in a delusional cult violating TOS agreements and generally being shitty people.

[–] onlinepersona@programming.dev 1 points 6 months ago

And people that don't belong to that group still use Twitter. Apparently people like living in a sump.

Anti Commercial-AI license