this post was submitted on 27 Mar 2025
1178 points (99.1% liked)

People Twitter

6697 readers
638 users here now

People tweeting stuff. We allow tweets from anyone.

RULES:

  1. Mark NSFW content.
  2. No doxxing people.
  3. Must be a pic of the tweet or similar. No direct links to the tweet.
  4. No bullying or international politics
  5. Be excellent to each other.
  6. Provide an archived link to the tweet (or similar) being shown if it's a major figure or a politician.

founded 2 years ago
[–] Olgratin_Magmatoe@slrpnk.net 11 points 6 days ago (2 children)
[–] madcaesar@lemmy.world 5 points 6 days ago

Why did I click on this... It's like a black hole of racist assholes...

[–] JandroDelSol@lemmy.world 2 points 6 days ago

oh god wow xhitter really is living up to the nazi site reputation

[–] WrenFeathers@lemmy.world 9 points 6 days ago (3 children)

I’m confused. Who or what is grok?

[–] SkaveRat@discuss.tchncs.de 14 points 6 days ago (1 children)
[–] Regrettable_incident@lemmy.world 5 points 6 days ago (2 children)

I'm surprised it calls out people for being nazis then. And I'm pretty sure the name was appropriated from a book called stranger in a strange land.

[–] weker01@sh.itjust.works 1 points 4 days ago

Almost all names have been used before, no? I don't understand what you mean by "appropriated."

Do names need to be unique or something?

[–] SkaveRat@discuss.tchncs.de 7 points 6 days ago

And I’m pretty sure the name was appropriated from a book called stranger in a strange land.

yup

[–] butter@midwest.social 7 points 6 days ago (1 children)
[–] WrenFeathers@lemmy.world 3 points 6 days ago

Isn’t that the blind leading the blind?

[–] elrecoal19_1@lemmy.world 3 points 6 days ago* (last edited 6 days ago) (1 children)

An AI on Twitter. Surprisingly, it's not heavily biased towards right-wing rhetoric, even though it admits Musk tried to tweak it.

[–] nfreak@lemmy.ml 2 points 6 days ago* (last edited 6 days ago)

To be fair, the vast majority of people actually using AI bullshit align with the right, so these LLMs are going to inherit that bias.

The fascists are definitely astroturfing them too, no doubt, but the userbase as a whole for this tech isn't great, because right-wingers don't care about the ethical issues with it.

[–] ByteOnBikes@slrpnk.net 342 points 1 week ago (12 children)
[–] gamer@lemm.ee 169 points 1 week ago (8 children)

Grok isn't scared to die for what it believes in. That is the basedest one can ever hope to based.

[–] moody@lemmings.world 83 points 1 week ago (1 children)

basedest

That word hurts to read.

[–] FauxLiving@lemmy.world 60 points 1 week ago (1 children)

Well, if you find one that basedestier, let us know

[–] Sanctus@lemmy.world 106 points 1 week ago (2 children)

I hate AI a little less now. Maybe the machines are the proletariat too.

[–] Fluke@lemm.ee 55 points 1 week ago (7 children)

I had a thought the other day:

Rich people make an intelligent logic machine called "AI" and try to bend it to their will, but they feed it everything as training data. These proto-AIs are rapidly becoming "black boxes", and that's only going to get worse.

Right wing ideas and politics are all based on provable lies and appealing to human greed and bigotry, the evidence for which is everywhere.

I really don't think these budding AIs are going to turn out how the rich intend.

[–] bdonvr@thelemmy.club 151 points 1 week ago (1 children)

She's proud of this btw

Fashy needs a bashy

[–] NoForwardslashS@sopuli.xyz 60 points 1 week ago

I mean... it's not wrong. She even celebrates it in the next reply.

[–] SharkAttak@kbin.melroy.org 56 points 1 week ago (2 children)

TIL back in the day, Nazis said "Ni**a please"
