this post was submitted on 13 Jul 2025
94 points (99.0% liked)

Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

founded 1 year ago
[–] SaneMartigan@aussie.zone 6 points 1 day ago (1 children)

I was a physiotherapist, and the AI recommendations for physical/mechanical health feel like someone grabbed a diagnosis from a lucky dip of options. It sounds very professional but doesn't specifically diagnose the client's issues.

[–] hendrik@palaver.p3x.de 4 points 23 hours ago

From what I gather about current chatbots, they always sound very eloquent. They're made that way by all the textbooks and Wikipedia articles that went into them. But they're not necessarily made to do therapy. ChatGPT etc. are general purpose and meant for a wide variety of tasks, and the study talks about current LLMs. So it'd be like a layman with access to lots of medical books who picks something that sounds very professional. But they wouldn't do what an expert does: follow an accepted (and tedious) procedure, do tests, examine, diagnose and so on. An AI chatbot (most of the time) gives an answer anyway, so it could be a dice roll and then the "solution". And it's not clear whether they have any real understanding of anything.

What makes me a bit wary is that AI tends to be full of stereotypes and bias, it's often overly agreeable, and it praises me a lot for my math genius when I discuss computer programming questions with it. Things like that would certainly feel nice if you're looking for help, have a mental health issue, or are looking for reaffirmation. But I don't think those are good "personality traits" for a therapist.