> I have seen more than once people using ChatGPT as a psychologist and psychiatrist and self-diagnosing mental illnesses.
I'm working with a company that writes AI tools for mental health professionals. The output of the models _never_ goes to the patient.
I've heard quite a few mouthfuls of how harmful LLM advice is for mental health patients - with specific examples. I'm talking on the order of suicide attempts (and at least one success) and other harmful effects to the patient and their surroundings as well. If you know of someone using an LLM for this purpose, please encourage them to seek professional help. Or just point them at this comment - anybody is encouraged to contact me. I'm not a mental health practitioner, but I work with them and I'll do my best to answer their questions or point them to someone who can. My Gmail username is the same as my HN username.
Professionals have confirmation bias too, and the LLM will hook into it regardless of whether the user is a patient or a professional. It's so good at this that even the "training" the professional received can be subtly bypassed.
Basically there needs to be an LLM that can push back.
I'd love to hear more about how you see it. You're absolutely not wrong. I'm writing tools in this space, so if you have experience or examples to share - I've got a huge file of them - I am interested.
You are invited to answer here or in private. My Gmail username is the same as my HN username.
Judgment can be expressed in the form of tokens. Yes, LLMs are just token generators. But the tokens they generate express judgment in a way that is indistinguishable from actual judgment.
So who cares. It's like saying nobody drives a car because the thing moving them isn't really a car, it just acts like a car to an extent that is indistinguishable from one. Ok then. It's a car. Good day sir.