
The problem is that chatbots don't provide emotional support. To support someone with PTSD, you help them gradually untangle the strong feelings around a stimulus and develop a less intense response. It's not fast and it's not linear, and it requires a mix of empathy and facilitation.

Using an LLM for social interaction instead of real treatment is like taking heroin because you broke your leg, and not getting it set or immobilized.



> To support someone with PTSD you help them gradually untangle the strong feelings around a stimulus and develop a less strong response.

It's about replaying frightening thoughts and activities in a safe environment. When the brain notices they don't trigger suffering, it fears them less in the future. A chatbot can provide such a safe environment.


> A chatbot can provide such a safe environment.

It really can't. No amount of romancing a sycophantic robot is going to prepare someone to actually talk to a human being.


> instead of real treatment

Ah yes, because America is well known for actually providing that at a reasonable price and with reasonable availability...


Then we should fix that, instead of dumping $3 trillion on grifters and some of the worst human beings we have produced.


We should fix 100 things first... but we won't. Capitalism is king, and we'll stack the bodies high on his throne first.



