by wakeboarder (2695839) writes:
AI doesn't have feelings and doesn't understand what it's like to be human. Without that, talking to an AI therapist is no better than reading a book. Don't even think for two seconds that you can replace a therapist with a computer.
by xevioso (598654) writes:
The issues are much more complicated than that, which is why this article is bullshit. Therapists are licensed in many states because there are a lot of regulatory requirements a licensed therapist must comply with in order to legally practice. AI does not have to abide by any of them.
For example, if a child therapist overhears comments in a therapy session indicating the child has been physically or sexually abused, the therapist is *legally* required to make a CPS (Child Protective Services) call after the session and make a report on what was heard, at least where I live. The AI bot can't do that.
by Holi (250190) writes:
"If a child therapist overhears comments in a therapy session indicating the child has been physically or sexually abused, the therapist is *legally* required to make a CPS (Child Protective Services) call after the session and make a report on what was heard, at least where I live. The AI bot can't do that."
Why could an AI not do that, or at least notify the person in charge? You offer no explanation as to why you believe that.
by xevioso (598654) writes:
Because the person making the call has to be licensed. The person making the call has to describe the situation in detail, which might include reading subtle facial expressions, determining whether what was said was said in jest or had context indicating it was not a joke, or whether the child's life might be in danger now or as part of ongoing abuse. The therapist might be told they need to call multiple people, be reachable by phone, and even potentially serve as a witness in a court of law under certain circumstances.
by fullgandoo (1188759) writes:
You sound like an "about to be out of work" licensed therapist!
Why can't the chatbot "describe the situation in detail", include "subtle facial expressions", or determine "if what was said was in jest"?
Why can't the chatbot call multiple people?
You're grasping at straws. Keep moving the goalposts. It won't matter.
by xevioso (598654) writes:
Because, *by law*, the person making the calls and doing these things has to be a person. That is why. Are you having problems reading today?
by fullgandoo (1188759) writes:
You seem to be fixated on one (one-in-a-million) outlying use case! First of all, a meatball (human) can make the final call after a detailed and precise explanation from the chatbot. And before you start another rant about how it isn't possible because of the "law", the "law" isn't set in stone. Also, the law isn't worldwide.