Emotional AI: Is it ethical for people to get attached to AI “therapists”?
I’ve been building some simple conversational agents for mental health support, and it amazes me how deeply people bond with them. Some say it’s helped them more than years of talk therapy. It freaks me out a bit: is it just fancy journaling, or are we opening a door to emotional dependency on machines? Would love to hear dev & user perspectives.

submitted by /u/wsymphony