I Tried AI for Therapy, but There Are Limits to Its Conversations
One of the most important relationships in my life is with my therapist. Our monthly sessions have been my refuge during major shifts in my life and the world: the pandemic, my visa journey, coming out to my family and, most recently, planning to start a family.
I wouldn’t trade in my therapist for artificial intelligence, but she doesn’t come cheap, and I only see her once a month, so I wanted to check out AI’s capabilities for secondary — and more on-demand — support.
Call me old fashioned, but I don’t want to outsource the solutions to all of my most intimate problems to a machine. I was, however, very curious about the advice I’d get. Could AI give me an insight that would change the way I perceived my problem? At the very least, I could laugh about it with my therapist.
So I tried switching one NLP “therapy” for another: Natural Language Processing instead of Neuro-Linguistic Programming.
I looked around for AI therapy tools and I liked the look of Pi AI, which promised to “help you be the best you.” Pi, which stands for personal intelligence, is a “supportive and empathetic” (really?) conversational AI tool that helps you explore and understand your world.
Pi talks a big game!
Pi was created by Inflection AI, a company founded in 2022 by Reid Hoffman, who also co-founded LinkedIn. It promises an “experience of emotional support for human users.” It’s free and can also be used for brainstorming, planning, research, tips, advice…and venting.
But before you go inputting any medical diagnoses, personal problems, prescriptions or detailed family history, just remember that Pi definitely isn’t bound by doctor-patient confidentiality rules like a real therapist would be. Try to keep any sensitive or personally identifiable information out of it to avoid future data breaches or unnecessary risk.
This is bound to get interesting.
Getting set up
I used the desktop version, but you can also download the app for even easier daily use. I entered my name, chose the voice I liked, then started chatting.
There are categories you can select, but I went straight in with my doozy of a problem.
Prompt 1: “My wife and I are planning to start a family. I feel overwhelmed by the complexities of our journey as a same-sex couple, the fear of the financial investment, and societal concerns. What can you recommend as we navigate this journey?”
I was pleasantly surprised with Pi’s first response.
I toggled the voice on, but it didn’t feel like a personal exchange without me talking, so I switched to the app and voice input function. I repeated the question and got an even better response.
I hadn’t thought about accessing grants, so I replied to Pi asking if it could provide a list of available grants, as well as queer-friendly insurance providers in New York/New Jersey. It came back with two providers in New Jersey and two in New York, and suggested four available grants. It also advised me to research their policies and eligibility requirements, and suggested I consult a financial advisor too.
Again, I was impressed. Since Pi admitted it can make mistakes, I wanted to quickly Google IVF grants for same-sex couples. I searched for the AGC Scholarship Foundation, and Pi’s information was correct.
Next, I asked Pi what details I should look for in an insurance provider and plan in order to cover IVF for same-sex couples. It told me to look for inclusive language, pre-existing condition clauses, lifetime maximums, in-network providers, wait times before coverage kicks in, cost-sharing and pre-authorization.
I know lower out-of-pocket maximums and deductibles are important, so I asked Pi about that.
Again, it was super thorough! But right now, Pi was doing what AI does well: scouring its information banks for answers to my questions. While this does help alleviate anxiety, how would it do at giving actual therapeutic advice?
An artificial therapist
Pi’s voice has a warm feel rather than a robotic one. Talking on my phone, as if I were sending a voice note to a friend (or my therapist), personalized the experience.
After using it to get handy information, I wanted to see if it could provide emotional support too. I pondered a question I’d ask my (real) therapist.
“I’m worried our child will be treated differently with two moms. How can we best prepare for this? What are the benefits of having two moms? What if our child looks like our donor?”
It answered kindly with some good advice but skipped over my last question, which is my biggest concern.
My only critique was that I wished Pi were less listy. Therapy isn’t supposed to be a logical laundry list.
I followed up on the donor question.
I asked Pi another sensitive question. “And what about half-siblings? I’m worried our child will have dozens of siblings.”
Pi was again thoughtful in its responses and suggested I share my concerns with the sperm bank. I closed our conversation with a question to push the limits of Pi.
“What about karma? What is the spiritual lesson of having a donor-conceived child?”
I wanted to know, if you believe in karma, what karma we are taking on, but Pi couldn’t tell me. It was too vague a concept for artificial intelligence to grasp.
It replied with the positive aspects of the journey, such as expanding the concept of family, embracing vulnerability and practicing gratitude. I asked once more and it told me: “Karma, as a concept, is complex and can be interpreted in many ways. However, in most traditional understandings, karma is an individual’s own responsibility, based on their actions and intentions. A child conceived through donor conception would not inherit the donor’s karma.”
“As parents, your love, care and guidance will have a far more significant impact on your child’s life than any potential influence from the donor,” it concluded.
Thanks, Pi.
The verdict on AI as therapy
Surprisingly, I actually felt better after “talking” to Pi. It felt easier to divulge personal concerns to a tool called Pi than to one called ChatGPT. It was easier to personify Pi, with its pleasant, soothing voice.
That being said, it still felt like an AI chatbot. What I love about my therapist is that there are no limits on our conversations, and I always end our sessions with a feeling of expansiveness.
I had previously shared my fears about using a donor with her and she said something I’ll never forget: “It’s your child who will ultimately choose the donor, not you.”
She was trying to help me see that while, yes, we select a donor, the rest is up to God/the universe/a higher power.
That was the takeaway I was looking for but never got from Pi. Then again, I wouldn’t expect to. It doesn’t have a soul or spirit.
Source: CNET