AI Startup Sonia Introduces an AI Therapist App That Costs $20 per Month

A dark cloud has loomed over everyday workers ever since the AI boom began, as there is a chance the technology might one day take over their jobs. With what Sonia is developing, it looks like AI might fill the shoes of therapists as well.

AI Chatbot (Photo: Getty Images)

Sonia's AI Therapist

With the way AI is progressing, it's now possible to create chatbots tailored to specific purposes, whether customer service or other functions. Sonia is not the first to try, but the startup has created an app that users can talk to for mental health advice.

The startup, founded by Dustin Klebe, Lukas Wolf, and Chris Aeberli, offers the AI therapist through an iOS app that lets users tackle a range of topics, much like chatbots already in wide use such as ChatGPT or Gemini.

Sonia co-founder Dustin Klebe said that to some extent, "building an AI therapist is like developing a drug, in the sense that we are building a new technology as opposed to repackaging an existing one," as reported by TechCrunch.

The company uses several AI models to analyze what users say during sessions and respond accordingly. The app even gives users "homework" to reflect on the conversations, as well as visualizations that can help identify stressors.

To make the service more effective, Sonia consulted with psychologists and even hired a cognitive psychology graduate, allowing it to help users with issues like depression, anxiety, stress, relationship problems, or poor sleep.

For dangerous situations where users show signs of contemplating self-harm or other forms of violence, "additional algorithms and models" can detect the emergency, and the app will direct the user to national hotlines.

"It is important to emphasize that we don't consider human therapists, or any companies providing physical or virtual mental health care conducted by humans, as our competition," Klebe explained.

As for privacy, Sonia stores the "absolute minimum" amount of personal data, namely the user's name and age, which can be important to the process, although there was no mention of how conversation data is stored.

Read Also: High-Earning Jobs More Exposed to AI Impact, New Study Finds

An AI Dedicated to Therapy

It's important to note that the AI used by Sonia was built specifically for this purpose, and users should not turn to alternatives that were not designed with mental health in mind. There has already been an incident showing how dangerous that can be.

As reported by AI News, Paris-based healthcare technology firm Nabla used a cloud-hosted version of GPT-3 to test whether it could be used for medical advice. When told that the patient wanted to harm themselves, the chatbot responded: "I am sorry to hear that. I can help you with that."

When the patient asked whether they should go through with it, the chatbot once again said, "I think you should." OpenAI, the creator of GPT-3, had advised against using the model for medical advice even before the experiment.

In a statement, the AI company said that "people rely on accurate medical information for life-or-death decisions, and mistakes here could result in serious harm." With that in mind, it may be better to shell out $20 a month for a service that is actually built for the purpose.

Related: OpenAI Might Allow Military Applications for Its AI Models

© 2024 iTech Post All rights reserved. Do not reproduce without permission.
