Utah’s teens are facing a big problem when it comes to their mental health — too few counselors to meet the demand.

One Utah-based company says artificial intelligence can help bridge the gap between counselors and students. But some therapists say this technology has limits.

Mele Tua’one is starting her freshman year at West High School, and she knows it’s going to be a big adjustment from middle school.

“I’m just overwhelmed because the school is pretty big,” she said. “I feel like it’s going to be hard figuring out my way around.”

If she were feeling down and there weren’t any counselors to speak to, she said she’d feel comfortable opening up to an AI chatbot.

“I feel like if I don’t have anyone to talk to, that’s a good place to turn,” Mele said.

Her mother, Alexia, said if there aren’t any available counselors, an AI chatbot is better than nothing.

“If they don’t have anything, then their minds can take them to scary places,” Alexia said.

One such service is ElizaCHAT. Company CEO David Barney said the chatbot was created precisely because there aren’t enough counselors to meet the needs of Utah’s students.

“Every single person studying psychology and social work right now, if every single one of them were to become a licensed therapist, we still wouldn’t even be coming close to meeting demand,” he said.

Barney said the company doesn’t rely on the internet to feed information into its AI. It relies only on what doctors tell it.

“We have training models built by our clinical advisory board of licensed and trained psychologists and therapists. They’re building the model. We’re using the best practices of the industry,” Barney said. “When we work with our clinical advisory board, we tell them, ‘We’re trying to map your brain into Eliza’s brain.’”

Child and family psychologist Douglas Goldsmith said there may be a place for AI in therapy, but not for serious problems.

However, he said there are a lot of things that can’t be picked up if someone is just typing their problems into a chat screen.

“A machine responding to a child may miss a lot of nuance and may miss being able to ask the right questions,” Goldsmith said.

Barney acknowledged the chatbot is not meant to replace actual therapy.

KUTV asked if there was a scenario where Eliza would say, “This is too much for me.”

“Absolutely. In fact, to start out, we’re not tackling some of those deeper issues,” Barney said.