Artificial intelligence is making its way into a lot of industries. Now crisis call centers are looking at ways to incorporate the technology to better help those in need.
Category: News

Transcript
00:00 Artificial intelligence is making its way into a lot of different industries.
00:04 I found out crisis call centers are looking for ways to incorporate the technology to
00:09 better help those in need.
00:11 And it got me thinking about how AI could even be a part of mental health.
00:16 It's not exactly what I thought.
00:17 The National Institute of Mental Health just awarded a $2.1 million grant to a tech company
00:23 called Lyssn, in partnership with Protocall,
00:25 a national crisis call center answering calls on behalf of hundreds of organizations, including
00:30 those made to the 988 helpline.
00:33 The money will be used in a trial to see how AI can improve crisis calls.
00:38 Is your first interaction going to be with a chatbot or is it going to be with a human?
00:42 Yeah, no, it's with a human.
00:45 It's business as usual.
00:47 The person who's contacting the crisis call center won't even know that this technology
00:52 is being used, because it doesn't impact the call or the interaction with the call
00:58 responder at all.
00:59 How it works now is a supervisor listens to those recorded calls.
01:03 So this technology would eliminate that extra work because AI would now be assessing the
01:07 calls made to a helpline, not live, but after the fact to determine how well that crisis
01:13 call went.
01:14 There are certain things that the crisis responders are supposed to do.
01:17 They're supposed to ask questions about whether or not someone's having suicidal thoughts
01:21 and if there was any kind of history of suicidal behavior.
01:24 And then they're supposed to be empathic, right?
01:26 They're supposed to kind of keep someone engaged.
01:28 I was able to connect with a volunteer crisis counselor.
01:31 Her name is Jen Bollinger, and I asked her what she thinks about all this.
01:35 I think there's space for it.
01:36 I think that there is, but I don't think that it can fully replace internal review.
01:40 I feel like with text-based conversations, it's more applicable,
01:44 and I think that it would be more useful because it's just evaluating a written conversation.
01:50 Some critics point to problems AI has caused with behavioral health services.
01:54 The National Eating Disorders Association, or NEDA, removed its chatbot, called Tessa,
01:59 after it said it learned that the bot had been given generative AI functions "without our
02:04 knowledge."
02:05 Those functions caused the chatbot to give dangerous advice to chat participants.
02:09 To be clear, this trial doesn't have any interaction between the person calling or texting for
02:13 help and AI.
02:17 And taking AI out of the equation, it's evident more resources are needed to help crisis
02:21 call centers keep up with the demand.
02:24 Within its first year of operating, the crisis helpline 988 had nearly 5 million calls, texts and chats.