Healthcare Brew covers pharmaceutical developments, health startups, and the latest tech, and how it all impacts hospitals, keeping providers and administrators informed.
Struggling with anxiety, depression, or another mental health condition? There are several smartphone apps for that. But the help that’s seemingly available at your fingertips may not be appropriate, properly tested, or private—and that should give consumers pause before hitting download.
To help patients and professionals sift through the growing number of mental health apps, the American Psychological Association (APA) developed App Advisor, an evaluation model that looks at access and background, privacy and security, clinical foundation, usability, and data integration toward a therapeutic goal.
The tool does not recommend specific apps, but instead focuses on clinician and consumer education.
“We want people to connect to things that are safe and effective,” John Torous, who chaired the APA’s Smartphone App Evaluation Task Force until last month, told Healthcare Brew. “If we, overall, look at what’s happened in the landscape, there’s a lot of unsafe things still out there.”
Torous, director of the digital psychiatry division at Beth Israel Deaconess Medical Center’s psychiatry department, said App Advisor came about in response to a growing number of questions from patients and professionals about the safety and efficacy of mental health apps.
Privacy concerns related to how data collected by apps is shared and stored have also increasingly garnered headlines and sparked federal action.
Many apps touted for mental health, for example, are actually “wellness” apps and lack the confidentiality or data privacy that patients expect from mental health professionals, Torous said. Even those that offer telehealth services often connect users with “coaches,” not licensed clinicians.
A recent review of 32 mental health apps from Mozilla, the not-for-profit behind the Firefox browser, found that more than half (19) lacked adequate privacy and security—with issues ranging from weak password requirements to personal photos, videos, and chatbot messages being recorded and shared—and earned Mozilla’s “privacy not included” warning label. And the situation is not improving: 40% of the apps performed worse on those metrics in 2023 than in the previous year, according to the Mozilla report released in May.
Meanwhile, Torous and other researchers found in a 2020 study that users of popular apps targeting anxiety and depression viewed those that incorporated non-evidence-based techniques less favorably—and as more likely to cause harm—than apps that used evidence-based practices. Still, many users found the non-evidence-based apps helpful for immediate relief.
And Torous argued that some apps can—and should—be used to help supplement patient treatment, noting that what works for one person will not work for everyone or every mental health condition. However, apps should not replace clinical treatment.
“If you look at the terms and conditions of these apps—it’s horribly boring to go through it—all of them say, ‘We are not a replacement for care,’” he said, adding that early data suggests these apps, when used in conjunction with a licensed psychiatrist or psychologist, have “only benefits.”
“If anything, it’s going to turbocharge that app that you are already using. Or you’re going to find a safer one.”