There are all kinds of panic buttons. Our wristwatches detect slips-and-falls and call 911 for us. Recovering patients and senior citizens can talk with medical teams and emergency responders instantly. One slap of an Alexa button summons Tide or Lucky Charms from a nearby Amazon warehouse.

Clearly, not all events are emergencies. But there are some types of very real distress which tend to fly under the radar — either because they’re not easy to detect or because we don’t know how to talk about them effectively.

Mental distress is one of these. It’s still a relatively under-discussed and misunderstood branch of medicine. So what if there was a way for patients to signal mental distress as readily as they can the physical variety?

Keeping Minds Strong


This is the idea behind Mindstrong, one of the latest Silicon Valley startups with health care on the brain. It's not just any company, though: one of its founders is a former director of the National Institute of Mental Health (NIMH). The goal is to create a tool that serves as an early detection system, a “fire alarm,” for impending emotional crisis.

Over the course of a year, Mindstrong and another company, 7 Cups, joined with patients, health professionals, and county and state officials to test a suite of smartphone apps designed specifically for individuals seeking or currently receiving treatment through public health channels. In addition to the apps themselves, the system includes an alternate smartphone keyboard and a profile that captures how patients use their phones.

This first battery of tests was designed around treating borderline personality disorder. Those who suffer from the disorder typically struggle to recognize when their own levels of distress are elevated. As a result, the focus of the technology “under the hood” at Mindstrong is to provide “evidence” of a patient’s ongoing emotional states in the form of “biofeedback.” According to the researchers, the app uses the phone’s hardware to identify triggers, behaviors, topics, individuals, tasks and other stimuli that provoke reactions in the patient, and then offers these insights for further unpacking during behavioral, dialectical and cognitive therapies.
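
Mindstrong hasn’t published the algorithms behind this, so the following is only a rough sketch of the general pattern the approach implies: passively collect a usage signal (here, keystroke timing from an alternate keyboard), compare it against the patient’s own baseline, and surface unusually large deviations for a clinician to unpack in therapy. Every name, data field and threshold below is hypothetical.

```python
# Hypothetical sketch only: Mindstrong has not published its methods.
# The general pattern: gather a passive usage signal, build a personal
# baseline from past sessions, and flag sessions that deviate sharply.

from dataclasses import dataclass
from statistics import mean, stdev
from typing import List


@dataclass
class TypingSession:
    """One stretch of keyboard use captured by an alternate keyboard (assumed signal)."""
    timestamp: float               # start of the session, Unix seconds
    inter_key_delays: List[float]  # seconds between consecutive keystrokes


def session_speed(session: TypingSession) -> float:
    """Average delay between keystrokes for one session."""
    return mean(session.inter_key_delays)


def deviation_score(history: List[TypingSession], latest: TypingSession) -> float:
    """How many standard deviations the latest session sits from the personal baseline."""
    if len(history) < 2:
        return 0.0  # not enough data yet to form a baseline
    baseline = [session_speed(s) for s in history]
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return 0.0
    return abs(session_speed(latest) - mu) / sigma


def should_flag(history: List[TypingSession], latest: TypingSession,
                threshold: float = 2.0) -> bool:
    """The gentle 'tap on the shoulder': flag sessions far outside the usual range."""
    return deviation_score(history, latest) >= threshold
```

In a real product, the hard part is choosing which signals actually track a patient’s emotional state and validating them against the “gold standard” clinical assessments mentioned later in this piece; the sketch above only shows the shape of the pipeline.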

Unlike the glut of other apps promising wellness advice or accurate diagnoses, Mindstrong was developed under the guidance of health officials and the state of California, which earmarked $100 million for the five-year study. The money comes from tax revenue gathered under California’s Proposition 63 (a “millionaire’s tax”), which in 2004 imposed an additional tax on personal income over $1 million to expand the state’s mental health services.

Whereas some apps claim to detect schizophrenia, depression and other disorders, usually without substantial evidence that they’re effective, Mindstrong’s claims are more modest: it’s for patients who’ve already sought help, but who stand to benefit from ongoing insight into the habits and behaviors that fuel their destructive cycles. The company calls this approach “digital phenotyping.”

New Ways to Ask for Help


Mindstrong and projects like it seek to give our most vulnerable and under-served communities the means to detect early signs of trouble, then gently encourage them to reach out for help. Patients around the world living with bipolar disorder and other insidious conditions, ones that leave life livable but challenging, could benefit greatly from the gentle tap on the shoulder this app provides. Even a contentious encounter with a co-worker, the kind of thing many of us are tempted to sweep under the mental carpet, is something the app will pick up on, offering suggestions so the patient can better anticipate that trigger and its associated emotions later on.

As with any project like this, there are unanswered questions about how popular such a service would be outside of the test group, and whether the mainstream could overcome its privacy concerns long enough to take part. Using it means transmitting personal data, even if the app itself, with its own logic and algorithms, does much of the analysis. Even Alexa isn’t adding to her health care repertoire without first vetting brands for HIPAA compliance, so it goes without saying that patient confidentiality has to be a top priority moving forward.

Assuming privacy and security controls are implemented well and clearly communicated to users, there seems little reason why projects like this one shouldn’t move forward in pilot programs all across the country. Still, Dr. Thomas R. Insel, a neuroscientist, psychiatrist and one of the project’s founders, has said that “the program may have to fail at first.” In other words, despite the group’s promise that it can already “tell you’re depressed before you do,” it could take years for public opinion and the regulatory framework to come together sufficiently for a product like this to become widely available. In the meantime, clinical trials continue apace.

As of October 2018, published research behind Mindstrong was “coming soon,” according to company representatives. They indicated they had tested their app against the “gold standard” assessments for detecting depression, anxiety and other signs of mental distress, but for now there are holes in the story.

Nevertheless, it’s great to see projects like this one tackle mental health in a novel way. Access to mental health care services is a huge unmet need in the United States. California is actually one of the better states on this front, considering that providers there endeavor to schedule first appointments within 15 days. But problems persist across the country: roughly 17% of adults with mental illness remain uninsured, and more than 56% of adults with a mental illness receive no treatment. It’s also quite possible the average American doesn’t have meaningful mental health coverage as part of their insurance plan.

The human mind is a mystery, but it’s more than clear that keeping it healthy lies at the core of maintaining good holistic health. With technology doing some of the detection and maintenance work for us and making it easier to reach out for help, and with the right expansions in our public health programs, we should see mental health care become more widely available in time.