The Digital Age has produced some remarkable technology and will continue to do so, but as each new innovation springs forth, more of us are finding that we are not as comfortable with it as we once thought we might be.
One of the latest examples comes from a new iPhone app that appears capable of gauging one’s mental health — though how accurately it can do so remains questionable, as does what authorities might do with such information.
As reported by The Wall Street Journal:
Toward the end of Janisse Flowers’s pregnancy, a nurse at her gynecologist’s office asked her to download an iPhone app that would track how often she text messaged with friends, how long she talked on the phone and how far she traveled each day.
The app was part of an effort by Ms. Flowers's health-care provider to test whether smartphone data could help detect symptoms of postpartum depression, an underdiagnosed condition affecting women after they give birth.
The app was developed by Ginger.io, Inc., a San Francisco-based firm that says it compared data from Flowers and nearly 200 other women against answers to a weekly survey used to diagnose depression. The firm's analysts said they found that behavioral patterns such as decreased mobility and lengthened phone calls correlated with poor overall mood in the surveys.
Designed to track and flag behavior
“It’s very creepy to think someone can tell your mood” based on smartphone data, Flowers, who gave birth to twins last year, told WSJ. However, she added, “I felt like this was something that was going to help me while I was in a vulnerable place.”
The app is one of a new generation of health-surveillance technologies being employed by doctors, hospitals and even health insurance companies. Like fitness trackers such as Fitbit, which record running distances and calories burned, the new apps and other technological tools measure the volume of text messaging, the tone of voice in calls and other behavioral patterns to get a sense of a patient's psychological condition. Doctors say mental health has a strong link to physical health.
“Health insurer Aetna Inc., for instance, says it uses voice-analysis software on some telephone calls to get people who receive short-term disability benefits back to work sooner,” WSJ reports.
Adds Ginger.io CEO Anmol Madan: “There are four billion phones on the planet, and it turns out they’re incredibly powerful diaries of a person’s life.”
But how accurate are such apps, really? And if they are not all that accurate, at least for now, is it a good idea for health providers and insurers to rely on them so soon? Many already do.
As WSJ noted:
Ginger.io’s app, called Ginger.io, is being used by 30 medical centers, including Kaiser Permanente and the University of California, San Francisco, the company says.
What’s more, taxpayers are on the hook now as well.
The National Institutes of Health has awarded $2.42 million in grant money to researchers at the Harvard School of Public Health who are developing a smartphone app designed to examine and assess factors like when patients lock and unlock their phones, in order to determine sleeping patterns in those diagnosed with psychiatric problems.
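The Harvard researchers' method is not described beyond the lock/unlock detail, but the underlying idea can be illustrated with a toy heuristic: treat the longest stretch with no phone unlocks as the night's sleep. The timestamps and the heuristic here are assumptions made for the example, not the app's actual algorithm.

```python
# Illustrative sketch only: a toy heuristic that infers a sleep window
# as the longest gap between phone unlocks. Timestamps are invented.
from datetime import datetime

# Hypothetical unlock events over roughly one day.
unlocks = [
    datetime(2024, 1, 1, 7, 30),
    datetime(2024, 1, 1, 12, 15),
    datetime(2024, 1, 1, 18, 40),
    datetime(2024, 1, 1, 23, 5),
    datetime(2024, 1, 2, 7, 45),
]

def longest_gap(timestamps):
    """Return (start, end, duration) of the longest interval with no unlocks."""
    ts = sorted(timestamps)
    gaps = [(b - a, a, b) for a, b in zip(ts, ts[1:])]
    span, start, end = max(gaps)
    return start, end, span

start, end, span = longest_gap(unlocks)
print(f"inferred sleep: {start:%H:%M} to {end:%H:%M} ({span})")
```

A real research app would have to handle far messier cases, such as nighttime phone checks and irregular schedules, which is presumably where the grant-funded modeling work comes in.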
Not all are ready to trust the technology
Also, researchers at the University of Michigan are working on an app that will record and analyze patients' speech and voice patterns during calls to gauge whether they are approaching depression or mania.
A number of apps are aimed at treating mental health conditions, but other patterns, like when someone suddenly stops calling family members or stays inside their home for a week, can also serve as potential signs of encroaching behavioral issues.
Still, many hospitals and doctors are skittish about trusting the apps, especially when it comes to how the data they collect will be used.
“I wonder how companies are going to reassure people that when they download an app that can track everything they’re doing, the data will never be used against them,” Dr. Timothy G. Ferris, an internist and senior vice president of population health management at Partners HealthCare — Massachusetts’ largest healthcare provider — told WSJ.
Others are concerned that such technologies may be employed by health providers and even police, eventually, to deprive people of their firearms. In many states, those diagnosed with mental health issues are not permitted to own guns, but if the diagnosis is incorrect, then some could be unduly punished.
Such apps are “going to create a bunch of false positives until they get really, really good at the algorithms,” Ferris told WSJ.