The market for health-related apps has ballooned in recent years, but significant doubts remain about the reliability of these apps and their ability to help the neediest patients. A recent study from McLean Hospital and Harvard Medical School highlights the security issues surrounding health apps.
The authors analyzed a number of apps designed to help dementia patients and their caregivers. The analysis found that few of these apps have robust security policies, and many have no data security policy at all.
A note of caution
The authors strike a note of caution for users of such apps. They believe their research highlights the importance of educating consumers on data issues so that they can make informed choices about their use of such technology.
The researchers analyzed around 125 apps aimed at the dementia market. They focused specifically on apps that collected user-generated data and evaluated each app's privacy policy against criteria assessing how that data was handled.
It transpired that just 33 had privacy policies available to users, and those that were available were often inadequate.
“There was a preponderance of missing information, the majority acknowledged collecting individual data for internal purposes, and most named instances in which user data would be shared with outside parties,” the authors say.
Indeed, such is the parlous state of security in the sector that the authors believe no user should assume privacy and security measures are in place. While they recommend that users do more to educate themselves on how their data is being used, this is incredibly challenging in areas such as dementia, where privacy policies are unlikely to be understood.
There is undoubtedly tremendous potential for health apps to help both patients and their caregivers, but the authors urge that greater attention be given to the security and privacy of the data such apps generate and collect.
“Clinicians should educate themselves and their patients about issues related to the data collected before recommending an app,” the authors say. “If they don’t, it’s akin to prescribing a medication without knowing the potential for risks or side effects.”