Everybody has something they consider private. I think of it not as an absolute, but as a sliding scale. We all have stories about (typically young) people who share everything online in a very open way, seeming to not care about potential ramifications. But, I suspect even they have information they’d rather not share. There are others who eschew any online presence entirely because they feel all of their information should be private. Importantly, there is no right or wrong here; it is a personal matter.
And, there’s the old adage: pay attention to what people do, not what they say (bear with me on what this has to do with privacy — and connected health). How many times have you heard that? The root of this pronouncement lies in the fact that our brains color perception in so many complex ways that we report information that is often at odds with reality. Logic would dictate then that accurate, independent measurement that reflects reality is a better source of truth than asking any individual for their opinion.
Isn’t that what wearables, mobile phones, and sensors do? It is! Thus, if I want an accurate, quantitative representation of your life so I can help guide you to a healthier future, I want access to all of the ‘digital dust’ that you leave behind every day without thinking about it — GPS data from your phone, mobile purchasing habits, step counts, number and frequency of outbound messages, etc. If I could get access to all of that information and create a unique persona of you, I’d have a better chance of guiding you to a healthier state than if I just asked you for answers to a handful of questions. The challenge lies in how an individual defines what parts or pieces of that data should be considered his/her ‘private information.’
The recent Cambridge Analytica debacle was a wake-up call for me. Until then, I had the attitude that if people were so concerned about privacy, they must have something to hide. Now I see what having your private information lying around like ‘loose change’ can do. It IS a real challenge. What are we to do? Rely on inadequate, self-reported data to make decisions that guide an individual toward better health, or convince them to share data they may consider private?
There are four examples I know of where companies are flirting with this dilemma in the marketplace. For a few years now, United Healthcare has been giving people up to $4/day if they meet certain activity goals — as measured by a wearable. Walgreens, via its Balanced Rewards program, offers customers in-store savings if they meet certain activity targets. Oscar Insurance and Humana offer similar programs through their partnership. Thus, a small subset of consumers is getting used to trading private information for a financial, health-related reward.
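Programs like these generally map measured activity to a capped daily reward. The sketch below illustrates the general shape of such a rule; the step and active-minute thresholds and the dollar amounts are hypothetical, not the actual tiers of UnitedHealthcare, Walgreens, or any other program mentioned here.

```python
# Illustrative sketch of a wearable-based reward program.
# All thresholds and dollar amounts are hypothetical.

def daily_reward(steps: int, active_minutes: int) -> float:
    """Return the day's reward in dollars, capped at $4.00."""
    reward = 0.0
    if steps >= 10_000:        # hypothetical step goal
        reward += 2.0
    if active_minutes >= 30:   # hypothetical active-minutes goal
        reward += 2.0
    return min(reward, 4.0)

# A week of (steps, active minutes) readings from a wearable.
week = [(12_000, 35), (8_000, 40), (10_500, 10), (3_000, 5),
        (11_000, 45), (9_999, 30), (15_000, 60)]
total = sum(daily_reward(s, m) for s, m in week)  # 18.0 for this week
```

The point is not the specific numbers but the trade: a continuous stream of private activity data in exchange for a small, predictable financial reward.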
I wonder if the recent noise about Facebook and Cambridge Analytica is changing consumer ideas regarding these programs. Will enrollment enthusiasm cool?
Then there is the inevitable conversation as to whether health data is really different. There have been numerous data breaches (credit card hacks are just one example), and yet people continue to shop, bank, and book travel online. For most people, the convenience is worth the risk. What sets health data apart is the fear that an insurer will deny coverage based on these data. The Affordable Care Act should have eased that fear by prohibiting plans from denying coverage for preexisting conditions, but our current political uncertainty undoubtedly raises anxiety.
I am not aware of any insurer that collects wearables data and then uses the information to increase an individual’s cost of care. In these early experiments, insurers are focused on rewarding individuals for healthy behavior. Could we see a future where companies funding the cost of care use creative benefit design to shift more costs to individuals demonstrating poor health behaviors? I think so. It would show up as a higher baseline premium, with rewards maintained for those achieving healthy behaviors (so the consumer making poor choices would not feel punished).
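One way to frame such a benefit design: raise the baseline premium for everyone, then let activity rewards bring engaged members back down, so the disengaged member simply pays the baseline rather than a visible surcharge. A toy illustration, with entirely made-up figures:

```python
# Toy illustration of a reward-based benefit design; all figures hypothetical.

BASELINE_MONTHLY_PREMIUM = 450.00   # raised baseline (hypothetical)
MAX_MONTHLY_REWARD = 120.00         # e.g. $4/day over ~30 days

def net_premium(days_goal_met: int, reward_per_day: float = 4.0) -> float:
    """Monthly premium after activity rewards are applied."""
    reward = min(days_goal_met * reward_per_day, MAX_MONTHLY_REWARD)
    return BASELINE_MONTHLY_PREMIUM - reward

engaged = net_premium(25)      # meets goals 25 days: 450 - 100 = 350
disengaged = net_premium(0)    # meets no goals: pays the 450 baseline
```

Economically this is equivalent to a penalty, but psychologically it reads as a reward, which is exactly the framing insurers would want.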
In my book, The Internet of Healthy Things, we discussed a number of these issues, including a vivid description of a future in which I trade large amounts of personal information, in exchange for a premium reduction, so that a personal health coach can guide me to healthier behaviors.
There has been a lot of talk recently about Facebook’s business model. Some have suggested offering a subscription service that allows users complete privacy. The company has rejected this notion, probably because selling targeted ads is far more lucrative than any subscription model it can envision. By analogy to the health insurance world, a future where you remain largely anonymous to your insurer is likely to be very costly to you. Although insurers are not selling ads, they get similar value from knowing about you, so they can target programs to you that have the best chance of improving your health.
I firmly believe that it is critical — in support of lowering healthcare costs and improving health outcomes and provider productivity — that we begin to synthesize these passively generated health data and use them to drive individual health recommendations. Incidents like Cambridge Analytica are scary and will set back public opinion on this topic.
The industry is taking notice. The recent OpenMedReady framework developed by Qualcomm, Philips, and others is one example. Maybe if we work together, we can thread the needle and get this one right.