Day 40 — Is more data always better data?
Taking a hard look at the data we collect from users.
One of the big insights from the Data Security work I’ve been doing these past few weeks has been just how quickly data and systems multiply. Leave a founder alone for a few months and they’ll replace friends and fun with a healthy dose of SaaS and new product features.
My work over the last week has revealed just how easily a well-intentioned product that doesn’t care about data security from the very beginning can go off the rails and reach total creeper status.
For example: our list of vendors today stands at 40, and our overall systems number about 60. Each has a specific purpose and intention, but every addition also multiplies the ways something could go wrong.
Bringing Awareness to Data Privacy
In the latest iOS update, Apple started requiring developers to publish a “Privacy Label”. Think nutrition label, but for apps. It’s Apple putting a renewed focus on privacy and exposing just how much data is being discreetly taken from us as we use ever-useful apps and services.
It made me think of the poor consumer walking down the grocery aisle in 1994 only to discover Spam is actually pretty horrible for you.

As it turns out, we’ve been consuming apps like Spam: we’ve enjoyed them for years without ever actually understanding what’s in them.
Introducing: Apple Privacy Labels
The part I love about this approach is it gives users data and a choice. Of course you can still buy Spam, potato chips, or whatever food item you want, but at least you have the information to make an educated decision.
How we stack up today
I ran through this exercise today as I considered our data privacy. Here’s where we stand with our app:


First Reactions
With the exception of Device ID (we use this for notifications, but it may not be needed), each piece of data brings a direct user benefit and purpose. But it’s easy to imagine a world in 1–2 years where this isn’t the case: when we start collecting data just to collect data (or worse, using it for purposes we don’t make users aware of).
Truth be told, I’m on the fence about whether “data linked to you” is a feature or a bug. On the one hand, it enables us to deliver personalized features that can make a big impact on kids living with chronic health issues. On the other hand, it’s a risky set of data to manage, and I’d rather focus on delivering value than on being arbiters of security.
In either case, it’s really nice having a point-in-time benchmark. It allows us to scrutinize every piece of data we collect and question the user benefit it actually provides. Every piece of data we collect carries a cost, and that cost manifests as ever-increasing liability.
The Path Forward
In case you haven’t noticed from these daily blogs, I value transparency. I’d much rather expose our flaws knowing that we can then address problems head-on.
In the end, putting a focus on privacy and security from the very beginning has been the silver lining of all these hours of Data Security work. It’s opened my eyes to what it will take to be a great company that puts privacy first.
And in the world of health, that’s something worth fighting for.