The Quantified Self: Making the Personal Public

Please enjoy this rather polarizing commentary on the Quantified Self Movement. While I’m concerned that concepts like the “unraveling effect” run the risk of stunting innovative approaches to health and identity, Peppet brings up critical issues that both individuals and legal systems must address as society starts to relate to digital tools in new ways. — Deb (At the time of writing this, my daily step count was 2739)

Human instrumentation is booming. FitBit can track the number of steps you take a day, how many miles you’ve walked, calories burned, your minutes asleep, and the number of times you woke up during the night. BodyMedia’s armbands are similar, as is the Philips DirectLife device. You can track your running habits with RunKeeper, your weight with a WiFi Withings scale that will Tweet to your friends, your moods on MoodJam, or what makes you happy on TrackYourHappiness. Get even more obsessive about your sleep with Zeo, or about your baby’s sleep (or other biological) habits with TrixieTracker. Track your web browsing, your electricity use, your spending, your driving, how much you discard or recycle, your movements and location, your pulse, your illness symptoms, what music you listen to, your meditations, your Tweeting patterns. And, of course, publish it all — plus anything else you care to track manually (or on your smartphone) — on Daytum or mycrocosm or me-trics or elsewhere.

There are names for this craze or movement. Gary Wolf & Kevin Kelly call this the “quantified self” (see Wolf’s must-watch recent TED talk and Wired articles on the subject) and have begun an international organization to connect self-quantifiers. The trend is related to physiological computing, personal informatics, and life logging.

There are all sorts of legal implications to these developments. We have already incorporated sensors into the penal system (e.g., ankle bracelets & alcohol monitors in cars). How will sensors and self-tracking integrate into other legal domains and doctrines? Proving an alibi becomes easier if you’re real-time streaming your GPS-tracked location to your friends. Will we someday subpoena emotion or mood data, pulse, or other sensor-provided information to challenge claims and defenses about emotional state, intentions, mens rea? Will we evolve contexts in which there is an obligation to track personal information — to prove one’s parenting abilities, for example?

And what of privacy? It may not seem that an individual’s choice to use these technologies has privacy implications — so what if you decide to use FitBit to track your health and exercise? In a forthcoming piece titled “Unraveling Privacy: The Personal Prospectus and the Threat of a Full Disclosure Future,” however, I argue that self-tracking — particularly through electronic sensors — poses a threat to privacy for a somewhat unintuitive reason.

I do not worry that sensor data will be hacked (although it could be), nor that the firms creating such sensors or web-driven tracking systems will share it underhandedly (although they could), nor that their privacy policies are weak (although they probably are). Instead, I argue that these sensors and tracking systems are creating vast amounts of high-quality data about people that has previously been unavailable, and that we are already seeing ways in which sharing such data with others can be economically rewarding. For example, car insurance companies are now offering discounts if you install an electronic monitor in your car that tells the insurer your driving habits, and employers can use DirectLife devices to incentivize employees to participate in fitness programs (thereby reducing health insurance costs).

Such quantified, sensor-driven data become part of what I call the “Personal Prospectus.” The Personal Prospectus is a metaphor for the increasing array of verified personal information that we can share about ourselves electronically. Want to price my health insurance premium? Let me share with you my FitBit data. Want to price my car rental or car insurance? Let me share with you my regular car’s “black box” data to prove I am a safe driver. Want me to prove I will be a diligent, responsible employee? Let me share with you my real time blood alcohol content, how carefully I manage my diabetes, or my lifelong productivity records.

All of this seems like merely (quirky) personal choice at first, particularly for those with “good” information who begin the trend by self-quantifying and then using that data to personal advantage (through discounts, etc.). But personal choice begets privacy issues if these information markets begin to unravel. Unraveling occurs because when a few people with “good” information can verifiably measure, track, and share information, everyone (even those with “bad” information) may ultimately find they have little choice but to follow suit. If all candidates for a job are willing to wear a blood alcohol monitor and you’re not, the negative inference drawn about you is obvious. If all the safe drivers quickly sign up for “discounts” that require electronic monitoring of their driving, those who refuse will quickly find themselves paying what amounts to a penalty. (For my recent post on unraveling as corporate strategy, see here.)

There are harms here beyond the pressure to consent. If you were somewhat horrified by the first paragraphs of this post — if you thought “why would anyone want to track so much data about themselves?” — the unraveling threat may particularly bother you. As Anand Giridharadas recently asked in a (short and worth watching) discussion of the quantified self movement, taken together these devices “imply an approach to life that may be something different than what we want life to be about … Because we have these things we’re just doing them, without thinking about whether we want to become the kind of people who do them.”

Your choice to quantify your self (for personal preference or profit) thus has deep implications if it necessitates my “choice” to quantify my self under the pressure of unraveling. What if I just wasn’t the sort of person who wanted to know all of this real-time data about myself, but we evolve an economy that requires such measurement? What if quantification is anathema to my aesthetic or psychological makeup; what if it conflicts with the internal architecture around which I have constructed my identity and way of knowing? Is “knowing thyself” at this level, and in this way (through these modalities), autonomy-enhancing or destroying, and for whom? What sorts of people — artists? academics? writers? — will be most denuded or excluded by such a metric-based world?

For anyone who has read Gary Shteyngart’s Super Sad True Love Story, it’s not hard to see a future in which obsessive measurement — of ourselves, others, everything — may leave some feeling reduced immeasurably by the hegemony of the measurable. Because of the unraveling effect, these reluctant late adopters may not have a choice; as many choose to quantify the self, all may have no real choice but to follow …

Scott Peppet is Associate Professor of Law at the University of Colorado Law School. His most recent scholarship focuses on informational privacy and structural changes to information architecture. This post originally appeared in the Concurring Opinions blog and The Health Care Blog.
