
PART 1

Be Accountable

PERSONAL IDENTITY MANAGEMENT

I’m excited about where technology will take us. My biggest goal is to make sure that our privacy laws keep up with our technology. I want to make sure that all of the benefits that we see from new technology don’t come at the expense of our privacy and personal freedom.1

—SENATOR AL FRANKEN

SENATOR AL FRANKEN is chairman of the Senate Subcommittee on Privacy, Technology, and the Law, a bipartisan part of the larger Senate Judiciary Committee. It’s a complex job to encourage growth of technology while honoring the nuance of consumer privacy. From the technology side, it’s easy to dwell on how privacy advocates may hinder innovation and growth. From the privacy side, a loss of trust from previous violations combined with a lack of understanding about technology slows adoption.

Both sides have merit and need to be heard. But the issues need some context:

· People’s right to privacy is different from a person’s preference about privacy.

· Just because a certain technology can be built doesn’t mean it should be.

Let’s unpack these ideas a bit.

Privacy is tough to both define and measure. Depending on the context, an activity that’s fine for one person may not be condoned for another. For instance, as a rule, most adults don’t have a problem with the idea that websites collecting information from children under the age of thirteen should comply with the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA), which requires verifiable parental consent before PII, or personally identifiable information, can be collected about children. Gathering this PII for younger kids outside the parameters of parental consent is typically seen as creepy or worse. COPPA also governs how cookies and other tracking mechanisms may or may not be used to collect behavioral data on kids.

But manipulating online systems of age recognition can be easier than you think, especially when parents help kids under the age of thirteen get onto sites. As the Huffington Post reported in their article “Under 13 Year Olds on Facebook: Why Do 5 Million Kids Log In if Facebook Doesn’t Want Them To?” a Consumer Reports study conducted in June 2012 revealed that “an estimated 5.6 million Facebook clients—about 3.5 percent of its U.S. users—are children who the company says are banned from the site.”2 Surprisingly, many of the kids creating accounts are also getting help from their parents, according to the study.

Here’s where things get tricky: If a site can’t collect PII about a user, it’s very hard to identify that user’s age. And Facebook does regularly eliminate the younger users it identifies. The article also notes, however, that Facebook could lose upwards of 3.5 percent of its U.S. market if it were more vigilant about keeping kids off the site.

In light of this article, I’d like to restate my second issue from above with a little tweak:

· Just because a certain technology hasn’t been built doesn’t mean it shouldn’t be.

Facebook has bigger priorities than creating technology that can accurately identify whether a person is genuinely under the age of thirteen. That’s not in question: if 5.6 million users might be under thirteen and Facebook isn’t actively creating technology to ban them as mandated by the Federal Trade Commission, then by definition its priorities are clear. The fact that it stands to lose 3.5 percent of its U.S. market if kids were bumped from the site also speaks to those priorities.

Parents helping underage kids to game the system are acting on their personal preferences. The fact remains, however, that parents are breaking the spirit of COPPA when they help kids under thirteen get on Facebook and that the company stands to benefit when these kids join the site. And now those kids will start getting tracked earlier, with their data being utilized or sold in ways they don’t realize.

The fact that Facebook hasn’t built a technology to accurately identify whether someone is under thirteen doesn’t mean it can’t. And since the company has created the largest pool of photographic data in the world identifiable by facial recognition technology, I think it could block kids far more effectively if it wanted to. Its facial recognition technology launched as opt-out only (rather than having users take the extra step to opt in), implying the company would rather users not opt out, because opting out undermines its ability to monetize.

It’s this lack of clarity around privacy that is fostering distrust from users and helping to create the personal identity management industry.

The Context of Data

Data is like your health. You don’t really appreciate the way that data is being handled until something bad happens to you.3

—WILLIAM HOFFMAN, director of the World Economic Forum’s Information and Communications Technologies Global Agenda

Unlocking the Value of Personal Data: From Collection to Usage,4 written by the World Economic Forum (WEF) in collaboration with the Boston Consulting Group, provides an excellent overview of the evolution of modern data collection practices. The report grew out of a series of global workshops the WEF conducted over several months in 2012.

One of the biggest difficulties with data collection has to do with the context of what the data will be used for. For instance, in measuring health data regarding a particular disease or condition, knowing personal information about the individuals in a trial will lead to greater insights than using anonymized data would. So a blanket practice of always separating people’s identities from the results of their trials or other contexts can hinder innovation.

There is also a technical issue with anonymization that any data scientist will remind you of: it’s frequently impossible to achieve. In an effort to demonstrate the need for consumer privacy, famed Carnegie Mellon researcher Latanya Sweeney showed that 87 percent of all Americans could be uniquely identified using only three pieces of information: their ZIP code, birth date, and gender.5
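To see why those three attributes are so identifying, consider a minimal sketch, with entirely invented records, of the linkage technique behind findings like Sweeney’s: an “anonymized” dataset is joined against a public record, such as a voter roll, on the shared quasi-identifiers.

```python
# Hypothetical illustration of a linkage attack; both datasets are invented.

# "Anonymized" medical records: names removed, quasi-identifiers kept.
medical_records = [
    {"zip": "02138", "birth_date": "1965-07-04", "gender": "F", "diagnosis": "asthma"},
    {"zip": "60614", "birth_date": "1980-01-15", "gender": "M", "diagnosis": "diabetes"},
]

# Public voter roll: names sit alongside the same three attributes.
voter_roll = [
    {"name": "Jane Doe", "zip": "02138", "birth_date": "1965-07-04", "gender": "F"},
    {"name": "John Roe", "zip": "60614", "birth_date": "1980-01-15", "gender": "M"},
]

QUASI_IDENTIFIERS = ("zip", "birth_date", "gender")

def key(record):
    """Project a record onto the quasi-identifier columns."""
    return tuple(record[field] for field in QUASI_IDENTIFIERS)

# Join the "anonymous" records back to names via the shared attributes.
names_by_key = {key(v): v["name"] for v in voter_roll}
for record in medical_records:
    name = names_by_key.get(key(record))
    if name:
        print(f"{name} -> {record['diagnosis']}")  # identity recovered
```

When most combinations of ZIP code, birth date, and gender are unique, as Sweeney found, the join succeeds for most records, and the “anonymization” evaporates.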

For these and other reasons, the WEF’s report calls for a shift from controlling data collection to focusing on data usage. A primary reason for the shift is the evolution of Big Data, the exponentially large sets of information that need to be aggregated and studied before it’s even known whether they contain the potential for insights. As the WEF report notes:

Often in the process of discovery, when combining data and looking for patterns and insights, possible applications are not always clear. Allowing data to be used for discovery more freely, but ensuring appropriate controls over the applications of that discovery to protect the individual, is one way of striking the balance between social and economic value creation and protection.6

The distinction between data collection and usage is of huge importance. If you’re able to protect your data to the point where no one can access an iota of your identity without your permission, how someone wants to collect it becomes irrelevant. It’s the equivalent of a robber trying to steal your money from the bank: without your providing the key, your currency stays where you want it.
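A minimal sketch of that key metaphor, using the Python cryptography package’s Fernet recipe (the data shown is invented): whatever a collector gathers in encrypted form stays inert unless the owner chooses to hand over the key.

```python
# Sketch: encrypted-at-rest personal data is useless to a collector without the key.
# Requires the third-party package: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # held only by the data's owner
vault = Fernet(key)

ciphertext = vault.encrypt(b"birth_date=1980-01-15;zip=60614")

# A collector holding only `ciphertext` has nothing usable.
# Only when the owner shares the key does the data become readable:
print(vault.decrypt(ciphertext).decode())
```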

Forrester Research’s report Personal Identity Management: Preparing for a World of Consumer-Managed Data, by Fatemeh Khatibloo, reflects the growing trend of people wanting to own and control their data. Khatibloo points out that consumers are beginning to better understand how marketers are making money off their data, and they’re keen to learn how their data is being collected and used.7

The term “personal identity management” also reflects the need for consumers to shift from a complacent to a proactive stance regarding their digital identities. One key tool for leveraging this shift is the rise of “data vaults,” also known as “lockers” or “personal clouds.”

A difficulty in how information about you gets shared has to do with the context of who is asking for your data and how long they need it to accomplish a mutually established goal. If you could collect in one place all of the personal information and data you generate as you use your devices, and control how and when it gets shared and under what terms and conditions, you’d be employing the mentality of a data vault, as the sketch below illustrates.
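As a rough sketch of that mentality, using hypothetical names rather than any vendor’s actual API, a vault might hold your fields in one place and release only the subset a requester was explicitly granted, tied to a stated purpose:

```python
from dataclasses import dataclass, field

@dataclass
class Grant:
    requester: str   # who may read
    fields: set      # which pieces of data they may read
    purpose: str     # the mutually established goal

@dataclass
class DataVault:
    data: dict = field(default_factory=dict)
    grants: list = field(default_factory=list)

    def share_with(self, requester):
        """Release only the fields this requester was explicitly granted."""
        for grant in self.grants:
            if grant.requester == requester:
                return {k: self.data[k] for k in grant.fields if k in self.data}
        return {}  # no grant, no data

vault = DataVault(data={"email": "me@example.com", "zip": "02138", "income": 72000})
vault.grants.append(Grant("acme-retail", {"email", "zip"}, "shipping offers"))

print(vault.share_with("acme-retail"))   # only email and zip are released
print(vault.share_with("data-broker"))   # {} -- never granted anything
```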

Another key element of a vault is the idea of destroying data after a certain time limit or when it’s used outside the context in which the person sharing it intended. Reminiscent of the Mission: Impossible encoded spy message that self-destructs after being read, this idea of temporary data usage has caught on recently through the photo-sharing app Snapchat, where users set a time limit for pictures to be viewed by recipients. While the app has suffered from some recipients learning how to store photos longer than senders intended, the trend of data destruction catching on could be a very positive one for consumers overall. People will begin to understand that a great way to protect their data is to provide it only to trusted parties and to enable it to be destroyed if they feel it’s being used in ways they didn’t permit.
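The time limit can be sketched the same way. This is a simplified analogue of Snapchat-style expiry, not its actual mechanism: each shared item carries an expiry timestamp, and the holder purges the payload and refuses to serve it once the clock runs out.

```python
import time

class ExpiringShare:
    """Data shared with a deadline; unreadable (and purged) after expiry."""

    def __init__(self, payload, ttl_seconds):
        self.payload = payload
        self.expires_at = time.time() + ttl_seconds

    def read(self):
        if time.time() >= self.expires_at:
            self.payload = None  # destroy the stored copy
            raise PermissionError("share expired; data destroyed")
        return self.payload

share = ExpiringShare("photo bytes...", ttl_seconds=2)
print(share.read())          # readable inside the window
time.sleep(3)
try:
    share.read()
except PermissionError as err:
    print(err)               # share expired; data destroyed
```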

Nobody’s Vault but Your Own

“Some of the most open people, people who say they’re open, change their minds when they learn how much of their data is beyond their control,” said Shane Green about privacy in the digital age. Green is CEO of Personal, a company focused on helping users “take control of the master copy of their data” through services built around protection and flexibility for digital identity. “People are being digitized.”8

Green is using a carrot-and-stick method to acclimate users to the idea of data vaults. One of the company’s most popular offerings is a service called Fill It, which lets users auto-populate sign-in forms securely online. This gives users a sense of how vaults work overall, as they see they’re in control of their data. As Green noted in his interview for this book:

People feel more protected when their data is protected. When you fill out a form, you’re “turning over the goods.” You’re signing away your privacy and terms of use. This is the pain point everyone has. When you solve that problem, people see why they need a set of reusable data.9
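To make the form-filling idea concrete, here is a hypothetical sketch, not Personal’s actual implementation, of how one reusable vault-held profile could map onto the differently named form fields of many sites:

```python
# One reusable profile, held as the master copy in the user's vault.
profile = {"first_name": "Ada", "last_name": "Lovelace", "email": "ada@example.com"}

# Each site names its form fields differently; the mapping is written once.
FIELD_MAPS = {
    "shop.example":  {"fname": "first_name", "lname": "last_name", "mail": "email"},
    "forum.example": {"given": "first_name", "surname": "last_name", "email": "email"},
}

def fill_form(site, profile):
    """Auto-populate a site's sign-up form from the single master copy."""
    return {form_field: profile[vault_field]
            for form_field, vault_field in FIELD_MAPS[site].items()}

print(fill_form("shop.example", profile))
print(fill_form("forum.example", profile))
```

The user maintains one set of data, and every form is filled from it, so corrections happen in one place instead of on fifty sites.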

The company also features a unique Owner Data Agreement on its site, a contract that makes users the legal owners of the data they store with the service. Overall, the site delivers a powerfully motivating message, reinforcing the need for consumers to understand how precious their data is and to take charge of it proactively.

“I went to Harvard Law School and I can’t understand most terms of service agreements,” noted Michael Fertik in an interview for Hacking H(app)iness. Fertik is CEO of Reputation.com, a leading provider of online reputation products and services. In reference to data brokers or other sites not wanting to make it easy for consumers to understand byzantine terms of service agreements, Fertik says, “As the saying goes, if you’re not paying for the product, you or your data are the product.” While Fertik doesn’t feel companies selling data are necessarily mendacious in nature, he points out, “They simply don’t care about your privacy. Between the right thing and mammon, mammon will win.”10

Fertik sees a future of consumer privacy protected via data vaults or similar services as inevitable. Besides the ethical case, there are powerful economic motivators for companies as well as individuals to innovate and evolve a personal data economy. A primary part of that motivation will be flexibility in how companies obtain consumer data. Fertik outlines four primary types of exchange he envisions, in which companies will provide the following to consumers in exchange for the right to access their data:

1) Coupons or discounts

2) Special privileges based on status (While this could come in the form of airline points, an evolution of this idea would be for influence or reputation points to be portable between airlines.)

3) Cash, virtual or real (“virtual” meaning virtual currency)

4) Privacy (People can make their purchases without revealing any data for a fee.)

These four examples offer a pragmatic approach to monetization in the personal identity management economy. While the model may cause anxiety for some brands reliant on purchasing customer information, the players set to suffer most in this system will be the data brokers disintermediated in the process. For brands, the model will provide more direct access to their customers and more opportunity to establish value-added relationships.

If consumers begin trading or selling their data in this type of model, brands also won’t have to work as hard to advertise. Consumers will become masters of their own data, able to meaningfully engage with the brands they choose and to create opportunities for mutual value creation.

John Clippinger is cofounder and executive director of ID3, a nonprofit headquartered in Boston whose mission (according to its site) is to “develop a new social ecosystem of trusted, self-healing digital institutions.” He has created the Open Mustard Seed (OMS) Framework, an open-data platform that helps users take and keep control over their personal information. Along with ensuring trust between users and anyone trying to access their data, Clippinger feels that having user data in this form of “pool economy” means you can create new markets. As he explains in “Power-Curve Society: The Future of Innovation, Opportunity and Social Equity in the Emerging Networked Economy,”11 reverse auctions (where users offer their data for sale instead of others using it without their knowledge or consent) could create “enormous efficiencies” in the future. Think about a model like Craigslist, where people list items they want to sell, specifying their own prices and the specifics of the interaction. These data pools could provide that protected infrastructure.
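A toy sketch of such a reverse auction, with entirely hypothetical participants and prices: a buyer posts a request for a data segment, consenting vault owners name their own asking prices, and the lowest asks win, so data changes hands only on the sellers’ terms.

```python
# A buyer requests access to a data segment; owners name their own prices.
request = {"segment": "fitness-app usage, ages 25-34", "budget": 5.00, "slots": 2}

asks = [  # each ask is an explicit, consented offer to sell access
    {"owner": "user_17", "price": 1.50},
    {"owner": "user_42", "price": 0.75},
    {"owner": "user_93", "price": 3.00},
]

# Reverse auction: the lowest asking prices win the available slots.
winners = sorted(asks, key=lambda a: a["price"])[: request["slots"]]

assert sum(w["price"] for w in winners) <= request["budget"]
for w in winners:
    print(f'{w["owner"]} sells access for ${w["price"]:.2f}')
```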

The Personal Data Ecosystem Consortium, where Kaliya (aka Identity Woman) is executive director, is also creating infrastructure that is helping to change the data market and make it real. Almost one hundred companies around the world are working on tools to help people collect, manage, and get value from their own data and to build new, ethical data markets. To join the consortium, companies must commit to giving people the rights to their own data and to working toward interoperability, so people won’t be trapped by a particular provider.

If the personal data economy becomes widespread, consumers will also manage access rights to their data directly with one another. In a world of conscious consumerism, this might come in the form of an eBay power seller wanting to buy access to data about prospective clients directly from those clients. Or someone trying to establish credibility for renting their apartment on Airbnb may buy data from popular renters who are happy to monetize their expertise.

People will also start to earn passive revenue from watching TV while allowing advertisers to monitor their responses to shows using sensors or other technology. A precedent for this model can be seen in existing online video streaming sites. If you want to watch a thirty-minute video, some sites will offer the opportunity to watch a longer pre-roll ad and then see the whole show uninterrupted. Or you can watch a number of shorter ads and accept some breaks in viewing where you can’t fast-forward over an ad. Either way, the point is that you’re given a choice.

Transparency combined with options in which people feel their data is protected means commerce can flourish because there’s no need to hide shady business practices.

The Connected Choice

I think we see privacy violations where there are genuine gaps between what companies think is acceptable and what consumers expect in terms of privacy. For example, most consumers think that when they give an app permission to collect their location information, it’s only that app that will get that data.

I think consumers would be shocked if they learned that half of top apps turn around and give or sell that data to third parties. Yet this is a standard business practice on the Internet—and many companies are sincerely surprised when privacy advocates raise it as an issue. So a lot of my work is simply shedding light on these practices and trying to bridge these two worldviews.12

—SENATOR AL FRANKEN

Hacking H(app)iness means breaking down old ideas of what can bring contentment in the Connected World. Fulfillment will only come when you recognize that your data is your own. It’s an extension of your identity. If other people are collecting it, you should know what they want to do with it and be a part of the transaction if you so choose.