
The Google Guys: Inside the Brilliant Minds of Google Founders Larry Page and Sergey Brin

Chapter 8 What About Privacy?

It’s not a matter of whether or not someone’s watching over you. It’s just a question of their intentions.

—Randy K. Milholland, webcomic pioneer

The other important decision Google’s top three executives argue about behind closed doors is how to keep their users’ personal information confidential. People want, and deserve, access to information about government officials and other public figures, but everyone wants their own information kept from prying eyes. On this issue, Larry and Sergey come down like a falling rock on the side of confidentiality. They face vigorous criticism from outside advocates.

Google collects an enormous amount of information about the people who use its services, perhaps more than any other company in history. Google’s computers track what ads users click on, what items they search, what sites they go to, even the topics they write about in e-mails. The information is used to enable Google’s computers to figure out how to deliver more relevant data—and ads—to every individual. It’s not all about PageRank anymore.

This doesn’t mean someone at Google is reading people’s e-mails. Hundreds of millions of people use Google’s services, and humans can’t possibly be up to that task. It’s all done by Google’s computers without human intervention. But the data is there on Google’s computers, vulnerable to government subpoenas and, perhaps, to clever hackers intent on thievery.

To Larry and Sergey, collecting such data is all part of the process of making Google’s services useful. But people are suspicious of their intent. Google could, for example, sell information to advertisers and other companies, just as magazines sell information about their readers to others. There is no evidence Google has ever done this, and Larry and Sergey promise the company never will—at least not as long as they’re in charge.

Explains Larry: “Whenever you do a search, you’re trusting us to give you the right things. We take that very seriously. We have a pretty good reputation in that regard. I think people see us as willing to take positions that some might find weird initially. But we definitely can explain why we did it, and we’re up front about that.”1

But if any government hits Google with a legal subpoena, the company has to turn over the data requested. Technically, such data do not identify an individual, but merely the Internet address of a user’s computer. But the data can be used to find individuals—or at least those whose computers were used—by digging up the owners of the computer at that address through other means.

Privacy advocates and certain governments press Google to delete this information after a short period of time. Governments that do so are being somewhat naive: they see the potential threat as Google abusing the information, but deleting the data sooner would actually make it harder for governments themselves to root out illegal activity. The dispute Larry and Sergey have is not over whether the data should eventually be deleted, but over how long that period should be.

Nicole Wong, Google’s deputy general counsel, says it comes down to one other reason Google collects user data: it helps determine traffic patterns that may identify cybercrooks, enabling the company to prevent similar attacks in the future. In this case, Wong again argues that preserving the data is the best way to serve users. “Focusing on users is not just delivering the best and most robust service, but also delivering a service that the user will trust. A component of that trust is about privacy because this is such a data-driven service.”

Can You Trust Google?

User trust is an absolute necessity for Google, and violating it—or even the perception of violating it—threatens the company’s future competitiveness. It’s also one of Wong’s responsibilities. As part of her job, she meets regularly with product developers, starting at the early design stage, when a new product is still just a sketch on a whiteboard. She asks the developers what information they plan to collect with the product, how they plan to use it, with whom the data will be shared, and how it will be kept secure. The issues are then discussed with Larry, Sergey, and Eric, and their recommendations are designed into the product.

One of the mandates is to inform users rigorously about how the information is used. When someone downloads Google’s toolbar, they have to click on a box in order to allow PageRank to collect data about their surfing habits. If they do so, a privacy notice pops up to explain how the information is used, with a heading in bold red letters that says: “PLEASE READ CAREFULLY: IT’S NOT THE USUAL YADA YADA.”

Larry and Sergey set the mandate that users must be provided with transparency—a clear explanation of what the company is up to—and the choice of whether or not to allow it. Even if the users decide they don’t want their data collected, they can still download the toolbar without activating PageRank. That bucks the trend still prevalent for many Internet companies that says, essentially, “If you don’t accept the terms then you can’t use the product.”

Similarly, when using Google’s instant-messaging system, Google Talk, users can choose the option of going off the record, which prevents Google—and the people chatting—from keeping any record of the conversation.

Still, most people do not choose the privacy options, and Google keeps the data. But it has learned to compromise. Originally, Google did not put any limitation on the length of time it would keep the data. Privacy advocates complained, both to Google and to government regulators. So, in 2007, Wong began a series of meetings with the product designers, asking them why they needed such data. The logs improve searches, spell-check features, and other services. But a big part of the reason they keep data, said the developers, was to ensure the security of the networks.

Any Internet company is subject to spam, fraud, and attacks known as “denial of service,” in which a site is flooded with automatically generated requests in order to slow it down or make it crash. By keeping the data, Google can identify the patterns that led up to an attack and the computers used to instigate it, using that information to prevent similar attacks in the future.

“The fact of the matter is that the person successfully attacking us today has probably been trying for two years,” says Wong. “So when we go back into the logs for a substantial amount of time, we’re able to detect the pattern we have today. We can figure out all of the patterns we’re seeing in an attack. We ask, ‘What’s the next step of this attack? What’s the best way for us to try and stop it?’ It is a historical record that helps us get to the answer that we need today.”
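To make the idea concrete, here is a minimal sketch, in Python, of the kind of analysis Wong describes: scanning retained request logs for sources that suddenly flood a service with automated requests. It is purely illustrative; the log format, field names, window, and threshold are assumptions made for the example, not a description of Google’s actual systems.

```python
# Illustrative only: a toy flood detector over retained request logs.
# The log layout, window, and threshold are hypothetical assumptions.
from collections import Counter
from datetime import datetime, timedelta

# Each retained log entry: (timestamp, source address, requested URL)
LogEntry = tuple[datetime, str, str]

def flood_sources(logs: list[LogEntry],
                  window: timedelta = timedelta(minutes=1),
                  threshold: int = 1000) -> set[str]:
    """Return source addresses that issued more than `threshold`
    requests within any single sliding time window, a crude
    signature of a denial-of-service flood."""
    flagged: set[str] = set()
    entries = sorted(logs, key=lambda e: e[0])  # order by timestamp
    counts: Counter = Counter()
    start = 0
    for ts, source, _url in entries:
        counts[source] += 1
        # Drop entries that have aged out of the current window.
        while ts - entries[start][0] > window:
            counts[entries[start][1]] -= 1
            start += 1
        if counts[source] > threshold:
            flagged.add(source)
    return flagged
```

The longer the retained history fed into this kind of analysis, the easier it is to spot patterns that build slowly, which is exactly the argument Wong makes for keeping logs around.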

Wong then asked the developers how long they actually needed to keep the data in order to maintain security. After collecting that feedback and reporting to Larry and Sergey, Google set its policy: it would keep the data for only eighteen months, “anonymizing” it after that so no individual or computer could be identified.

This placated no one. So Google engineers kept working on the problem to see if they could cut that retention time. The end result was that in 2008, Google announced that it would reduce the length of time data were kept by half, to nine months. “Our engineers said that they thought they could still get pretty good results, pretty good robustness, pretty good security, based on nine months,” says Wong. “And I’ll be really candid, we’re giving up some quality, some ability to get really good search results faster, because we don’t have that historical record. But we think we’ve built a system now, and improved our analytical tools, that will get us what we really, really need in nine months.”
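The anonymization half of that policy can be illustrated just as briefly. The sketch below is hypothetical, not Google’s code: entries older than the retention window keep their timestamp and URL, but the source address is truncated so it can no longer point back to a particular computer. The nine-month figure mirrors the policy described above; the field layout and truncation scheme are assumptions made for the example.

```python
# Illustrative only: anonymize retained log entries once they age past
# the retention window. The nine-month window mirrors the policy in the
# text; everything else here is an assumption for the example.
from datetime import datetime, timedelta

RETENTION = timedelta(days=274)  # roughly nine months

def anonymize_ip(ip: str) -> str:
    """Zero the final octet of an IPv4 address: 203.0.113.42 -> 203.0.113.0."""
    octets = ip.split(".")
    if len(octets) != 4:
        return "0.0.0.0"
    return ".".join(octets[:3] + ["0"])

def scrub(entries, now=None):
    """Yield (timestamp, source, url) tuples, anonymizing the source
    address of any entry older than the retention window.
    Timestamps are assumed to be naive UTC datetimes."""
    now = now or datetime.utcnow()
    for timestamp, source, url in entries:
        if now - timestamp > RETENTION:
            source = anonymize_ip(source)
        yield timestamp, source, url
```

A pipeline along these lines lets a service keep the parts of its logs that still improve search quality and security analysis while discarding the part that identifies an individual machine.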

This presented another dilemma that forced Larry and Sergey to compromise. To them, user experience is of paramount importance, and the complaints about privacy were overblown. There has never been a documented case of Google violating its users’ privacy, whether by accidentally losing control of data or by engaging in practices such as selling information to spammers. Every complaint about Google falls into the category of what could happen, not what has happened.

In general, Google has done a better job of protecting people’s privacy than its competitors. In August 2005, the U.S. Department of Justice issued subpoenas to Google, America Online, Microsoft, and Yahoo to turn over two months’ worth of search queries and all the URLs in their indexes, to aid in the Bush administration’s defense of an Internet pornography law. Every company except Google quickly complied. Google challenged its subpoena in court as a fishing expedition with no probable cause, and in the end managed to restrict the information it handed over to fifty thousand Web addresses, without turning over the keywords users were searching for.

In fact, most Internet users worry about privacy less than they should and do little to protect themselves. But they do want the companies they deal with to provide that protection for them, and they rely on privacy advocates to force the issue on their behalf. The privacy advocates and government regulators who raise these complaints do so with input from only a small minority of users. This means that by cutting the data-retention time, Larry and Sergey accepted a compromise they feel is not really necessary, and sacrificed some of the user experience as a result, which can be seen as a violation of Google’s promise to focus on the needs of its users.

Wong says that acceptance comes from the need to maintain the public trust. The public complaints from privacy advocates affect the overall public view of Google. “It’s a tough balance,” she says. “But if we don’t do something to respect our users’ privacy, that can be just as deadly as delivering a poor service. If the public thinks we’re selfishly keeping data, that we’re at risk of a hacking attack and all of this data is stored there, then we’re going to lose their trust and they won’t come no matter how good the service is.”

That may still be a problem. Many privacy advocates, including the well-respected Electronic Frontier Foundation (EFF), as well as the European Union, still want Google to cut the retention time to six months. Wong says it’s not possible without seriously compromising security and user experience.

Nevertheless, Wong says the company works well with organizations such as the EFF, where her point person is one of her former colleagues. All their recommendations are taken to the team, including Larry and Sergey, for discussion.

But a large number of complaints also come from people who express their opinions in the press without ever having spoken with Google or tried the products in question. Wong says she gets her fair share of complaints from people who don’t really seem to want to solve problems, but “just want to cause a harangue.” Some of the complaints are just crazy, and are dismissed with anger. One advocate, for example, called her and argued that Google should not retain any users’ search histories at all, even though users frequently use them to find sites they’ve visited in the past that they want to visit again. The advocate’s suggestion: users should just write down on paper the URLs of sites they’ve visited, keeping them safe from snooping eyes, or at least those who don’t have access to their desk drawers.

Wong insists that Google is not just another company that creates a product and tosses it to the corporate attorneys to sign off on. “Privacy here is a concern for everyone, from our engineers to our executives. That’s a really unique environment to work in. I don’t think people always understand that aspect, or maybe they don’t believe it, because there are certainly enough companies out there as a counterexample. But I believe it’s true, and this is my group, so . . .” She trails off with a laugh.

Will these discussions continue as Google grows bigger and more powerful? If nothing else, Larry and Sergey will be forced to maintain this kind of structure to acknowledge and deal with complaints. If they compromise their ideals in the name of profit, they are likely to be less successful. Their ideals are what make Google stand out and engender trust from millions of people. Wong points out that when Google was young, it did not follow the example of other Internet companies, which regularly and deeply mined data about their users, sometimes selling the information to advertisers. “We didn’t need to do that,” she notes, “and we’re wildly successful.”

But on these issues, public perception is everything, and Google is struggling to convince users it’s doing the right thing. Every move it makes as it grows more powerful is scrutinized and criticized. Telemarketers, spammers, and junk mailers have created huge skepticism about how any advertising megalith will use the data at its disposal, and there seems to be little Google can do about this.

Jim Barnett, the CEO of the online advertising firm Turn, sums up the sentiment: “The truth is that Google does care about user privacy and tries to be thoughtful about how it uses data. But they’re also extremely competitive and are not bashful about trying to gain leverage in this competitive marketplace, which is what the DoubleClick acquisition was all about. It’s a real concern. Google has extraordinary power in the online advertising space and the lion’s share of search, and is growing every quarter. Advertisers are in the business of monetizing data, and Google has unique access to the data.”

Trust as a Competitive Edge

Google’s competitors are not bashful about trying to exploit those concerns. In late 2008, Microsoft and Yahoo announced their own promises to become better corporate citizens than Google when it came to privacy. Microsoft threw the ball in Google’s court on December 8 by offering to cut the time it keeps user data to the EU-recommended six months—provided that Google and Yahoo did the same. Yahoo, which had the policy of retaining data for thirteen months, then took another shot on December 17 by saying it would cut retention time to just three months, except under limited circumstances. Ask was already offering a service that lets people opt out of any data retention, much as Google does when people download its toolbar option.

This now leaves Google looking like the bad guy, rather than the innovator trying to trim data retention as much as possible while minimizing the impact on the quality of its services. And it sets up a battle: which issue do users care more about—privacy or the quality of their searches?

These issues hurt. The “Don’t be evil” company has rapidly risen to astounding name brand recognition and respect from consumers because of its policies. As Google gets bigger and as publicity over its policies spreads, its reputation is beginning to suffer, and its public stance against evil is becoming a liability.

Two organizations—TRUSTe, an online privacy advocate, and the Ponemon Institute, a think tank dedicated to privacy and data protection issues—have presented an annual Most Trusted Company for Privacy award since 2004. They survey more than six thousand U.S. consumers to collect opinions about which companies people feel are the most trustworthy and which do the best job of safeguarding personal information. From the beginning, Google has consistently ranked tenth or eleventh on the list, except for two anomalies—2005 and 2008—when it dropped out of the top twenty altogether. In 2005 Google was getting a lot of publicity about its plans to enter the market in China. By contrast, Yahoo made it into the top twenty for the first time in 2008, in the fourteenth spot. Since Yahoo has had at least as many troubles as Google on the privacy and censorship front, the likely explanation is that supposedly non-evil Google gets vastly more publicity and criticism.

Fran Maier, CEO of TRUSTe, which monitors companies’ online privacy policies and how well they’re being enforced, acknowledges that the concern is largely fear of the unknown. “With Google, you have a company that pushes the envelope in a lot of different ways. They may be coming up with stuff that a lot of us have never seen before.” Maier also emphasizes that while neither Microsoft nor Google made the top twenty, they are still “way up there” on the list. “A lot of people feel comfortable with both of them.”

Larry Ponemon, the author of the study, told the San Francisco Chronicle that Google and Microsoft are suffering from “big company syndrome.” Simply put, he said, “People figure that if you’re big and collecting data, there must be an issue.”

Google, it seems, is getting too big for the public’s own good, and as a result, is increasingly cast as the big bad wolf. Larry and Sergey do not like the idea of making goodwill gestures (e.g., cutting data retention time to three months) that they do not believe really serve their users’ best interests. They assume that their logic will win out. This puts them in danger of losing the public relations war.

Google has more to lose than any of its competitors if it slips up. Such is the bane of anyone who publicly swears he will never be evil.