Let's start with a quick Google experiment.
On a computer you normally use, click on this link. Even if you're not logged in to a Google account, you should see a basic profile Google has built around you based on your search history: an age range, gender, a few categories of interests. It's all pretty innocuous stuff, right?
But Google has hundreds of data points, which the company uses to build much more detailed profiles of its users — though most of the Google-using public doesn't know what that profile looks like. The question is: should we?
Ryan Paul, an editor at technology website Ars Technica, says Google scans all of your email if you're a Gmail user and tracks all of the videos you watch if you're a YouTube user — but that's not all.
"This is really just part of what they track, because they're also collecting a lot of data on the back end, including search queries, which they keep for 18 months," he says.
Google, whose unofficial corporate motto is "Don't Be Evil," mainly uses this data for targeted ads. But this past week, it announced changes to its privacy policy in order to merge user data across its various services: Gmail, YouTube, Google Calendar, Google+ and Google search. According to the company, it's so it can create "a beautifully simple, intuitive user experience across Google."
The changes go into effect March 1, and you can choose to opt out just as you can currently opt out of Google's personalized browsing, but doing so will limit the usefulness of many of Google's services and even of Android phones, which run Google's operating system.
But Google insists there is nothing to fear and the company will protect your privacy.
"There is a lot of different ways ... that we can use information to improve your services," says Rachel Whetstone, Google's Senior Vice President for Public Policy. "Whether it's better spelling correction, whether it's enabling you to add things from your Gmail to your calendar, whatever it might be, it's all really about you and it's about your information."
Google Vs. Congress
Google's new policy is getting a lot of negative attention on Capitol Hill, in part because the company now allows kids as young as 13 to sign up for its services. This means Google can, in theory, build a profile of a user over several decades.
This worries Rep. Ed Markey of Massachusetts, a senior Democrat on the House Subcommittee on Telecommunications and the Internet. He tells weekends on All Things Considered host Guy Raz that what he finds most objectionable is that users don't seem to have much say in all this.
"It is imperative that users will be able to decide whether they want their information shared across the spectrum of Google's offerings, even if Google thinks they can make a profit by doing that," Markey says.
Apart from coverage on tech blogs and websites, there hasn't been much public outrage over Google's privacy policy change. But Markey says that's because the public doesn't yet fully understand the policy.
"I think there's a ticking time bomb here of public outrage about what Google and other companies are trying to do," he says. "Part of this debate is getting it elevated."
Markey says he is also concerned that Google is not giving parents the option of protecting their children from having their data used to market to them.
Markey and Rep. Joe Barton, R-Texas, are working to bring attention to the issue.
"I think once we win on children, we'll win on the other issues," he says.
The Google Ecosystem
Following Google's privacy policy announcement, Chris Dawson, a contributing editor at ZDNet and an admitted Google junkie, wrote about how he's ready for the company's new policy. His closing statement: "Welcome to 2012, folks."
Dawson tells NPR's Raz that he has chosen to opt in.
"I've chosen to use the Google ecosystem in a lot of ways, and I find that it makes my life easier," Dawson says.
For Dawson, a recent search for electric guitars triggered targeted ads on other sites he visited, which led him to a better deal. That's pretty simple, he says, but it can ultimately lead to greater things when integrated with Android phones and location-based GPS services.
"Google knows that every Friday night I search for a pizza place," he says. "So I'm hoping that [eventually] on my Android phone my GPS will trigger an alert when I happen to be driving past a well-reviewed pizza place that just happens to have a Google offer available."
That sounds good in theory, but Lori Andrews, author of I Know Who You Are and I Saw What You Did, tells Raz that there's a danger in assumptions made about a person based on what he or she does on the Web.
"It may turn out that if other guitar players renege on paying off their credit cards, the next time Chris searches for a credit card, he won't be able to get one at a good rate," Andrews says.
Andrews points to another problem with targeted ads: when someone searches for a drug with which to commit suicide, or discusses it in a Google chat, ads offering deals on that drug are often displayed, much like Dawson's guitar deals, and that could actually encourage the harmful behavior.
Google has told its users it will not sell their data to the highest bidder, but what if that were to suddenly change? Markey says he doesn't trust Google, saying there "is no basis for that trust."
Dawson disagrees, and says there is a basis to trust Google: money.
"This is what actually gives me some faith in Google — capitalism," he says. "Google has done a pretty good job of being this trusted broker, and the minute Google steps aside from that role ... they lose that trust and then they lose the ability to sell ads."
Selling ads is how Google has generated billions of dollars in profit. And with that much riding on trust from its users, let's hope it will be enough to keep Google from "being evil."