EXPERT OPINION
Phantom Privacy
By Joseph Turow

Toward the end of summer, The Wall Street Journal published an article about a woman who’d been let in on a secret that is increasingly a fact of life for everyone who uses the Internet. A company had deployed a tiny file in her computer called a cookie that enabled it to track her movements online, collecting information ranging from her favorite movies to comments typed on various websites. The resulting profile did not contain her name, but was “eerily correct,” as she put it, about many other things—her age and hometown being just the beginning. As such, it allowed the company or its clients to target her with commercial messages based on a set of characteristics so specific it verged on being unique to her.

Marketers defend this kind of tracking by arguing that it makes the commercials people see more relevant to them. I disagree. The supposed relevance of commercials is far outweighed by activities that are sowing the seeds of broad social discrimination in the marketplace, undermining people’s trust that companies with such data will interact with us fairly, and reinforcing a sense that the government cannot protect us when we can’t protect ourselves. Moreover, the claim of anonymity in all this is meaningless.

The emerging world of advertising is dramatically different from the one that came before it. Instead of treating large populations and population segments as audiences, advertisers now expect media firms to deliver to them very particular types of individuals—and, increasingly, particular individuals—with a detailed level of knowledge about them and their behaviors that was unheard of even a few years ago. Special online advertising exchanges, owned by Google, Yahoo, Microsoft, Interpublic, and other major players, allow for the auction of individuals with particular characteristics, often in real time. In fact, through cookie-matching activities, an advertiser can actually buy the right to reach someone on an exchange whom the advertiser knows from previous contacts and is now tracking around the web.

With these activities has come a new vocabulary that reflects potentially grave social divisions and privacy issues. Marketers talk about people as targets and waste—that is, individuals dismissed as not relevant or useful. Increasingly, they offer individuals different products and discounts based on ideas marketers have gleaned about them without their knowledge. These social differentiations are spreading from advertising to information, entertainment, and news, as media firms try hard to please their sponsors.

Marketers also use words like anonymous and personal in ways that have lost their traditional meaning. If a company can follow your behavior in the digital environment—and that potentially includes your mobile phone and your television set—its claim that you are anonymous is meaningless. That is particularly true when firms intermittently add offline information to the online data and then simply strip the name and address to make it “anonymous.”

The business arrangements that use this new language are transforming the advertising and media landscapes. Companies track people on websites and across websites with the aim of learning what they do, what they care about, and whom they talk to. Firms that exchange the information often do keep the individuals’ names and postal addresses anonymous, but not before they add specific demographic data and lifestyle information. Here are just two examples:

There are many great things about the new media environment.
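To make concrete how little is accomplished by stripping a name from a tracking profile, here is a minimal, purely hypothetical sketch in Python. Nothing in it comes from the essay or from any real advertising system; the Tracker class, the example sites, and the sample offline record are invented for illustration. It simply shows the point made above: once a persistent cookie identifier links behavior across sites, deleting the name field leaves a profile that still describes, and can still be used to reach, one particular person.

    import uuid
    from collections import defaultdict

    class Tracker:
        """Hypothetical third-party tracker, for illustration only."""

        def __init__(self):
            self.cookies = {}                  # browser -> persistent cookie ID
            self.profiles = defaultdict(list)  # cookie ID -> pages seen

        def visit(self, browser, site, page):
            # Assign a cookie ID the first time a browser is seen; every
            # later visit on any participating site is logged under it.
            cid = self.cookies.setdefault(browser, str(uuid.uuid4()))
            self.profiles[cid].append(site + page)
            return cid

        def anonymize(self, cid, offline_record):
            # Merge purchased offline data, then drop the name so the
            # result can be marketed as "anonymous".
            record = {k: v for k, v in offline_record.items() if k != "name"}
            return {"cookie_id": cid,
                    "demographics": record,
                    "behavior": self.profiles[cid]}

    tracker = Tracker()
    cid = tracker.visit("her_laptop", "movies.example.com", "/favorites")
    tracker.visit("her_laptop", "news.example.com", "/comments/thread-42")
    profile = tracker.anonymize(cid, {"name": "Jane Doe",
                                      "age": 30,
                                      "hometown": "Anytown"})
    print(profile)  # no name, yet one identifier ties every action to one person

In practice this bookkeeping is spread across many companies and exchanges rather than a single class, but the underlying logic, a durable identifier plus an accumulating history, is what makes the word "anonymous" do so little work here.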
But when companies track people without their knowledge, sell their data without their knowledge or permission, and then decide whether they are, in the words of the industry, targets or waste, we have a social problem. A recent national survey I co-conducted with colleagues at Berkeley Law School showed emphatically that Americans don’t want this type of situation. If it is allowed to fester, and as people begin to realize how it pits them against others in the ads they get, the discounts they receive, the TV-guide suggestions and news stories they confront, and even the offers they receive in the supermarket, they will grow even more disconcerted and angry than they are now. They will further distrust the companies that have put them in this situation, and they will be incensed that the government has not helped to prevent it.

A comparison to the financial industry is apt. Here was an industry engaged in a whole spectrum of arcane practices, not transparent to consumers or regulators, that had a serious negative impact on our lives. It would be deeply unfortunate if the advertising system followed the same trajectory.

We must move from the current marketing regime that uses information with abandon—where people’s data are being sliced and diced to create reputations for them that they don’t know about and might not agree with—to a regime that acts toward information with respect. In such a regime, marketers recognize that people own their data, have the right to know where all their data are collected and used, and should not have to worry when they travel through the media world that their actions and backgrounds will cause them unwanted social discrimination regarding what they later see and hear.

Until recently, I believed that educating the public about data collection and giving them options would be sufficient to deal with privacy issues related to advertising. I have come to realize, though, that Americans don’t have, and will not acquire, the complex knowledge needed to understand the increasing challenges of this marketplace. Opt-out and opt-in privacy regimes, while necessary, are far from sufficient.

To help the public, Congress should recognize that certain aspects of this new world raise serious consumer-protection issues and act with that in mind. One path is to limit the extent of the data, or clusters of data, that a digital advertiser can keep about an individual or household. Some industry organizations resist such suggestions, depicting scenarios of Internet doom if Congress moves forward with privacy regulations regarding digital platforms. But in the face of Americans’ widespread concern about the exploitation of their data, a level regulatory playing field in the interest of privacy will actually have the opposite impact. It will increase public trust in online actors and set the stage for new forms of commercial competition from which industries and citizens will benefit.

Joseph Turow C’72 ASC’73 Gr’76 is the Robert Lewis Shayon Professor at the Annenberg School for Communication. This essay is adapted from testimony he gave to the Senate Committee on Commerce, Science & Transportation in July.
©2010 The Pennsylvania Gazette