
When Big Data Marketing Becomes Stalking

Can data brokers be trusted to regulate themselves?
Image: Woman being followed. Credit: iStock/Alessandro de Leo

SA Forum is an invited essay from experts on topical issues in science and technology.

Many of us now expect our online activities to be recorded and analyzed, but we assume the physical spaces we inhabit are different. The data broker industry doesn’t see it that way. To them, even the act of walking down the street generates a legitimate data set to be captured, catalogued and exploited. This slippage between the digital and the physical matters not only because of privacy concerns—it also raises serious questions about ethics and power.

Last week, The Wall Street Journal published an article about Turnstyle, a company that has placed hundreds of sensors throughout businesses in Toronto to gather signals from smartphones as they search for open wi-fi networks. The signals are used to uniquely identify phones as they move from street to street, café to cinema, work to home. The owner of the phone need not connect to any wi-fi network to be tracked; the whole process occurs without the knowledge of most phone users. Turnstyle anonymizes the data and turns it into reports that it sells back to businesses to help them “understand the customer” and better tailor their offers. In the example the WSJ described, an Asian restaurant learned that many of its customers go to the local gym, so it made workout tank tops emblazoned with the restaurant logo. The shirts might as well have read: “My life is being tracked by big data marketers, and all I got was this lousy T-shirt.”
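The mechanics are worth spelling out. Turnstyle has not published its methods, so the Python sketch below is purely illustrative: the MAC addresses and the salt are invented, and the hashing scheme is only an assumption about how such "anonymization" is commonly done. What it shows is the underlying problem: hashing a phone's wi-fi identifier produces a pseudonym that is stable, and a stable pseudonym is all a sensor network needs to follow a device from place to place.

```python
import hashlib

def pseudonymous_id(mac_address: str, salt: str = "sensor-network-salt") -> str:
    """Derive a stable pseudonym from a wi-fi MAC address by salted hashing.

    Phones broadcast their MAC address in wi-fi probe requests even when
    not connected to any network. Hashing the address is a common way
    vendors claim to 'anonymize' such data, but the same address always
    yields the same hash, so the device stays trackable across every
    sensor that shares the salt.
    """
    digest = hashlib.sha256((salt + mac_address.lower()).encode()).hexdigest()
    return digest[:16]  # a shortened, stable pseudonym

# The same (hypothetical) phone seen at two sensors produces the same
# pseudonym, so its path through the city can still be reconstructed.
cafe_sighting = pseudonymous_id("AA:BB:CC:DD:EE:FF")
gym_sighting = pseudonymous_id("AA:BB:CC:DD:EE:FF")
other_phone = pseudonymous_id("11:22:33:44:55:66")
print(cafe_sighting == gym_sighting)   # same device, same ID
print(cafe_sighting == other_phone)    # different device, different ID
```

The design choice matters: because the pseudonym never changes, "anonymized" is really "pseudonymized," and every sighting of the same phone can be stitched into a movement profile.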

Prominent voices in the public and private sectors are currently promoting boundless data collection as a way of minimizing threats and maximizing business opportunities. But this trend may have quite unpleasant consequences. In another recent example, Mike Seay, a customer of OfficeMax, received a letter from the company that had the words “Daughter Killed in Car Crash” printed on the outside of the envelope following his name. He had not shared this information with OfficeMax. The company stated that it was an error caused by a “mailing list rented through a third-party provider.”

Clearly this was a mistake, but it was a revealing one. Why was OfficeMax harvesting details about the death of a child in the first place? What limits, if any, will businesses set on the use of our data if details like these are deemed fair game? OfficeMax has not explained why it bought this list or how much personal data it contains, but we do know that third-party data brokers sell all manner of information to businesses, including “police officers’ home addresses, rape sufferers, and genetic disease sufferers” as well as suspected alcoholics and cancer and HIV/AIDS patients.

In the absence of regulation there have been some attempts to generate an industry code of practice for location technology companies. Most recently, a coalition headed by the Future of Privacy Forum and including Turnstyle and other retail location marketers released an agreement establishing some benchmarks for consumer privacy. The code suggests that companies de-identify personal data, limit the amount of time it is retained and prevent it from being used for employment, health care or insurance purposes. But the code only requires opt-out consent—that is, giving your details to a central Web site to indicate that you don’t want to be tracked—when the information is “not personal.”

There are three problems with this approach: The first is that almost everything is personal. In the words of computer scientists Arvind Narayanan (Princeton University) and Vitaly Shmatikov (The University of Texas at Austin), “any information that distinguishes one person from another can be used for re-identifying anonymous data.” That includes anonymous reviews of products, search queries, anonymized cell phone data and commercial transactions. The second problem is that the opt-out-via-our-Web-site model compels customers to volunteer yet more information to marketers. Finally, it is far from clear whether industry self-regulation will ever be sufficient. Most industry models of privacy assume that individuals should act like businesses, trading their information for the best price in a frictionless market where everyone understands how the technology works and the possible ramifications of sharing their data. But this model simply doesn’t reflect the reality of the deeply unequal situation we now face. Those who wield the tools of data tracking and analytics have far more power than those who don’t.
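Narayanan and Shmatikov's point can be made concrete with a toy example. The records and names below are invented, and real re-identification attacks are statistical rather than exact joins, but the sketch shows the basic move: an "anonymized" release that keeps quasi-identifiers such as ZIP code, birth year and sex can be linked row by row to a public dataset that carries the same attributes alongside names.

```python
# Hypothetical 'anonymized' release: names are dropped, but
# quasi-identifiers (ZIP code, birth year, sex) are kept.
anonymized_visits = [
    {"zip": "10001", "birth_year": 1980, "sex": "F", "visit": "oncology clinic"},
    {"zip": "10002", "birth_year": 1975, "sex": "M", "visit": "gym"},
]

# Hypothetical public dataset (e.g. a voter roll) with the same
# attributes plus names.
public_records = [
    {"name": "Jane Doe", "zip": "10001", "birth_year": 1980, "sex": "F"},
    {"name": "John Roe", "zip": "10002", "birth_year": 1975, "sex": "M"},
]

def reidentify(visits, public):
    """Link 'anonymous' rows to named rows via shared quasi-identifiers."""
    keys = ("zip", "birth_year", "sex")
    named = {tuple(p[k] for k in keys): p["name"] for p in public}
    return [
        {"name": named[tuple(v[k] for k in keys)], "visit": v["visit"]}
        for v in visits
        if tuple(v[k] for k in keys) in named
    ]

for match in reidentify(anonymized_visits, public_records):
    print(match["name"], "->", match["visit"])
```

When the combination of quasi-identifiers is unique, as it often is for real people, the join recovers exactly who visited where, despite the dataset containing no names at all.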

In the debate over privacy, a narrow focus on individual responsibility is not enough. The scale of the problem far exceeds the individual: it is systemic. We are now faced with large-scale experiments on city streets in which people are forced participants, with no real ability to negotiate the terms and often no knowledge that their data is being collected.

We need a sweeping debate about ethics, boundaries and regulation for location data technologies. This is particularly urgent given that both the private and public sectors are gathering this data, often in concert: this week we learned that spy agencies are extracting personal data from mobile phones through “leaky apps.” An honest discussion will begin with a recognition that the system is now acutely skewed in favor of the data collectors, and that this power imbalance needs to be addressed directly.
