It’s heartening to see, in the wake of the Cambridge Analytica revelations, growing skepticism about how Facebook handles data and data privacy. But we should take this opportunity to ask the bigger, harder questions, too — questions about discrimination and division, and whether we want to live in a society where our consumer data profile determines our reality.
In the spring of 2016, a Facebook executive gave a presentation about the success of Facebook’s then-new “ethnic affinity” advertising categories. Facebook had grouped users as white, Black, or Latino based on what they had clicked, and this targeting had allowed the movie “Straight Outta Compton” to be marketed as two completely different films. For Black audiences, it was a deeply political biopic about the members of N.W.A. and their music, framed by contemporary reflections from Dr. Dre and Ice Cube. For white audiences, it was a scripted drama about gangsters, guns, and cops that barely mentioned the names of its real-life characters. From the perspective of Universal Pictures, this dual marketing had been wildly successful. “Straight Outta Compton” earned over $160 million at the U.S. box office.
When we saw this news in 2016, it immediately raised alarm bells about the effect of such categories on civil rights. We went straight to Facebook with our concern: How was the company ensuring that ads for jobs and housing weren’t targeted by race, given that such targeting is illegal under civil rights laws? Facebook didn’t have an answer. We worked with officials from the company for more than a year on solutions that, as it turned out, were not properly implemented. Facebook still makes it possible for advertisers to target based on categories closely linked to gender, family status, and disability, and the company has recently been sued for it.
To make matters worse, the government is actively turning a blind eye. The New York Times reported on Thursday that, under Secretary Ben Carson, the federal Department of Housing and Urban Development dropped its investigation into whether Facebook’s ad targeting system violated the Fair Housing Act. That means that HUD, on the eve of the 50th anniversary of that law, is choosing to put its head in the sand rather than investigate whether civil rights laws have been broken.
It’s not illegal to market “Straight Outta Compton” differently based on race (as opposed to, say, a housing or employment ad). Nonetheless, that tactic creates a distinction among people and treats them differently as a result. And these kinds of distinctions have real-world effects: Think about what it means for white teenagers to see a trailer with yet another image of criminal Black men, instead of hearing Dr. Dre reflect on police brutality in the 1980s and today.
Then magnify that effect many thousands of times. In today’s world, a huge proportion of the advertising and media we see reaches us based on accumulated data about us.
Targeting, of course, enables advertisers — including the ACLU — to efficiently reach particular audiences with messages tailored to them, and that can sometimes be a good thing. But we should also acknowledge what’s lost with that efficiency: people outside the expected audiences won’t see these messages or even know they exist.
Ad targeting can make the world look different to different people. Some find the web full of ads for high-paying executive jobs, while others see mostly ads for sneakers or payday loans. Our news also reaches us and our networks through ad targeting. How can this not have huge implications for our ability to exist in a cohesive society? How can we agree on the policies that should govern our world when there are no common reference points for what that world looks like?
It’s not just foreign interference and voter suppression campaigns that make this kind of targeting so dangerous for democracy.