The company previously recognized ProPublica's project as "important." Now it's blocked. Getty Images

Last year, The Stranger announced we were partnering with ProPublica in its effort to daylight the secret targeting choices that create political "dark ads" on Facebook.

Such ads were used by Russians during their operation to influence the 2016 US presidential election, and they've been used here in Seattle's municipal elections, too.

The ProPublica-led project sought help from Facebook users around the country in building one big database containing the targeting choices made by purchasers of American political ads. One hope was that such a database could allow the entire electorate to find out when a particular slice of voters is being targeted for manipulation and disinformation. But now that transparency effort has been halted by a deliberate change that was recently—and rather quietly—implemented by Facebook.

The change, according to ProPublica, "comes a few months after Facebook executives urged ProPublica to shut down its ad transparency project. In August, Facebook ads product management director Rob Leathern acknowledged ProPublica’s project 'serves an important purpose.' But he said, 'We’re going to start enforcing on the existing terms of service that we have.'"

Facebook's rationale is that the ProPublica-designed web browser plugin, which allowed participants all over America to automatically collect and send in the targeting information for Facebook political ads aimed at them, is the kind of plugin that can create information safety issues.

But the reason such plugins were developed by ProPublica and others in the first place is that Facebook does not publicly disclose the specific targeting information behind political ads, even in the company's somewhat new and much-hyped (by Facebook) political ad archive.

When ProPublica recently asked Facebook why it doesn't just include political ad targeting information in its political ad archive, the company said that doing so “could expose people’s information.”

As ProPublica notes, Facebook "didn’t elaborate on how that might happen."

But last year, as Facebook and Google lobbied the Washington State Public Disclosure Commission in an unsuccessful attempt to stop this state from requiring digital platforms to disclose local political ad targeting information, lobbyists for Google offered a different rationale.

Disclosure of political ad targeting information, Google wrote, could "force commercial advertisers to reveal confidential, strategic information about campaigns' operations."

Political campaigns, the company warned, might prefer to just stop buying online political ads altogether "rather than face public disclosure of their confidential, campaign strategies."

In other words, the people Google was expressing concern about were customers—specifically, those customers who pay for political ads.

Perhaps when Facebook now expresses concern that sharing ad targeting details "could expose people's information," it has this same relatively small group of people in mind.