- Facebook and Google employees helped promote anti-Islam ads to voters in swing states. The ads violate the norms of political advertising you’d see in traditional media. Big tech platforms that spread political advertising need to fix the problems now and be more transparent.
As the story of ad abuse on Facebook, Google, and Twitter during the US 2016 election has unfolded, there’s been one common theme: Russian bots, fake accounts, and other shadowy actors took advantage of automated online advertising platforms to influence American voters.
But with the latest revelation, Facebook and Google can no longer hide behind the convenient excuse that everything that happened was the algorithm’s fault.
According to a Bloomberg report Wednesday, Facebook and Google employees worked with the conservative nonprofit group Secure America Now to showcase ads that contained anti-Islam rhetoric.
One of the ads that ran, according to Bloomberg, was the one you can watch below. It shows an alternative reality where France is run by Shariah law. The ad was targeted to people in swing states during the 2016 US election, the report says.
It’s unclear if all the ads from the group were similarly over-the-top. But this ad weakens the argument that big tech platforms are just dumb vessels for other groups’ messages. Google and Facebook consciously decided this was appropriate content to accept payment for and to help promote. Beyond the question of whether political ads need to be regulated on tech platforms, we’re now faced with questions about what Google and Facebook’s standards are when working with a group that wants to use their powerful tools to target voters.
Is hate speech OK? Is fear mongering OK? We have no idea what Facebook and Google’s standards are beyond the fact that Secure America Now and the media company it hired, Harris Media, were willing to pay up.
Until now, one of Facebook and Google’s biggest defenses was that the ad abuse on their platforms was happening under their noses, and that the kinds of ads specifically designed to promote polarizing topics came from fake accounts and other malicious forces trying to game the system.
Innocent bystanders or accomplices?
Wednesday’s Bloomberg report blows a hole in that argument, showing that the very heart of Facebook and Google’s ad businesses can cater to anyone with enough money to spend, no matter what the message may be. It’s hard to imagine a clearer sign that Big Tech doesn’t care what runs on its platforms than literally helping spread such distasteful messages when paid to do so. Facebook even helped Secure America Now experiment with new video ad formats it was testing, according to the report.
Google told Bloomberg it removed some of Secure America Now’s ads, and Facebook executive Andrew Bosworth told me on Twitter that employees didn’t work on creative for the ads. But they still helped target and promote the ads on their respective platforms, which hardly makes them innocent in spreading such vile messages.
To be clear, this is different from Facebook “embedding” members of its ad team with the Trump campaign last year to help it target ads better. As my colleague Mike Shields pointed out last week, that’s a normal service Facebook has offered political campaigns from across the political spectrum.
Instead, this is an example of Facebook and Google catering to an advertiser to promote and target ads that easily fall outside political advertising norms in traditional media.
The threat of regulation for political advertising still looms large for Facebook, Google, and other tech companies. Senator Mark Warner is expected to present a bill on that very matter soon. Facebook has already promised to be more transparent about political advertising on its platform, and Google and Twitter have indicated they’re willing to make changes too.
Now we know they have to take it a step further and be transparent about their level of involvement spreading the same type of polarizing messages previously attributed to Russian hackers and bots.