Advertising, Propaganda, and De-radicalization
Good morning everyone,
Since Marketing BS launched last spring, the number of subscribers has grown steadily each month. Thank you to everyone who shares these newsletters with friends and colleagues; I really appreciate the support.
— Edward
Countering Hate with Google Ads
In last week’s post, I critiqued a New York Times article that exaggerated the dangers of location tracking. Like many other media outlets, the Times regularly highlights technology’s negative impact on the world. A frequent topic of their coverage: the internet’s role in radicalizing people toward violent extremism.
At the end of December 2019, the Times profiled Moonshot CVE, a London-based startup that is using online strategies as a positive force, encouraging tolerance and goodwill. The article provides the broader context:
Amid an upsurge in violent hate attacks, federal law enforcement agencies and other groups have been scrutinizing online activity like internet searches to counteract radicalization.
Now a private start-up company has developed an unusual solution based on ordinary online marketing tools.
Moonshot uses a technique called the Redirect Method. Here’s how it works: when people search Google for extremist topics, the organization places (paid) ads for websites that counter violent and discriminatory ideologies. For instance, if a person searched for “KKK membership,” they might encounter:
a clip from the movie “American History X,” whose white supremacist central character undergoes a transformation after befriending a black man in prison. Former white supremacists have credited the movie with subverting their worldview.
With a database of more than 20,000 terms, Moonshot’s algorithms recognize search combinations that might indicate a person’s level of radicalism or propensity for violence. A search for “Hitler” would pass through the filter, but a search term such as “Hitler is a Hero” would trigger Moonshot’s tools.
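As a rough illustration of how that kind of trigger filter might work, here is a minimal sketch. The phrase list and the exact-match rule are hypothetical; Moonshot’s actual system is larger and more sophisticated.

```python
# Hypothetical sketch of a Redirect-Method-style trigger filter.
# The phrase list and matching rules are illustrative, not Moonshot's.

TRIGGER_PHRASES = {
    "hitler is a hero",   # glorification -> trigger counter-messaging ad
    "kkk membership",     # recruitment intent -> trigger
}

def normalize(query: str) -> str:
    """Lowercase and collapse whitespace so matching is consistent."""
    return " ".join(query.lower().split())

def should_redirect(query: str) -> bool:
    """Return True if the search query matches a trigger phrase.

    A bare topic ("Hitler") passes through the filter; only phrases
    that signal endorsement or intent trigger the redirect ad.
    """
    return normalize(query) in TRIGGER_PHRASES

# should_redirect("Hitler")           -> False (research, not endorsement)
# should_redirect("Hitler is a Hero") -> True
```

A production system would use fuzzier matching (misspellings, word order, multiple languages), but the core idea is the same: the trigger is the combination of terms, not the topic itself.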
The Moonshot mission is laudable. But will their strategy work?
The RAND Corporation published a report evaluating tactics like the Redirect Method. Co-author Todd C. Helmus stated, “They are applying what commercial marketers do every day, putting Google Ads in front of people. The innovation is applying that toward extremism.”
The conclusion of the report, however, notes that “The potentially highly radical nature of the Redirect Method’s target audience makes evaluation of the campaign particularly complicated.” In other words, the jury is still out.
What can we — as marketers — learn from the world of digital propaganda?
Selecting Friends and Electing Presidents
One important preface for this section: if you’re not familiar with the concept of psychographics, it’s a research strategy for describing consumers in terms of values, opinions, and attitudes. In contrast to demographics, which categorizes WHO people are (in terms of age, sex, etc.), psychographics attempts to understand WHY people will behave in certain ways. Given the popularity of segmentation in modern marketing, many companies are trying to build psychographic profiles of their customers.
Even people who ignore political news have heard the name “Cambridge Analytica,” though they may not be able to explain its purpose or its impact. A number of high-profile media pundits, as well as a popular Netflix documentary, advanced the theory that Facebook advertising — directed by Russian interests — swung the election in Donald Trump’s favor. In most of the media coverage, Cambridge Analytica is depicted as a Rasputin-type organization, with dark powers and significant influence.
But how successful were their political marketing tools?
Let’s trace the story of Facebook info and Cambridge Analytica:
Aleksandr Kogan, a data scientist at the University of Cambridge, developed an app that generated psychographic profiles for people, based upon analysis of their “Big Five” personality traits.
Using the answers that people submitted to online quizzes, Kogan attempted to predict their connections to friends and places, along with preferences for things like TV shows.
To build a dataset for the project, Kogan gathered personal information from approximately 250,000 people (who were paid $1 to $2 to participate in various surveys).
Many survey respondents used Facebook logins, which allowed Kogan to access data about their Facebook friends. You might be surprised that this process did NOT breach Facebook’s rules (the practice is now prohibited).
In total, Kogan collected information about more than 50 million Facebook accounts.
Kogan sold the data to Cambridge Analytica, a British consulting firm (an action that DID violate Facebook’s terms of service).
By cross-referencing various datasets, Cambridge Analytica transformed the demographic information of 50 million Facebook accounts into psychographic profiles for approximately 34 million Facebook users.
In turn, Cambridge Analytica shopped the information to political organizations, claiming it could influence voting behavior with targeted ads.
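For readers unfamiliar with Big Five scoring, here is a toy sketch of how quiz answers become a trait profile. The items, the 1-5 scale, and the simple averaging rule are illustrative only; the actual instruments and models Kogan used were far more elaborate.

```python
# Toy sketch of Big Five trait scoring from quiz answers.
# Each trait maps to a list of 1-5 item responses; the profile
# is just the per-trait average. Real instruments are validated
# and much larger.

def score_big_five(answers: dict[str, list[int]]) -> dict[str, float]:
    """Average each trait's item responses into a profile score."""
    return {trait: sum(items) / len(items) for trait, items in answers.items()}

profile = score_big_five({
    "openness":          [4, 5, 3],
    "conscientiousness": [2, 3, 2],
    "extraversion":      [5, 4, 4],
    "agreeableness":     [3, 3, 4],
    "neuroticism":       [1, 2, 2],
})
```

The controversial step was not the scoring itself but the next one: correlating these quiz-derived profiles with the Facebook data of respondents’ friends, who never took the quiz at all.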
So how much would someone pay for a crystal ball into the minds of voters?
The presidential campaign for Texas Senator Ted Cruz spent more than $6 million on contracts with Cambridge Analytica, who claimed its research could identify voter preferences with high degrees of accuracy. That information, Cambridge asserted, could be used to generate targeted Facebook ads, television commercials, and more. In the aftermath of Cruz’s loss to Donald Trump, people from the campaign stated that Cambridge Analytica’s psychographic data was unreliable. As reported by the New York Times:
In one early test, more than half the Oklahoma voters whom Cambridge had identified as Cruz supporters actually favored other candidates. The campaign stopped using Cambridge’s data entirely after the South Carolina primary.
After Cruz dropped Cambridge Analytica, the agency started working for Donald Trump, now the Republican nominee. Very quickly, members of Trump’s team formed perspectives that matched the ones from the Cruz campaign: Cambridge’s psychographic strategy was unimpressive. From the Times:
Tests showed Cambridge’s data and models were slightly less effective than the existing Republican National Committee system, according to three former Trump campaign aides…
Trump aides … said Cambridge had played a relatively modest role, providing personnel who worked alongside other analytics vendors on some early digital advertising and using conventional microtargeting techniques. None of those efforts involved psychographics. [Emphasis mine]
Despite criticisms from the campaign teams, Cambridge Analytica paraded their success in every possible forum. Then-chief executive Alexander Nix boasted that their work elevated Ted Cruz from obscure candidate to runner-up. More boldly, Cambridge took credit for their role in the election of President Trump:
In postelection conversations with potential clients, Cambridge has promoted itself as the brains behind Mr. Trump’s upset victory. One brochure circulated to clients this year, which details Cambridge’s expertise in behavioral targeting, also calls the company’s “pivotal role” in electing Mr. Trump its “biggest success politically in the United States.”
What can marketers learn from the Cambridge Analytica scandal?
Media stories portray Cambridge Analytica as a cabal of evil geniuses. Let’s ignore political morality and focus on the marketing techniques. The bottom line: Cambridge excelled at the sales pitch but failed at the execution. Their record provides more evidence of incompetence than genius.
The Russians are Coming
Are you surprised that Cambridge Analytica was selling sizzle instead of serving steak? For political researchers, Cambridge’s inability to identify and woo voters lined up with their own studies. Back in 2018, Joshua Kalla and David Broockman analyzed the impact of advertising on election results. Their core conclusion was stunning: “the best estimate of the effects of campaign contact and advertising on Americans’ candidate choices in general elections is zero.” In other words, political advertising is far, far less effective than most of us realize.
But we don’t need a scholarly study to prove that point; the truth is staring us in the face. Here is a recent tweet from Nate Silver, one of the nation’s leading (and most controversial) pundits on the use of data to predict election results:
Who are the biggest ad spenders in the Democratic primary? The two billionaires: Tom Steyer and Michael Bloomberg.
So far, Steyer has outspent the rest of the field combined (excluding Bloomberg). Since Bloomberg officially entered the race last November, he has been spending 2-3x more than Steyer. And yet, the latest national poll shows Bloomberg in fifth place, with the support of 5.8% of Democratic voters. Steyer fares even worse: ninth place, with a paltry 2.2%.
Massive amounts of television advertising allowed these billionaires to enter the ring, but the money does not seem to be helping them land any punches.
If advertising — both television and digital — is not effective at shifting voting patterns, are there ANY situations where ad campaigns can shape political opinions?
The Washington Post examined the Russian influence in Kyrgyzstan, a former Soviet republic with a majority population of non-denominational Muslims. The study compared the attitudes of Kyrgyzstanis who regularly watched Russian language TV programs versus those who rarely watched Russian language TV. (Important point: the vast majority of Russian television is state-controlled, espousing pro-government sentiments). The results of the study were fascinating:
Kyrgyzstanis who watch Russian television every day are roughly twice as likely to express an opinion of Russian political institutions as those who watch it only once or twice a year.
Let’s stop there for some context — about 60% of Kyrgyzstanis speak Russian. Presumably, Russian-speaking Kyrgyzstanis make up the vast majority of the audience for Russian language television. That fact indicates significant selection bias in this methodology. You should expect that people who watch Russian TV (and speak Russian) are more likely to express an opinion about Russian politics than people who do not watch Russian TV (and do not speak Russian). That said, the rest of the study offers some insight:
What about efforts to get Kyrgyzstanis to think highly of Russia and its policies? We found Russian media was more effective in influencing their perceptions about topics they knew little about personally than topics they knew well from personal experience. For instance, respondents who often watched Russian television were less likely to have a positive view of the United States and more likely to blame the West for the conflict in Ukraine — two narratives frequently promoted by the Russian media. But Russian TV had no effect on Kyrgyzstanis’ opinions about Russia’s social protections for its residents. That’s because many Kyrgyz citizens have either traveled to Russia to work, or know a family member or neighbor who has.
In short, the more remote the topic, the more influential the foreign propaganda.
In this case, Russian language television was successful on two fronts: (1) raising awareness, and (2) influencing opinions about unfamiliar topics. However, the propagandistic tone of Russian television was unable to shift people’s opinions about familiar topics — the things people learned from personal experience, rather than TV.
The term “propaganda” carries a negative connotation, but government advertising can be used to effect positive change. Consider a research paper by Arthur Blouin and Sharun Mukand, analyzing the impact of radio propaganda on ethnic attitudes in post-genocidal Rwanda:
We exploit[ed] variation in exposure to the government’s radio propaganda due to the mountainous topography of Rwanda. Results … show that individuals exposed to government propaganda have lower salience of ethnicity, have increased interethnic trust, and show more willingness to interact face-to-face with members of another ethnic group.
I think this experiment was ingenious. To counter selection effects, they used a case study where natural factors — as opposed to language — determined people’s exposure to media. Depending on the location of mountains and radio towers, individual people received more or less messaging. This ground-breaking study confirmed that “ethnic identity can be manipulated by governments.”
So does propaganda work or not?
Some of these examples seem to contradict each other. Tom Steyer can spend billions on advertising, but cannot manage to shift people’s opinions of marginally different candidates. A much smaller spend in Rwanda, though, can significantly influence people’s view about ethnic attitudes in a post-genocidal country.
So which example is correct? Does propaganda change minds or not?
In order to understand how both of these ideas can be true, let’s step back and reflect on some general principles of advertising.
What can advertising actually accomplish?
It makes you AWARE of a product, service, or idea. (Have you heard of a restaurant called McDonald’s?)
It helps you CONSIDER a product, service or idea. (Have you thought about eating at McDonald’s for breakfast?)
It EXPOSES you to a product, service, or idea. The “exposure effect” is well documented — the more we are exposed to something (all things being equal), the greater our affinity for the thing. (If you’re going for lunch, you have many options. You might choose McDonald’s if you have been exposed to more of their ads than any of their rivals.)
For all three of those functions, there is one key idea: advertising works ON THE MARGIN. For example, no one sees their first commercial for The Church of Jesus Christ of Latter-day Saints and then drops everything to become a Mormon.
To maximize the impact of your advertising, you usually try to entice moderate users to consume moderately more. If a person hates sugary drinks, then no amount of Coke advertisements will get them to buy Coke. Conversely, if a person drinks so much Coke every day that they pour it into their breakfast cereal, then more Coke ads are unlikely to increase their consumption. Effective ads for Coke might nudge people who drink a can every 9 months to increase their frequency to one can every 8 months.
One can every 8 months?! That doesn’t seem like effective advertising! But think about the aggregate: moving a customer from one can every 9 months to one every 8 months is a 12.5% increase in their rate of Coke purchases. Imagine achieving those results across a significant segment of customers. Your opinion of that advertising campaign would shift for the positive.
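You can check that marginal-lift arithmetic directly: moving from one can every 9 months to one every 8 months multiplies the monthly purchase rate by 9/8.

```python
# Marginal lift from shifting purchase frequency:
# one can every 9 months vs. one can every 8 months,
# expressed as a monthly purchase rate.
before = 1 / 9   # cans per month at the old frequency
after = 1 / 8    # cans per month at the new frequency

lift = after / before - 1   # relative increase in purchase rate
print(f"relative increase: {lift:.1%}")  # -> 12.5%
```

A one-month change in a nine-month habit sounds trivial, but as a percentage of revenue from that customer, it is a meaningful lift.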
Let’s return to Moonshot, the company that uses conventional online marketing strategies to counter violent extremism. Can we predict their likelihood for success?
The good news: they are thinking about impacts on the margin. From the New York Times profile:
The idea is not to berate the adherents of extremist ideology, but to help them change their minds themselves.
The bad news: they might encounter two substantial challenges.
Moonshot’s algorithms target people searching Google for extreme terms like “Hitler is a Hero.” Remember: we know that Coke ads are not effective at persuading people who detest soft drinks. Like any mindset, racism exists on a spectrum, ranging from subtle discrimination to overt calls for violence. De-radicalization would probably prove more effective at encouraging a person to abandon their racist sports stereotypes than at convincing a person to throw away their KKK hood and join the NAACP.
The ads are “below the line.” Only select people will ever see the Moonshot-sponsored ads (and many of those people realize they are among the small number of internet users seeing the ads). This strategy is drastically different from the radio ads disseminated by the Rwanda government. The experiment used mass advertising to not only inform people that ethnic unity was positive, but to also make them aware that other people thought ethnic unity was positive. The implication of the ad campaign was clear: if you abandon your old ethnic biases, you will develop new friendships. Search ads provide the opposite message.
As the experiment in Kyrgyzstan showed, propaganda can increase awareness and influence opinions when someone does not hold an existing opinion. But using words and images to change people’s current attitudes, let alone their fundamental beliefs, is very, very difficult. If anyone tries to convince you they created a “secret sauce made from psychographics,” you should remain skeptical. Instead of evil geniuses, they’re more likely to be evil con artists (or evil incompetents). If you sign up with a firm like Cambridge Analytica, they aren’t going to trick your potential customers — they are just going to trick you.
Keep it simple,
Edward
If you enjoyed today’s newsletter, I encourage you to click the little heart button below the “Subscribe Now” box. Thanks!
Edward Nevraumont is a Senior Advisor with Warburg Pincus. The former CMO of General Assembly and A Place for Mom, Edward previously worked at Expedia and McKinsey & Company. For more information, including details about his latest book, check out Marketing BS.