Tumblr, Pornography, and Unseemly Arbitrage

Welcome to Marketing BS, where I share a weekly article dismantling a little piece of the Marketing-Industrial Complex — and sometimes I offer simple ideas that actually work.

If you enjoy this article, I invite you to subscribe to Marketing BS — the weekly newsletters feature bonus content, including follow-ups from the previous week, commentary on topical marketing news, and information about unlisted career opportunities. 

Thanks for reading and keep it simple, 

Edward Nevraumont

Tumblr, Pornography, and Unseemly Arbitrage

In mid-August, Automattic — the owners of WordPress — acquired Tumblr, a social networking platform that hosts millions of blogs and other pieces of user-generated content. The price? A paltry $3 million (according to rumors, at least; the actual sum wasn’t disclosed). 

An article from the Wall Street Journal provided some interesting details: 

Verizon Communications Inc. has agreed to sell its blogging website Tumblr to the owner of popular online-publishing tool WordPress.com, unloading for a nominal amount a site that once fetched a purchase price of more than $1 billion. [Emphasis mine]

Shortly after the announcement, a few Marketing BS subscribers asked for my “take” on the deal (and how a company with a billion-dollar valuation ended up on the scrap heap). 

I think that Tumblr “died” for three reasons:

  1. They missed mobile.

  2. They mismanaged the transitions from indie company to Yahoo to AOL to Verizon.

  3. They failed to implement an effective method for filtering out pornographic content. 

In terms of why Tumblr stumbled and how Automattic might turn things around, I don’t have anything particularly insightful to add at the moment.

Instead, I want to consider the ways that pornography — as well as other “unseemly” content on the internet — impacts digital marketing strategies. 

In November 2018, the Apple App Store removed Tumblr from its offerings. The problem? Independent sources discovered that the Tumblr platform contained pornographic content involving children. A Tumblr spokesperson (quoted in CNET) released the following statement:

Every image uploaded to Tumblr is scanned against an industry database of known child sexual abuse material, and images that are detected never reach the platform. A routine audit discovered content on our platform that had not yet been included in the industry database. … Content safeguards are a challenging aspect of operating scaled platforms. [Emphasis mine]

The developer guidelines for the App Store (obviously) prohibit child exploitation. Apple also forbids “Objectionable Content.”

And Tumblr — at that time — featured a broad range of content that could be considered “objectionable,” including NSFW jokes, pornographic images, and erotic fan fiction. In order to return to the App Store, Tumblr concluded they had no other choice — they needed to ban all “Objectionable Content” from their platform. 

Of course, removing objectionable content from a massive user-generated content site was easier said than done. Let’s consider one specific element of “objectionable” content: pornographic images and videos. To prevent pornography from appearing on Tumblr, the company relied on “artificial intelligence” to identify and block any posts that could be considered pornographic.

People in the tech industry often claim, “we can just use AI for that,” without any clear explanation of how that AI system might function. Remember: before AI can perform a task, humans must tell the system what to do — either with explicit rules or with labeled training examples. For Tumblr, that would have meant loading the AI system with precise parameters for which types of content to block. In other words, a human needed to specify a clear definition for “pornography.”


Defining the Undefinable

One of the most famous attempts to define pornography predates the internet. In 1964, the US Supreme Court heard Jacobellis v. Ohio, the case of a movie theater owner convicted of possessing and exhibiting an obscene film (The Lovers). The case considered whether the First Amendment trumped states’ rights to prohibit material that treated “sex in a fundamentally offensive manner.” Ultimately, the Supreme Court sided — by a 6 to 3 vote — with the owner of the movie theater, overturning his conviction. Notably, the nine justices could not agree on guidelines for identifying and restricting “obscene” content. In fact, SIX DIFFERENT opinions were released: four concurring opinions, none of which gained the support of more than two justices, along with two dissenting opinions submitted by the three justices who voted to uphold the conviction. 

Exemplifying the ambiguity of the Court’s view of pornography, Justice Potter Stewart wrote:

I shall not today attempt further to define the kinds of material I understand to be embraced within that shorthand description; and perhaps I could never succeed in intelligibly doing so. But I know it when I see it, and the motion picture involved in this case is not that. [Emphasis mine]

More than 50 years later, Justice Stewart’s quote stands as an often-cited “definition” for pornography: “I know it when I see it.” 

For technological applications, however, that definition is fundamentally problematic: how can an AI system “know it when it sees it”? Programmers would need to either (1) provide the AI system with very clear rules about what constitutes pornography, or (2) implement a machine learning process that trains on a massive trove of images, each one labeled by humans as “porn” or “not porn.” For Tumblr, these tasks weren’t just philosophical ideas about the capabilities of artificial intelligence — they were real-world barriers preventing their return to Apple’s App Store. 
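To make the difficulty concrete, here is a toy sketch of approach (1): a hand-written rule. The skin-tone heuristic below is entirely illustrative (it is not Tumblr’s actual system), but it shows why rule-based filters misfire. A sandstone cave or a cup of milky tea can share a color profile with skin.

```python
def looks_like_skin(r, g, b):
    """Crude RGB heuristic for "skin-toned" pixels (an illustrative
    assumption, not a validated model)."""
    return (r > 95 and g > 40 and b > 20
            and r > g and r > b
            and (r - min(g, b)) > 15)

def flag_as_porn(pixels, threshold=0.4):
    """Flag an image when the share of skin-toned pixels exceeds the
    threshold. `pixels` is a list of (r, g, b) tuples."""
    skin = sum(looks_like_skin(r, g, b) for r, g, b in pixels)
    return skin / len(pixels) > threshold

# A beige-heavy landscape photo trips the rule just as easily as
# anything objectionable: a false positive.
sandstone_cave = [(180, 120, 90)] * 100
print(flag_as_porn(sandstone_cave))  # True
```

Tightening the threshold reduces false positives but lets more real pornography through; loosening it does the reverse. Either way, someone had to pick a number, which is exactly the definitional problem Justice Stewart could not solve.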

What happened next?

In short, Tumblr designed and implemented a system to remove and block pornography from their platform. The process was moderately successful at removing objectionable content, BUT… it also screened out many “false positives” — pieces of content that would not violate Apple’s guidelines. 

The Twitterverse exploded with criticism of Tumblr. Wil Wheaton (Star Trek alum and celebrity “geek”) noted that images of men kissing were being flagged as inappropriate; his argument that Tumblr’s filters contained an anti-LGBTQ bias was widely shared. Several sources (like this Twitter thread) shared the hilariously NON-offensive content that Tumblr’s AI was catching (and banning). Images included a photograph of a cave, cartoons of people playing ping pong, someone drinking tea, and even a re-post of Tumblr’s announcement about the types of content that would be banned.

So, how did Tumblr’s “objectionable content filter” impact their user base?

The results are complicated to assess. 

In the month before Tumblr was pulled from the Apple App Store, they received approximately 521 million pageviews. And what about the month after the pornography ban was implemented? Tumblr’s traffic plunged to 437 million pageviews — a decline of 17%.

Without question, a 17% drop-off in traffic is significant. But the reasons why people abandoned Tumblr are even more significant (at least in terms of understanding the impact on marketing). Here are two possible explanations for the drop-off:

  1. Some people visited Tumblr to view pornography. After that content was intentionally removed by Tumblr, those people left the site — presumably to find pornography elsewhere. 

  2. Some people visited Tumblr to engage with non-pornographic content. After that content was removed by Tumblr’s filters (because of AI error and/or political bias), those people left the site — probably to visit other online communities like Reddit, etc. 

Back in 2016, a team of Italian academics studied the relationship between producers and consumers of “adult content” on Tumblr. Their conclusions were startling: less than 1% of Tumblr’s producers were posting pornography, but that content drove approximately 22% of the consumer traffic. In other words, more than one-fifth of Tumblr’s traffic (intentionally) went to pages with pornographic content. 

Let’s think about the logical relationship between all of these ideas. If (1) approximately 22% of Tumblr’s traffic went to pornographic content, and (2) Tumblr removed (most of the) pornographic content from their site, then couldn’t we conclude that (3) monthly pageviews would drop by approximately 22%?

But that didn’t happen.

Remember that pageviews only declined by about 17%. That’s still a massive drop for a forum-style website, of course, but not as large as we might have predicted. And what about the people who threatened to boycott Tumblr over its (allegedly) discriminatory AI system? Shouldn’t they have pushed Tumblr’s pageviews even further off the cliff? 

Maybe some people who previously visited Tumblr specifically for pornography decided to stick around and explore other content on the site (a digital version of people who “read Playboy for the articles”)? Or perhaps Tumblr gained new users who previously avoided Tumblr due to worries they might inadvertently stumble upon pornographic content (something that happened to almost half of Tumblr’s users)?

How did the 17% drop-off impact Tumblr?

One thing is clear: shedding so many users decreased the popularity of the platform. 

But let’s focus on the financial fallout. And that depends on how valuable a pageview is — specifically a view of pornographic content.

One method to determine this value involves visiting a pornographic website and searching for their advertising rate card. Don’t worry, I’m not going to link to one of those sites! (I’m already curious whether Gmail’s filtering algorithms will decrease the delivery rate for this email.) Here is a portion of a rate card someone shared on Quora:

For US-based traffic, the CPM (the cost per thousand advertisement impressions on a webpage) for a banner ad was 7 cents. Important note: 7 cents is the quoted rate, whereas the actual paid rates tend to be negotiated lower (or capacity goes unused, bringing down the average revenue).

How does 7 cents per thousand impressions compare to rate cards in other verticals? MonetizePros aggregates CPM rates across a wide variety of verticals and placements. They estimate the average display banner on the internet has a CPM of $2.80 — or about 40 times what it costs to advertise on a pornography website.

Let’s try to ballpark what this information might have meant for Tumblr’s bottom line. Accepting the $0.07 and $2.80 figures as accurate, here’s how things might have unfolded: 

  • Losing 17% of their traffic (all of it pornographic content) only cost Tumblr about 0.42% of their revenue. (17% / 40)

  • Since 22% of Tumblr’s original traffic was pornographic content, but they only lost 17% of their pageviews, that means approximately 5% of their traffic switched from pornographic to traditional content. (22% – 17%)

  • Because traditional content monetizes at 40x the rate of pornographic content, Tumblr might have actually increased their revenue by ~4.875%. (5% – (5% / 40))

  • Based on these projections, the net effect would have been a ~4.5% increase in revenue. (4.875% - 0.42%)
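The bullets above can be sanity-checked with a few lines of arithmetic, using the article’s assumed CPMs and traffic shares. Computing against the actual pre-ban revenue base (instead of the percentage-point shortcuts in the bullets) gives a slightly larger number, with the same conclusion:

```python
# Back-of-envelope revenue model for Tumblr's porn ban, using the
# assumed CPMs ($0.07 porn, $2.80 mainstream), a 22% pre-ban porn
# traffic share, and a 17% total pageview decline after the ban.
CPM_PORN, CPM_MAIN = 0.07, 2.80

# Normalize pre-ban traffic to 100 pageview "units".
porn_before, main_before = 22.0, 78.0
revenue_before = porn_before * CPM_PORN + main_before * CPM_MAIN

# Post-ban: porn traffic is gone and total traffic is down 17 units,
# so 5 units of former porn viewers now browse mainstream content.
main_after = main_before + (porn_before - 17.0)  # 83 units, all mainstream
revenue_after = main_after * CPM_MAIN

change = (revenue_after - revenue_before) / revenue_before
print(f"{change:+.1%}")  # +5.7%
```

However you run the numbers, the striking result holds: shedding nearly a fifth of all pageviews plausibly left revenue flat or even slightly up.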

That’s a lot of numbers. If I lost you, here is the summary:

  • A lot of Tumblr traffic was pornographic content. 

  • After Tumblr implemented a ban on obscene content, they lost most of that traffic.

  • Because pornographic content generates very little advertising money, Tumblr’s revenue was barely impacted.


Why doesn’t pornography traffic monetize?

This seems like an odd question to even ask, right? 

We all know why it doesn’t monetize: companies don’t want to be associated with pornography or any other content that is considered “unseemly.”

But we also know that many people view pornography on a regular basis. Data about the percentage of Americans who visit pornographic websites is notoriously unreliable (because most studies use self-reported information… ). Even calculating what percentage of internet content is pornographic can be elusive; although a 30% figure is regularly cited, sources like Statista provide a much lower estimate of 4%. 

I love this quote from apenwarr (the blog of Tailscale CEO Avery Pennarun): 

Someone who works on web search once told me that they already have an algorithm that guarantees the maximum click-through rate for any web search: just return a page full of porn links. (Someone else said you can reverse this to make a porn detector: any link which has a high click-through rate, regardless of which query it's answering, is probably porn.)
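That reversed heuristic can be sketched in a few lines. Everything here (the log format, the thresholds) is an illustrative assumption, not a description of any real search engine:

```python
from collections import defaultdict

def suspect_links(click_log, min_queries=3, ctr_threshold=0.5):
    """Flag URLs with a high click-through rate across many unrelated
    queries. `click_log` is an iterable of (query, url, clicked) tuples."""
    shown = defaultdict(int)     # impressions per URL
    clicked = defaultdict(int)   # clicks per URL
    queries = defaultdict(set)   # distinct queries the URL appeared under
    for query, url, was_clicked in click_log:
        shown[url] += 1
        clicked[url] += was_clicked
        queries[url].add(query)
    return {
        url for url in shown
        if len(queries[url]) >= min_queries
        and clicked[url] / shown[url] >= ctr_threshold
    }

# One URL gets clicked no matter what was searched for.
log = [
    ("cat pictures", "honest.example/cats", True),
    ("cat pictures", "spammy.example", True),
    ("tax forms",    "spammy.example", True),
    ("tax forms",    "irs.example/forms", True),
    ("weather",      "spammy.example", True),
]
print(suspect_links(log))  # {'spammy.example'}
```

The legitimate links get clicked only for their own topic; the suspect link gets clicked regardless of the query, which is exactly the signal the quote describes.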

Even if we can’t pin down specific numbers about the amount of pornography online, we do know that porn is treated very differently than the rest of the internet. On social media platforms, people share intimate details of their lives, yet pornography remains a (mostly) taboo subject. Many people don’t discuss their secret viewing habits with their friends — or even their spouse. 

For a company, advertising on a pornographic website is a risky decision. Risky for how customers might judge the company. And — perhaps even more importantly — risky for how employees and suppliers might react (see Marketing to Employees and Marketing for Investors). 

As such, most of the companies that advertise on porn sites are other porn sites. Ads are also purchased by companies in other “unseemly” categories that don’t need to worry about the reputational stigma of appearing on a porn site (e.g., male enhancement pharmaceuticals, “friend” finders, “adult” video games, torrent websites, spambots, etc.).

For everyone else, advertising on pornographic sites is considered so repugnant that ad space is available at a 97.5% discount.

So…would any mainstream company actually go ahead with that plan? 

When you offer a 97.5% discount, you know that some company will roll the dice. 

In 2017, GrubHub paid $287 million to Yelp in order to acquire Eat24 — an online food ordering service. Just two years earlier, Yelp spent less than half that amount ($134 million) to purchase Eat24. And before the Yelp deal, Eat24 was a bootstrapped startup, trying to find ways to grow with a very limited marketing budget.

Someone at Eat24 discovered the incredibly low amount of money required to buy display ads on pornography sites. Eat24 decided to give it a shot, using specially-designed ads. 

How did Eat24’s experiment fare?

Exceptionally well — at least according to their own description (they chronicled the experience in a blog post that you can find archived on various corners of the web). 

Setting aside the reputational barriers of advertising on pornographic sites, I think most marketers would believe that, “even if the ad space is dirt cheap, the quality of the traffic must be terrible.” For a different perspective, here’s what Eat24 had to say about the quality of the traffic they saw from their ads:

No matter what metric you want to use to define success, our campaign kicked ass all the way across the board. Impressions? Our porn banner ads saw three times the impressions of ads we ran on Google, Twitter and Facebook combined. Click through? Tens of thousands of horngry Americans clicked our ads. Yeah, but did they convert? Psshhh, please. We saw a huge spike in orders and app downloads during the time our ads were live, especially late at night when that insatiable desire for DP (double pepperoni) is at its most intense.

Did we mention the cost? We did? Well, it bears repeating. We were able to achieve the stellar metrics mentioned above all for the low low price of 90% less than what the big guys charge per 1,000 impressions. That’s right, we saved 90%. Nine zero.

With such a low CPM, we were able to maintain a firm and healthy budget for weeks. On other platforms (especially Facebook), we blew through our media spend in a matter of minutes (never happened to us before baby, we swear).

The general consensus is that it’s more expensive to acquire new customers than retain existing ones, but of course, that’s just another convention flipped right on its ass by porn. Of the total traffic generated by our ads, over 90% were first-time visitors to Eat24.com. We were reaching an almost entirely new market. Our porn banners were generating new customers cheaply. And guess what? They were coming back too (they always come back). New customer retention on porn banners was four times higher than that of our Facebook ads.

I’m not quite sure how to quantify metrics like, “Tens of thousands of horngry Americans clicked our ads” and “Psshhh, please. We saw a huge spike in orders… .” In any case, Eat24 seemed to believe that their ads on pornographic sites performed much, much better than any other ads they were running — at 90% lower cost.


The Gray Market

Given Eat24’s success, why doesn’t every company advertise on pornographic sites? Are the unsavory associations really that bad?

Yes. Yes, they most certainly are. 

Consider what happened to Lora DiCarlo, a female-focused health and wellness company. In 2018, the company’s flagship sex toy (designed by and for females), earned a coveted “Innovation Award” from the Consumer Technology Association, alongside products like the Nissan Leaf, Blue Frog’s Buddy the Robot, and earbuds that provide real-time language translation. So far, so good. 

But after Lora DiCarlo submitted their application for exhibitor space at the Consumer Electronics Show (one of the perks of winning an Innovation Award), the CTA stripped the company of both the prize and the booth space, because their product was reassessed as “immoral, obscene, indecent [and] profane.” After the ensuing PR debacle, the CTA eventually reversed their reversal, handing over both the award and an apology. 

The moral of the story: if your company engages with the “gray market” of sex, you increase your exposure to risk and controversy.  

Think about this idea in the context of Eat24. During their time as an independent, bootstrapped company, Eat24 proudly shared stories about their “innovative” strategy of advertising on pornographic websites. After the ($134 million) Yelp acquisition, however, all of Eat24’s associations with pornography vanished from the web. 

If pornography was such a great marketing channel for Eat24, why did they abandon that strategy after the Yelp acquisition? The answer is straightforward: because now the company had more to lose. 

Generally speaking, small firms can afford to take risks. When media outlets profile an emerging company, they usually offer glowing praise. In contrast, coverage of giant corporations often highlights critical flaws and transgressions. That’s how media (and comedy) tends to function: punch up, not down. 

When plucky upstarts like Eat24 advertised on a pornographic site, journalists described the tactic as bold and inventive. On the flip side, when the New York Times revealed that Google had embedded trackers on pornographic websites, privacy analysts berated the company for its intrusive behavior (see: “Bad Companies”).

When Small, Be Different

A fundamental truism in marketing: small companies can get away with things that big companies cannot. Back in 2011, The Economist shared a study that illustrated how negative publicity usually hurt large multinationals, but actually helped small firms. 

The rationale? For massive companies, bad publicity will tarnish their well-established reputation or brand. Small firms, on the other hand, don’t usually have much of a reputation or brand to ruin. Negative publicity could damage their (already insignificant) reputation, BUT it also raises their awareness. The increase in awareness is worth more than any harm resulting from how potential customers learned about the company. 

Small companies (with weak brands) should capitalize on this principle, taking more risks that might offend more people. Media strategist Ryan Holiday wrote a fascinating book (Trust Me, I’m Lying) on these types of strategies. One of my favorites: helping authors organize protests against themselves! (because having someone — or even better, lots of someones — say you are a terrible human is much better for business than having everyone ignore you).

Here’s one of the marketing “rules” I truly believe: there’s a lot of value in “unique marketing channels.” If you can avoid competing with everyone else, you will pay less to acquire your customers. In simplest terms, zig when others zag. Of course, that’s not always so easy — there are probably good reasons why everyone else is zagging. Despite the risks for zigging, your company should consider strategies that your competitors are afraid to try.

A few quick examples of corporate zigging:

  • Eat24 advertised on pornographic websites. 

  • Airbnb developed ways to spam Craigslist. 

  • Groupon stood outside restaurants, telling people to use their app for purchases they were about to make anyway. 

As you could probably guess, all of these practices fell to the wayside after the companies grew larger and more “respectable.”


Final Thoughts

Paul Graham, co-founder of Y Combinator, tells a great story about a method for evaluating the likelihood that entrepreneurs will experience failure or success. The short version: he measured email response time. Successful startup founders (well before they were successful) would often respond to emails in minutes. Unsuccessful startup founders (when the jury was still out) would take days to respond. His theory on why email response time might matter? One of the few advantages startups hold over big multinationals is the ability to move quickly. If startups are not leveraging their ability to move fast, they won’t be able to win on all the other stuff (where the big companies have an overwhelming advantage).

Takeaway for today: I’m not saying that all small companies should be advertising on porn websites.

But maybe?

Keep it simple,

Edward

If you enjoyed this post, I encourage you to click the little heart icon below my bio. Thanks!

Edward Nevraumont is a Senior Advisor with Warburg Pincus. The former CMO of General Assembly and A Place for Mom, Edward previously worked at Expedia and McKinsey & Company. For more information, including details about his latest book, check out Marketing BS.