A Business Perspective on the Snippet Tax

by Rob Pegoraro on November 19, 2012

If you only read the first sentence or two of this post and give up on the rest, should I blame Google?

That’s the essence of the argument now being advanced by frustrated publishers in Europe and South America. They argue that by providing Web readers with hyper-condensed descriptions of their stories, Google costs them traffic and therefore money.

If this sounds familiar, it should. Back in 2009, some U.S. news organizations, including the Associated Press and News Corporation, busied themselves bleating about the unfairness of selfish search engines.

Yet these companies never took the obvious step of editing the robots.txt files on their sites to ban search engines from indexing them. And within months, the furor faded as they realized they weren’t going to get payments from search engines and had no hope of Congress mandating any.
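For reference, that opt-out amounts to a few lines in a robots.txt file at a site’s root. A minimal sketch of the two options (Google has long honored a separate Googlebot-News user agent, so a publisher can leave Google News while staying in Web search):

```
# Option 1: block every crawler from the entire site
User-agent: *
Disallow: /

# Option 2: block only Google News, leaving Web search results intact
User-agent: Googlebot-News
Disallow: /
```

A publisher would use one stanza or the other; the first removes the site from search results entirely, the second only from Google News.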

Other countries, however, have had narrower definitions of fair use of copyrighted material and legislatures more sympathetic to the media.

So in Germany, then France and Italy, governments have recently responded to publishers’ complaints by considering laws that would expand their copyright to cover even snippets of news stories—in essence, a “snippet tax.” (DisCo’s take on the legal ramifications of this issue appears here.)

In Brazil, 154 newspapers have already opted out of Google News.

I can grudgingly respect publishers who, like those in Brazil, have used their longstanding option to opt out of Google News or block Google or all search engines from even indexing their stories. They’ve employed available tools to carry out a business decision, with the risk that entails. For the Brazilian publishers, there may not be much: They say they’ve only lost five percent of their Web traffic since.

(Disclosure: I’ve spoken at Google events; last year, the company paid me for one appearance.)

But the idea of legislating away your risk by inventing a right to be paid when other sites summarize your work—now that exhibits toxic levels of delusion.

Let’s start with the actual mechanics here. To check what I might see on Google News compared to the original, I searched for “Google News publishers” there; its most relevant result was an insightful column by the New York Times’ David Carr.

Google’s algorithm devoted 139 characters to an accurate summary of that 1,149-word piece:

American publishers eventually decided that the only thing worse than being aggregated by Google News was not being aggregated at all, but …

There is some limit at which a description of another story provides enough detail to free a reader from clicking through—ask any newspaper that has staffers aggregating news reported elsewhere. But most one or two-sentence search snippets get nowhere near that limit.

And few readers who lose all curiosity about a subject after reading such a curt description would have scanned past the headline of the original anyway.

(For that matter, many listings in Google News only feature a linked headline. If you’ve forgotten, let Web inventor Tim Berners-Lee remind you: nobody needs permission to link.)

Then remember another big source of traffic online: If a Google-generated distillation can tell readers everything they need to know, why not a human-written summary on Twitter or Facebook?

Social-media networks haven’t yet surpassed search engines as traffic sources at big sites like the New York Times, but at such smaller operations as Talking Points Memo or The Atlantic, they already send about half as many readers as search sites. When they approach parity, will publishers want to bill them too?

My bet is that some will, and that we’ll be reading heartfelt cries for justice by confused publishers along those lines in no more than three years.
