Say what you will about The Verge, they are, at least, attempting to give shape to Web-based journalism. That’s no mean feat in an era where the more common method is to take whatever shape the Web provides. For some time now, that shape has been determined in no small part by the dictates of SEO—search engine optimization—a series of practices for making your site as visible as possible on search engines like Google.

Well, not search engines like Google so much as Google specifically. As often as not, when people talk about SEO, what they really mean is GEO: Google engine optimization. Granted, there are a few competitors like Bing and Yahoo, but Bing still measures success in terms of losing less money and Yahoo is staking its future less on search and more on providing content. Google is far and away the most important source of search traffic. It’s so important, in fact, that how they handle search has consequences for the Web publishing industry as a whole.

That’s much of what’s at issue with The Verge’s recent beef against The Huffington Post. It started with a showy, long form article published by The Verge, “For Amusement Only: The Life and Death of the American Arcade.” It’s the sort of long form tech article on which The Verge is making its name, in part because so few commercial publishers are investing in long form online. Five days later, The Huffington Post published a page under the same title reprinting The Verge’s own introductory text and linking back to the original article.

That practice, pitched as aggregation, is far more typical of Web publishing, and what happened next indicates why. Over Twitter, The Verge editors requested that HuffPo take down their post. HuffPo editors responded that The Verge had no cause for complaint—if anything, the aggregation helped The Verge by pointing HuffPo readers to their site. And while that may be so, The Verge editors were quick to point out that the HuffPo aggregation had quickly leaped above the original article as the top search result on Google. Less than a week after the original article went live, it had already been pushed off the first page of Google News results for relevant search terms. Which, if you follow SEO logic, may have been the point all along.

It’s simple really. Web publishers finance their sites with ads, and those ads generally pay in proportion to the amount of traffic that moves through the pages where they’re displayed. To the ad sales department at a site like HuffPo, what’s actually on a page next to their ads matters less than the number of people who open that page. The strategy, then, is to deploy the optimal number of pages per day, with each of those pages doing just what it takes to draw in the maximum possible number of browsers.

As it turns out, aggregation is a pretty good strategy for doing that. Compared to researching and writing your own content, aggregating requires an infinitesimal investment of time and effort. Couch it in a few quick paragraphs about how good the aggregated content is and you can just about justify having another page full of ads.

When done well, aggregation can be a useful thing. The nature of the Web is to be both vast and obscure, convenient even as it conceals. In pointing out how much traffic their post sent The Verge’s way, HuffPo was harking back to aggregation’s usefulness as a way of navigating the pocket world of the Web. The problem is that SEO (and particularly GEO) often rewards aggregation practices that harm as much as they help. That’s precisely what happened here. Knowing that it could always invoke aggregation as its justification, HuffPo posted a page optimized to trump the Verge article in Google’s search results, for which they were duly rewarded. Now people who know of the article but not what site published it, or who simply want a history of American arcades, will search for it on Google, get the HuffPo link first, and take a detour through pointless aggregation rather than find the original article directly. That’s good for HuffPo, and bad for everyone else.

It’s bad for Google, first of all, because search, which remains their bread and butter, is all about serving up the most on-point results. When they return aggregation pages rather than the content those pages aggregate, they undermine their own value to the user.

It’s bad for The Verge because it demonstrates once again that they’re swimming against the stream by playing the long form game when GEO promises such big returns on such meager investments. HuffPo may give them an initial boost in traffic the day of the aggregating post, but in the long run HuffPo ends up being a middleman, shepherding readers from Google past their own advertising on the way to The Verge’s article. In other words, HuffPo ends up profiting by putting additional steps between The Verge and its readers.

Finally, it’s bad for the rest of us because it encourages publishers to spend more time passing us from page to page and site to site than they spend on giving us diverse articles worth reading. HuffPo does, after all, produce its own content, but there’s a clear incentive to divert resources to gaming Google’s system at the expense of original reporting. Not only does HuffPo’s aggregation potentially stand in the way of The Verge’s goals; it also keeps HuffPo from being as good as it could be.

BuzzFeed’s John Herrman traced the problem back to the incentive Google provides by rewarding content scraping. He’s right, up to a point, but it’s possible to take that train of thought even further. A straightforward appeal to Google—”choose what’s best for publishing”—only takes us so far. The real problem here is that we’re beholden to Google’s algorithm in the first place.

Even if Google is every bit as benevolent as it likes to present itself, any algorithm-based search engine is bound to have exploitable behaviors. The specifics of Google’s system determine the precise character of the exploits made possible by GEO, but vulnerability to systematic exploitation is a liability of automation, not of anything Google itself has done. Because they automate the process of directing traffic, search engines are an open invitation to exploits like content farming. Google is continually refining its policies and algorithm to counter those exploits, and a big part of that process involves deciding which strategies to tolerate and which to address as abuses. Inasmuch as those decisions have a disproportionate influence on Web traffic, they effectively determine the conditions for success for any site that pays its bills with advertising.

The way that Google decides to rank its search results, in other words, indirectly determines whether a given type of publishing will thrive or just eke by in the competitive jungle of the Web. Right now, it favors aggregation over original, long form reporting, but the underlying problem is not that Google has decided to rank results this way rather than another. The real problem is that Google’s dominance makes certain SEO practices virtually unanswerable. If there were a half dozen search engines with equal market shares, each fielding their own specialized algorithms and policies, GEO wouldn’t matter as much. While HuffPo’s aggregation strategy might give it a ranking advantage on one or two, the others could put the focus back where it belongs. The trick for consumers would be knowing which search engine is best for which type of content.

is the founder and editor-in-chief of Culture Ramp.
— Please submit all corrections, responses and rebuttals as letters to the editor.