The Duplicate Content Penalty is one of Google’s least understood policies and Webspam Chief Matt Cutts has taken to YouTube to help clear the air.

Cutts’ video comes in response to the question, “How does Google handle duplicate content and what negative effects can it have on rankings from an SEO perspective?” and is packed with useful information.

For starters, Cutts drops a bombshell by saying as much as 24%-30% of all the content on the Web currently qualifies as duplicate content. He goes on to say that Google realizes not all of that content is spam.

The search giant handles duplicate content by treating multiple, identical items as a single entity and displays only the most relevant version. (Though other versions can be viewed by end-users who feel like adjusting their filters.)

That's a big benefit for end-users, and good news for web publishers who regularly publish quotes and longer excerpts from other published works.

According to Cutts, publishers who use legitimate, contextual duplicate content won't be penalized. Deliberate attempts to goose PageRank with duplicate content, however, are fair game for penalties.

Auto-generated SEO articles with fill-in-the-blank geographic locations are exactly the kind of duplicate content Google does not like, and Cutts singles them out specifically.

So there you have it, directly from the mouth of Cutts himself. Duplicate content that isn’t done in a spammy way is all good with Google.

 

