What is the view on how Google treats content from XML/RSS feeds? Surely this is duplicate content. Why doesn't Google see this as a bad thing?
It would make no sense to build a news feed if it incurred a duplicate-content penalty. Would it?
You can probably find more info here; it should be a common problem for webmasters.
Directly republishing XML and RSS content is kinda shady and most likely illegal – if it's copyrighted, or the author's license forbids that usage. If this content exists elsewhere and has already been noted by the search engines, then yes, you WILL get penalized.
If you distribute the content by parsing the feed with scripts that include a clearly defined source link in the HTML, then the spiders will read through it, see the external link, and won't tag that content as duplicate.
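A minimal sketch of that approach: parse an RSS feed and render each item as an HTML snippet with a visible link back to the original source. The feed XML is inlined here for illustration (the `example.com` URLs are hypothetical); in practice you would fetch it from the publisher's feed URL.

```python
import xml.etree.ElementTree as ET
from html import escape

# Inlined sample feed; normally fetched from the publisher's feed URL.
RSS_SAMPLE = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example News</title>
    <link>http://example.com/</link>
    <item>
      <title>First article</title>
      <link>http://example.com/first-article</link>
      <description>A short summary of the article.</description>
    </item>
  </channel>
</rss>"""

def feed_to_html(rss_xml):
    """Turn RSS items into HTML snippets that credit the original source."""
    root = ET.fromstring(rss_xml)
    snippets = []
    for item in root.iter("item"):
        title = escape(item.findtext("title", ""))
        link = escape(item.findtext("link", ""))
        summary = escape(item.findtext("description", ""))
        # The visible "Source" link is what lets spiders attribute the content.
        snippets.append(
            f'<div class="feed-item"><h3>{title}</h3>'
            f"<p>{summary}</p>"
            f'<p>Source: <a href="{link}">{link}</a></p></div>'
        )
    return "\n".join(snippets)

print(feed_to_html(RSS_SAMPLE))
```

The point is the explicit `Source:` link in every rendered item, not the particular markup around it.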
Don’t take any advice from these guys.
How would you like it if, the very second you post a new article on your site, it was added to your feed and all the social services were pinged to crawl your new content? You would love it.
How would you like it if Google indexed that new article within minutes of you posting it? You would love it, huh?
Get your XML together. This is 2007, you are already late.
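Getting your XML together can be as simple as generating an RSS 2.0 document from your latest articles. A minimal sketch, assuming hypothetical site data (the `example.com` URLs and field names are placeholders):

```python
import xml.etree.ElementTree as ET

def build_rss(site_title, site_url, articles):
    """Build an RSS 2.0 feed; `articles` is a list of (title, url, summary)."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = site_title
    ET.SubElement(channel, "link").text = site_url
    ET.SubElement(channel, "description").text = f"Latest from {site_title}"
    for title, url, summary in articles:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = title
        ET.SubElement(item, "link").text = url
        ET.SubElement(item, "description").text = summary
    return ET.tostring(rss, encoding="unicode")

feed = build_rss(
    "My Site",
    "http://example.com/",
    [("New article", "http://example.com/new-article", "What it covers.")],
)
print(feed)
```

Serve the result at a stable feed URL and regenerate it whenever you publish; that is what the ping-and-index loop described above depends on.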
Now, this opinion is my own, but a lot of internet users want help categorising information and content and having it pushed to them. Build services that support this and you will have visitors to your website continuously.
Good, user-friendly content will drive traffic!
Also write good and interesting subject lines. An interesting headline gets a lot of attention!