[This post has been long overdue. Not a rant. A cautionary tale for all Agile bloggers]
AgileScout.com is an Agile software development news site. We try to provide the freshest Agile news we can to the Agile community at large (considering our small team). We do so in several ways:
- Investigative reporting on Agile blogs
- Investigative reporting on Agile news
- Writing unique Agile Guide articles
- Writing about Agile issues
- Reviewing Agile tools, books, and methods
- Reviewing and attending Agile events
- Interviewing Agile practitioners
We build our site with SEO/SEM in mind (Search Engine Optimization and Search Engine Marketing), and we have a team of individuals who know a whole lot about SEO.
We NEVER produce wholesale duplicate content from the web. If we do quote people or blogs, we keep the quotes to a minimum. Why? Because duplicate content is absolutely TERRIBLE for your personal blog or website.
Google, Yahoo, Bing, you name it: these search engines hate duplicate content. Do a quick Google search for “Why duplicate content is bad” and you’ll find out why.
So, what’s the harm in having your copied blog content on many different sites?
“Fact: Google penalizes page rank when it determines that content is duplicated by other sites.” – TopSEMTips.com
“As a result, the ranking of the site may suffer, or the site might be removed entirely from the Google index, in which case it will no longer appear in search results.” – Google.com
“Issues such as duplicate content (intentional or otherwise) can have a profound impact on your search success.” – SearchEngineWatch.com
“Generally, search engine algorithms work pretty well and your original version shows up. However, the system isn’t perfect. Michael Gray recently noted that sometimes Google gets it wrong and shows the version from a more authoritative site, even when that is not the original version.” – NineByBlue.com
Some would say the duplicate content penalty is a myth and that it doesn’t affect your site at all. That’s all well and good. But if there were even the slightest possibility that it could negatively affect your site, wouldn’t you want to cut out that possibility?
Here’s an example:
- I post a very cool new blog entry on my blog.
- A news aggregator site picks up my blog entry and copies and pastes the duplicated content.
- I do a Google search for my blog entry.
- Wait?! What? Why does the offending news site rank higher than my site for MY content?
- Simple. They have a higher PageRank, Alexa rank, Compete rank, etc., than YOU do (if you don’t know what these are, Google them).
- YOU GET DOWN-RANKED IN SEARCHES!
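If you suspect you're being scraped, one quick sanity check is to compare your original text against the suspect page. Here's a minimal sketch using Python's standard-library difflib; the two strings are hypothetical stand-ins for your post and the aggregator's copy (in practice you'd fetch the page's text first):

```python
# Minimal sketch: estimate how much of an original post a suspect page
# has copied, using Python's standard-library difflib.
# The two strings below are hypothetical stand-ins for your blog post
# and the scraped aggregator page.
from difflib import SequenceMatcher

original_post = (
    "Duplicate content can hurt your search ranking. "
    "Search engines may show the copy from a more authoritative site "
    "instead of your original version."
)

suspect_page = (
    "AGGREGATOR DIGEST: Duplicate content can hurt your search ranking. "
    "Search engines may show the copy from a more authoritative site "
    "instead of your original version. Read more Agile news here!"
)

def similarity(a: str, b: str) -> float:
    """Return a 0.0-1.0 ratio of how similar two texts are."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

score = similarity(original_post, suspect_page)
print(f"similarity: {score:.2f}")
if score > 0.8:
    # Threshold is a judgment call; near-verbatim copies score very high.
    print("Likely a near-verbatim copy, worth investigating.")
```

This won't tell you anything about rankings, but it gives you a repeatable way to spot near-verbatim copies before you reach out to the offending site.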
Oh crap! I’m totally getting scraped by these content aggregators! What can I do?
Vanessa from NineByBlue.com tells us that:
“As someone who’s providing your content for syndication, you should then just realize you’re in a competition with your syndication partners for ranking and it’s quite possible they can outrank you.”
She continues on:
“If you are able to, put together a syndication agreement that states they get your content as a benefit for their readers, not as a way to acquire search traffic for that content, then you can keep control of ranking for what you’ve written and they can provide a benefit to their audience. Don’t give away your control of your content.”
Agile content scraping?
Are we off-kilter? Out of line? Let us know in the comments.