Saturday, November 3, 2007

Posting Duplicate Content Must Be Harmful To Our Google Ranking

I am responsible for content development for one Web site that has taken an inexplicable dive in traffic over the past two and a half months, and after scores of conference calls with everyone in the company, from IT to SEO to the head of metrics, we still haven't been able to figure it out. One thing we have been able to isolate is that over the years the magazine changed its domain a couple of times before finally settling on its current URL a few years ago. I rediscovered this Google blog post, "Dealing deftly with duplicate content," today, which has some helpful tips, but I am surprised by the writer's last comment:

Don't worry be happy: Don't fret too much about sites that scrape (misappropriate and republish) your content. Though annoying, it's highly unlikely that such sites can negatively impact your site's presence in Google. If you do spot a case that's particularly frustrating, you are welcome to file a DMCA request to claim ownership of the content and have us deal with the rogue site.


With Google cracking down on all the rogue linking schemes, one can't help but think that maybe this employee isn't in touch with reality. Maybe the post is just out of date. In any event, the other tips are very helpful. Here they are, with a few rough configuration sketches after the list for the robots.txt, 301 redirect, and www/non-www items:

  • Block appropriately: Rather than letting our algorithms determine the "best" version of a document, you may wish to help guide us to your preferred version. For instance, if you don't want us to index the printer versions of your site's articles, disallow those directories or make use of regular expressions in your robots.txt file.
  • Use 301s: If you have restructured your site, use 301 redirects ("RedirectPermanent") in your .htaccess file to smartly redirect users, the Googlebot, and other spiders.
  • Be consistent: Endeavor to keep your internal linking consistent; don't link to /page/ and /page and /page/index.htm.
  • Use TLDs: To help us serve the most appropriate version of a document, use top level domains whenever possible to handle country-specific content. We're more likely to know that .de indicates Germany-focused content, for instance, than /de or de.example.com.
  • Syndicate carefully: If you syndicate your content on other sites, make sure they include a link back to the original article on each syndicated article. Even with that, note that we'll always show the (unblocked) version we think is most appropriate for users in each given search, which may or may not be the version you'd prefer.
  • Use the preferred domain feature of webmaster tools: If other sites link to yours using both the www and non-www version of your URLs, you can let us know which way you prefer your site to be indexed.
  • Minimize boilerplate repetition: For instance, instead of including lengthy copyright text on the bottom of every page, include a very brief summary and then link to a page with more details.
  • Avoid publishing stubs: Users don't like seeing "empty" pages, so avoid placeholders where possible. This means not publishing (or at least blocking) pages with zero reviews, no real estate listings, etc., so users (and bots) aren't subjected to a zillion instances of "Below you'll find a superb list of all the great rental opportunities in [insert cityname]..." with no actual listings.
  • Understand your CMS: Make sure you're familiar with how content is displayed on your Web site, particularly if it includes a blog, a forum, or related system that often shows the same content in multiple formats.
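
The "Block appropriately" tip is the one that applies most directly to our printer-friendly article pages. A minimal robots.txt sketch, assuming the duplicate copies live under a /print/ directory and as ?printer=1 URLs (both paths are made up for illustration):

    # robots.txt at the site root
    User-agent: *
    # Keep crawlers out of the printer-friendly directory
    Disallow: /print/
    # Googlebot also understands * wildcards, so query-string variants can be blocked too
    Disallow: /*?printer=1

One version of each article stays crawlable; the printer duplicates simply never enter the index.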
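
The "Use 301s" tip speaks directly to our own history of domain changes. A rough .htaccess sketch using Apache's mod_alias, placed on the retired domain; example.com and the paths are stand-ins, not our real URLs:

    # Send a single moved article to its new home with a permanent (301) redirect
    RedirectPermanent /archive/old-article.html http://www.example.com/articles/old-article.html
    # Or forward everything on the old domain, preserving the rest of the path
    Redirect permanent / http://www.example.com/

Users land on the new pages, and Googlebot learns the old URLs have moved for good instead of treating the two domains as duplicates of each other.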
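
The "preferred domain" setting lives inside Webmaster Tools, but the same idea can also be enforced on the server so every visitor and crawler ends up on one hostname. A mod_rewrite sketch for .htaccess, assuming Apache and the made-up domain example.com, that 301s the non-www host to the www version:

    RewriteEngine On
    # If the request came in without the www prefix...
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    # ...permanently redirect it to the www hostname, keeping the requested path
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

This is a common companion to the Webmaster Tools setting rather than a replacement for it.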
