I began work on a site earlier this year that had an excellent photo gallery, but hadn't yet explored any SEO tactics to promote it. My first instinct was to look at getting the images onto one or more of the social bookmarking sites--specifically Flickr, because another site I had worked on in the past had gotten a lot of exposure when we posted our images there.
One of the questions I got about the images users would be shown if they typed in a keyword like Wall Street or Microsoft Windows was "How do you know what kind of image is going to come up?" The truth is that there is a lot of randomness. A hundred images can be tagged "Wall Street," but of those, about 50-60 are probably not relevant to the user's search and another 20 or so are of such poor quality that they shouldn't have been uploaded in the first place.
But now there is PicURLs, a site that aggregates photos from the nine most popular photo social bookmarking sites and displays the most popular images from each, so that only quality photographs and well-tagged images are shown. This only works if users tag photos well and appropriately. But I have faith in the users of social media. While some tag photos indiscriminately to gain exposure, many understand how important good tags are to the user experience. And that is why I think we will be seeing a lot more from PicURLs, and other aggregators like it, in the days to come.
Thursday, November 8, 2007
Saturday, November 3, 2007
Posting Duplicate Content Must Be Harmful To Our Google Ranking
I am responsible for content development for one Web site that has taken an inexplicable dive in traffic over the past two and a half months, and after scores of conference calls with everyone in the company, from IT to SEO to the head of metrics, we still haven't been able to figure it out. One thing we have been able to isolate is the fact that over the years the magazine changed its domain a couple of times before finally settling on its current URL a few years ago. I rediscovered the Google blog post "Dealing deftly with duplicate content" today, which has some helpful tips, but I am surprised at the writer's last comment:
Don't worry be happy: Don't fret too much about sites that scrape (misappropriate and republish) your content. Though annoying, it's highly unlikely that such sites can negatively impact your site's presence in Google. If you do spot a case that's particularly frustrating, you are welcome to file a DMCA request to claim ownership of the content and have us deal with the rogue site.
With Google cracking down on all the rogue linking schemes, one can't help but think that maybe this employee isn't in touch with reality. Or maybe the post is just out of date. In any event, the other tips are very helpful. Here they are:
- Block appropriately: Rather than letting our algorithms determine the "best" version of a document, you may wish to help guide us to your preferred version. For instance, if you don't want us to index the printer versions of your site's articles, disallow those directories or make use of regular expressions in your robots.txt file.
- Use 301s: If you have restructured your site, use 301 redirects ("RedirectPermanent") in your .htaccess file to smartly redirect users, the Googlebot, and other spiders.
- Be consistent: Endeavor to keep your internal linking consistent; don't link to /page/ and /page and /page/index.htm.
- Use TLDs: To help us serve the most appropriate version of a document, use top level domains whenever possible to handle country-specific content. We're more likely to know that .de indicates Germany-focused content, for instance, than /de or de.example.com.
- Syndicate carefully: If you syndicate your content on other sites, make sure they include a link back to the original article on each syndicated article. Even with that, note that we'll always show the (unblocked) version we think is most appropriate for users in each given search, which may or may not be the version you'd prefer.
- Use the preferred domain feature of webmaster tools: If other sites link to yours using both the www and non-www version of your URLs, you can let us know which way you prefer your site to be indexed.
- Minimize boilerplate repetition: For instance, instead of including lengthy copyright text on the bottom of every page, include a very brief summary and then link to a page with more details.
- Avoid publishing stubs: Users don't like seeing "empty" pages, so avoid placeholders where possible. This means not publishing (or at least blocking) pages with zero reviews, no real estate listings, etc., so users (and bots) aren't subjected to a zillion instances of "Below you'll find a superb list of all the great rental opportunities in [insert cityname]..." with no actual listings.
- Understand your CMS: Make sure you're familiar with how content is displayed on your Web site, particularly if it includes a blog, a forum, or related system that often shows the same content in multiple formats.
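The "Block appropriately" tip above can be sketched in a robots.txt file. The /print/ path here is a hypothetical example, and note that wildcard patterns are a Googlebot extension rather than part of the original robots.txt standard:

```
# robots.txt -- keep printer-friendly duplicates out of the index
# (the /print/ directory is a hypothetical example)
User-agent: *
Disallow: /print/

# Googlebot also understands wildcard patterns (a non-standard extension):
User-agent: Googlebot
Disallow: /*?print=
```

Because other crawlers may ignore the wildcard form, it is safest to also organize duplicate versions under a directory you can disallow with a plain path prefix.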
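And the "Use 301s" tip can be sketched in an .htaccess file, which is especially relevant to a site that has changed domains a couple of times. This assumes Apache with mod_alias and mod_rewrite enabled; all paths and domain names are placeholders:

```
# .htaccess -- 301 ("RedirectPermanent") sketches; paths/domains are placeholders

# Redirect a single restructured path (mod_alias):
RedirectPermanent /old-gallery http://www.example.com/photos

# Redirect an entire retired domain to the current one (mod_rewrite):
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The mod_rewrite form preserves the requested path on the new domain, so old inbound links and bookmarks land on the equivalent page rather than the home page.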
Labels: content development, google ranking, metrics, SEO