
Thursday, March 20, 2008

Is Google So Hard To Find?

How difficult is it to perform a search on Google? Usability expert Jakob Nielsen posed this question in the March 17, 2008 edition of his weekly e-mail newsletter Alertbox.

Apparently the process of getting to the search engine -- not the act of performing the search itself -- stumped 24% of the participants in Nielsen's latest round of usability research. That's a big number.

So what happened? Were the users asked to type in the domain? If the testing was done recently, this pesky little site may have been an issue. I highly doubt that the participants had any knowledge of top-level domains.

Nielsen offers little information about how the study was conducted. The one point he did make was that his team was "recruiting above-average users, so the success rate across all Internet users is probably lower than our finding." Says Nielsen:

On the one hand, 76% is a high success rate. On the other hand, getting to Google is a very simple task.


A VERY simple task. I recently wrote about Google's redesign of their advanced search page. In that post I said that most users don't fully understand how a search engine works in the first place.

The implications of this study could be huge. What does it mean for Web developers, designers, etc., if we learn that the average user still doesn't know how to use, for instance, the Web browser? If Google is so hard to find, what does it mean for the rest of us?

Monday, December 10, 2007

ALT Tags According To Google's Matt Cutts

I had been doing a lot of research about how to properly tag images since the summer, when I read an article in Digital Web about Google Universal Search. There was surprisingly little information available at the time, considering how big social photo sites like Flickr have become.

Finally, Google is speaking, via Matt Cutts -- head of Google's webspam team. In Cutts' ALT tag video he talks about how you can use ALT tags to tell Google more about what is in an image file and boost its page ranking.

Here are some highlights:

  1. Including ALT text with your images will help Google better identify their contents


  2. ALT tags do not need to be long. Cutts' tag was 7 words. Even 25 words is too many.


  3. An ALT tag used in conjunction with a descriptive filename (cat.jpg versus ds134.jpg, for example) will improve the searchability of your images even more
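Taken together, the three points might look like this in markup. The filename and ALT text here are invented for illustration:

```html
<!-- Short, descriptive ALT text plus a descriptive filename,
     per Cutts' advice -->
<img src="black-cat-sleeping.jpg" alt="Black cat sleeping on a windowsill">
```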

Sunday, October 21, 2007

Everything you need for Googlebot to work correctly

It seems that the bloggers at Google have been talking a lot about Googlebot, so I am going to do a little link roundup and maybe come back later to do a post or two about one aspect of Googlebot or another. Having read each of these entries in detail, I think that, taken as a group, they give a Web developer or Webmaster just about everything they need to make sure Googlebot is functioning correctly on their site.
  1. All about Googlebot - Q&As about robots.txt files and Googlebot's behavior fielded by Google Webmaster Central founder Vanessa Fox.



  2. How to verify Googlebot - Google SEO specialist Matt Cutts talks about how to determine a bot is authentic.



  3. Learn more about Googlebot's crawl of your site and more! - Vanessa Fox discusses new additions to Google Webmaster tools meant to help the Webmaster track the bot better.



  4. Googlebot activity reports - Google blogger explains how the company tracks the amount of traffic between Google and a given site.



  5. Better details about when Googlebot last visited a page - Vanessa Fox breaks this very confusing subject down in excellent detail.
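Item 2 above -- verifying that a visitor claiming to be Googlebot really is Googlebot -- comes down to a reverse DNS lookup followed by a forward lookup. Here is a minimal Python sketch of that two-step check; the function names are my own, not Google's:

```python
import socket

def hostname_is_google(host: str) -> bool:
    """True only if the reverse-DNS name sits inside Google's crawler domains."""
    return host.endswith(".googlebot.com") or host.endswith(".google.com")

def is_googlebot(ip: str) -> bool:
    """Two-step check described in Matt Cutts' "How to verify Googlebot" post:
    1. reverse DNS on the claimed IP;
    2. confirm the hostname is under googlebot.com or google.com;
    3. forward DNS on that hostname must return the original IP."""
    try:
        host = socket.gethostbyaddr(ip)[0]  # reverse lookup
    except OSError:
        return False
    if not hostname_is_google(host):
        return False
    try:
        return ip in socket.gethostbyname_ex(host)[2]  # forward lookup
    except OSError:
        return False
```

The suffix check matters: a spoofer can name a machine googlebot.com.evil.example, which a naive substring match would happily accept.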

Monday, August 20, 2007

Vanessa Fox Talks In Detail About Googlebot

As a Web developer I'll never cease my quest for the holy grail: understanding just how Google takes all those URLs it crawls and turns them into a tidy little list of search engine result pages (SERPs).

I know I have a pretty good understanding--relative to that of the layman--but the more I learn, the more I find that I don't know squat. But when Vanessa Fox, founder of Google Webmaster Central, authors a blog post entitled All About Googlebot, I know I am going to learn something valuable.

I haven't been reading Fox's blog for long. But when I have read something she has written or heard one of her speeches, I have found her to be refreshingly open and honest about Google's practices. And while Google may need to protect its "secret sauce," we Web developers know that a lot of the changes the company makes to its algorithms are an effort to stop black-hat practices. So it's nice to have someone like Vanessa Fox out there to lend some insight to those of us who are doing things the right way and for the right reasons. Here are some Q&As from the recent Search Engine Strategies Conference that she shared in her blog:





  1. If my site is down for maintenance, how can I tell Googlebot to come back later rather than to index the "down for maintenance" page?


    You should configure your server to return a status of 503 (service unavailable) rather than 200 (successful). That lets Googlebot know to try the pages again later.



  2. What should I do if Googlebot is crawling my site too much?


    You can contact us -- we'll work with you to make sure we don't overwhelm your server's bandwidth. We're experimenting with a feature in our webmaster tools for you to provide input on your crawl rate, and have gotten great feedback so far, so we hope to offer it to everyone soon.




  3. Is it better to use the meta robots tag or a robots.txt file?


    Googlebot obeys either, but meta tags apply to single pages only. If you have a number of pages you want to exclude from crawling, you can structure your site in such a way that you can easily use a robots.txt file to block those pages (for instance, put the pages into a single directory).




  4. If my robots.txt file contains a directive for all bots as well as a specific directive for Googlebot, how does Googlebot interpret the line addressed to all bots?


    If your robots.txt file contains a generic or weak directive plus a directive specifically for Googlebot, Googlebot obeys the lines specifically directed at it.
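To illustrate the last two answers, here is a hypothetical robots.txt (the directory names are invented). Googlebot matches the section addressed to it by name and ignores the generic section entirely:

```
# Generic section: applies to crawlers with no more specific match below
User-agent: *
Disallow: /private/

# Googlebot matches this section and obeys ONLY these lines
User-agent: Googlebot
Disallow: /drafts/
```

Note that under these rules Googlebot would still crawl /private/, since the generic Disallow is not repeated in its own section.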

Friday, July 6, 2007

Google Employee Gives Advice About Best Uses of Flash

We've all been taught that Google is, in essence, a "blind user," and I had heard that it couldn't search the content contained in Flash, so I have always recommended against using it in page designs. However, I am hearing that Google is making an effort to search Flash content (or at least the content surrounding the Flash design), so when I saw Mark Berghausen's post, "The Best Uses of Flash," I was intrigued. He says:


As many of you already know, Flash is inherently a visual medium, and Googlebot doesn't have eyes. Googlebot can typically read Flash files and extract the text and links in them, but the structure and context are missing. Moreover, textual contents are sometimes stored in Flash as graphics, and since Googlebot doesn't currently have the algorithmic eyes needed to read these graphics, these important keywords can be missed entirely. All of this means that even if your Flash content is in our index, it might be missing some text, content, or links. Worse, while Googlebot can understand some Flash files, not all Internet spiders can.


Berghausen recommends:

  1. Using Flash only where needed: This is a recommendation the great usability expert Jakob Nielsen has been touting for ages. (Check out his article "Flash: 99% Bad." I can't recommend his work enough.)

  2. Using sIFR to display headers, pull quotes, or other textual elements: I disagree here. As a strong advocate of usability, I don't think bells and whistles like Flash or its counterparts should be used for textual elements, for a variety of reasons. One is the critical role those elements -- especially the header -- play in search. If a designer uses Flash or sIFR to display a header, it is unlikely that they will repeat that element as plain text, because in most cases it will not be aesthetically appealing; yet that is what needs to be done for the element to be properly picked up for search. Another reason is that a Flash element slows down page loading. Visitors today have high demands when it comes to viewing pages, and when a page takes even a couple of moments to display -- or worse, the page has loaded but another element is still loading -- visitors exit. Additionally, as more and more visitors "information snack," having content available in those first few seconds is critical, because those visitors especially are guaranteed to stay on your site for only a few moments before going on to another domain.

  3. Non-Flash Versions: Flash is used as a front-page "splash screen," where the root URL of a website has a Flash intro that links to HTML content deeper in the site. This recommendation seems to make sense for the designer who absolutely insists on using Flash, and for the developer who is assured that the audience has the hardware and the Internet connection to load the page quickly enough that they won't leave in frustration. And because the page links to HTML deeper in the site, SEO remains intact.
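For what it's worth, if a splash screen is unavoidable, one common pattern is to give the Flash embed an HTML fallback that crawlers and plugin-less visitors see. The filenames and copy in this sketch are invented:

```html
<!-- Visitors with the Flash plugin see intro.swf; crawlers and
     everyone else get the markup inside the <object> element -->
<object type="application/x-shockwave-flash" data="intro.swf"
        width="600" height="400">
  <h1>Acme Widgets</h1>
  <p><a href="/products.html">Skip the intro and browse our products</a></p>
</object>
```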