Controlling where search engines go

I finally had a reason to create a robots.txt file. It implements a protocol for telling search engine spiders to ignore certain pages on a website. I noticed that Google was turning up a bunch of the meta-pages from the Sillysoft wiki, which I didn't want. There are multiple meta-pages for every real wiki page, so they were drowning out the real content. Telling search engines to skip the undesirable pages focuses them on the pages that actually have useful content.
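
For anyone curious, a robots.txt is just a plain text file sitting at the root of the site (e.g. /robots.txt). Here's a rough sketch of what one blocking wiki meta-pages might look like; the paths below are made up for illustration, since the real ones depend on how the wiki software lays out its URLs:

    # Applies to every crawler
    User-agent: *
    # Hypothetical meta-page paths; the real ones depend on the wiki's URL layout
    Disallow: /wiki/edit/
    Disallow: /wiki/history/
    Disallow: /wiki/diff/
    # Anything not matched by a Disallow line stays crawlable

Each Disallow value is a prefix match, so one line covers every page under that path.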

Posted by dustin on December 24, 2004
