To analyze the content on your website and decide which pages to display (and in which order) for a given search term, the search engines (SEs) use a program called a web crawler. You might also have heard these referred to as spiders or web bots.
Every SE has its own web crawler program and its own formula for ranking pages. But while different engines may weight page factors differently, the idea behind the web crawlers is basically the same. Getting your site crawled often is a good indicator that the SE finds your site important. And since important sites get better rankings, getting your site crawled frequently is definitely a good thing.
While there is no way to force the SEs to crawl your site, there are a few things you can do to encourage the spiders to visit more often:
- First and foremost, keep your content up to date. Add unique content often and on a regular basis. And be sure to ping Google each time you update your site.
- Get as many backlinks as possible from other relevant sites that are crawled regularly. There are several tools available that will track and measure the crawl rate of other websites.
- Use a unique title tag for each page. Also, while not as important as the title, each page should have its own meta keywords and description tags. This will help the spiders "know" that each page is a unique and individual entity.
- Keep your pages as small as possible. The spider will not spend an unlimited amount of time at your website. If it's held up by your huge images, monstrous PDFs or even excessive text, it will simply abandon the page and move on.
- Pay close attention to your internal links and avoid any duplicate-content issues. If you have multiple pages with the same content (often used for testing and/or tracking purposes), be sure to use your robots.txt file to keep the SEs away.
- Create a sitemap for the content you want crawled and make sure it complies with the sitemap protocol.
- Monitor your web server. Obviously, your server needs to be up and running when the spiders come a-crawling. There are many commercial providers who will monitor your server and notify you when there is a problem. In addition, ensure that your server handles all of your error pages correctly. Refer to the Google Webmaster Tools report of unreached pages for an idea of what errors are being returned.
- And while we're discussing Google Webmaster Tools, don't forget to use it to monitor your crawl rate, adjust it if necessary, and test and track to see what works best for your site.
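The "ping Google" step in the first tip can be scripted. Here is a minimal Python sketch using Google's sitemap ping endpoint, which takes your sitemap's address as a query parameter; the domain and sitemap path shown are hypothetical placeholders for your own:

```python
from urllib.parse import urlencode
from urllib.request import urlopen

def build_ping_url(sitemap_url):
    """Build the Google sitemap ping URL for a given sitemap address."""
    return "https://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

def ping_google(sitemap_url):
    """Notify Google that the sitemap has changed (call after each site update)."""
    with urlopen(build_ping_url(sitemap_url)) as response:
        return response.status  # 200 means the ping was received

# Example with a hypothetical domain (requires network access):
# ping_google("https://www.example.com/sitemap.xml")
```

Hooking this into your publishing workflow means the ping happens automatically every time you post new content.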
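For the duplicate-content tip, the robots.txt file itself is the tool. This sketch assumes your duplicate test/tracking copies live under hypothetical /test/ and /tracking/ directories, and uses Python's standard robots.txt parser to confirm the rules behave as intended before you upload the file:

```python
from urllib.robotparser import RobotFileParser

# A robots.txt that blocks all crawlers from hypothetical duplicate-content
# directories while leaving the rest of the site open to the spiders.
ROBOTS_TXT = """\
User-agent: *
Disallow: /test/
Disallow: /tracking/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot may fetch regular pages but not the duplicates.
print(parser.can_fetch("Googlebot", "/products.html"))       # True
print(parser.can_fetch("Googlebot", "/test/products.html"))  # False
```

Checking the file locally like this is cheap insurance against a typo that accidentally blocks your whole site.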
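The sitemap tip can also be automated. This is a minimal sketch that emits a sitemap-protocol XML document for a list of pages; the page URLs are hypothetical, and a real sitemap would typically also carry per-page `lastmod` dates:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(page_urls):
    """Return a minimal sitemap-protocol XML document listing the given URLs."""
    ET.register_namespace("", SITEMAP_NS)
    urlset = ET.Element("{%s}urlset" % SITEMAP_NS)
    for page in page_urls:
        url = ET.SubElement(urlset, "{%s}url" % SITEMAP_NS)
        loc = ET.SubElement(url, "{%s}loc" % SITEMAP_NS)
        loc.text = page
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical pages you want crawled:
xml_doc = build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/articles.html",
])
```

Regenerating the sitemap as part of each content update keeps it in sync with the site, which is exactly the maintenance the closing paragraph calls for.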
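A hand-rolled version of the server-monitoring tip might look like the following sketch; commercial monitors do essentially this on a schedule and alert you on failure. The site URL is a hypothetical placeholder:

```python
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def classify_status(code):
    """Treat any 2xx/3xx response as healthy; 4xx/5xx means spiders hit an error page."""
    return "up" if 200 <= code < 400 else "error"

def check_server(url):
    """Probe url once; return ('up', code), ('error', code) or ('down', reason)."""
    try:
        with urlopen(url, timeout=10) as response:
            return classify_status(response.status), response.status
    except HTTPError as err:   # server answered, but with an error page (404, 500, ...)
        return "error", err.code
    except URLError as err:    # server unreachable when the spider comes calling
        return "down", str(err.reason)

# Example with a hypothetical domain (requires network access):
# print(check_server("https://www.example.com/"))
```

Run from cron every few minutes, a check like this tells you about downtime before the crawl-error reports do.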
Time to implement: As with all SE optimization, working toward an optimal crawl rate is not a one-time job. You will need to add content approximately three times a week and update your site map as needed. Monitoring your web server is an ongoing task.
Karen Scharf is an 
 
