SEO Question: How do I improve or increase the crawl rate of my website?

A question that has come in recently is “How do I improve or increase the crawl rate of my website?”

This is a great question and there are some really easy steps to follow.

1. Sitemaps – Something I always recommend: whenever you make a change to your website, even if it’s just a single character, update your sitemaps and submit them through Google Webmaster Tools. With a Webmaster Tools account you can easily notify Google of any changes to your site.
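If your CMS doesn’t generate sitemaps for you, rolling a basic one is straightforward. Here is a minimal sketch in Python that builds a sitemap.xml string following the sitemaps.org format; the example.com URLs are placeholders for your own pages.

```python
# Minimal sitemap.xml generator -- the URLs below are placeholders.
from datetime import date
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a sitemap.xml string for a list of (loc, lastmod) pairs."""
    entries = "\n".join(
        "  <url>\n"
        f"    <loc>{escape(loc)}</loc>\n"
        f"    <lastmod>{lastmod}</lastmod>\n"
        "  </url>"
        for loc, lastmod in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

today = date.today().isoformat()
sitemap = build_sitemap([
    ("http://www.example.com/", today),
    ("http://www.example.com/blog/", today),
])
print(sitemap)
```

Keeping the `<lastmod>` dates accurate gives Google a cheap signal about which pages actually changed.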

2. Gain backlinks – Gaining further backlinks from websites, blogs, etc. can increase your crawl rate. Imagine Google crawling millions of websites and repeatedly finding links back to yours; this encourages it to visit your site more often.

3. Add fresh content – Adding content on a regular basis encourages the search engine bots to crawl your site regularly. I have seen sites that haven’t been updated in ages, and the gap between crawls has been well over four weeks.

4. Use Ping-o-Matic – Apart from submitting sitemaps to Google, I like to use the tool http://pingomatic.com/. I have found this works really well when it comes to telling Yahoo and Bing about your site.
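Ping-o-Matic works over the standard `weblogUpdates.ping` XML-RPC call, so you can also ping it from a script instead of the web form. A sketch in Python, assuming the commonly documented `rpc.pingomatic.com` endpoint (check Ping-o-Matic’s own site before relying on it); the blog title and URL are placeholders:

```python
# Sketch of a weblogUpdates.ping to Ping-o-Matic via XML-RPC.
# BLOG_TITLE and BLOG_URL are placeholders; the endpoint URL is the
# commonly documented one, not something this post confirms.
import xmlrpc.client

BLOG_TITLE = "My Blog"
BLOG_URL = "http://www.example.com/"

# Build the XML-RPC request body that would be POSTed to the ping service.
body = xmlrpc.client.dumps((BLOG_TITLE, BLOG_URL),
                           methodname="weblogUpdates.ping")
print(body)

# To actually send the ping (a network call), uncomment:
# server = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")
# print(server.weblogUpdates.ping(BLOG_TITLE, BLOG_URL))
```

The same `weblogUpdates.ping` call is what blog platforms fire automatically when you publish a post.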

5. Server uptime – Problems with your server or website uptime can also affect your crawl rate, especially if Google comes to crawl your site while it is down.

6. Page speed – Google spends a set amount of time on a site, and if web pages take too long to load, fewer pages get crawled. Google has a great metric in the new Analytics interface which gives more details on site speed.

7. Social elements – I have found that adding social streams (or Twitter feeds) to a website can increase crawl rates, especially if you are a regular user of social sites like Twitter. The increased crawl rate comes from the regular changes being streamed onto the website.

8. Crawl speed – In Google Webmaster Tools you can increase and decrease crawl speed via the settings tab.

9. Broken internal links – Make sure you have no broken links within your website; sending the Google bots off to a 404 page can cause confusion and may make them abandon the crawl.
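Hunting broken internal links is easy to script. A sketch in Python that pulls the internal links out of a page’s HTML so each one can then be fetched and checked for a 404; the parsing part below runs offline, and the base URL and HTML are made-up examples:

```python
# Sketch: collect internal links from a page so they can be checked
# for 404s. Parsing only -- the status check itself needs a network call.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Gather every href found in <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def internal_links(base_url, html):
    """Return absolute URLs of links that stay on the same host."""
    parser = LinkCollector()
    parser.feed(html)
    host = urlparse(base_url).netloc
    absolute = (urljoin(base_url, href) for href in parser.links)
    return [u for u in absolute if urlparse(u).netloc == host]

page = '<a href="/about">About</a> <a href="http://other.com/">Other</a>'
links = internal_links("http://www.example.com/", page)
print(links)  # only the same-host link survives

# Each URL in `links` could then be fetched (e.g. with urllib.request)
# and any 404 responses logged for fixing.
```

Running a check like this after every site restructure catches dead links before the bots do.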

10. Duplicate content – Always make sure there is no duplicate content on the website. Again, Google spends a set amount of time on a website and may miss your unique, quality pages.
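A quick way to spot exact duplicates on your own site is to hash each page’s text and flag collisions. A sketch in Python with made-up page paths and content standing in for real crawled pages:

```python
# Sketch: flag pages whose body text is identical by hashing it.
# The paths and text below are made-up stand-ins for crawled pages.
import hashlib

pages = {
    "/a": "Welcome to our shop.",
    "/b": "Welcome to our shop.",   # exact duplicate of /a
    "/c": "A unique product page.",
}

seen = {}        # digest -> first path seen with that content
duplicates = []  # (duplicate path, original path)
for path, text in pages.items():
    digest = hashlib.sha256(text.encode()).hexdigest()
    if digest in seen:
        duplicates.append((path, seen[digest]))
    else:
        seen[digest] = path

print(duplicates)  # [('/b', '/a')]
```

This only catches byte-for-byte duplicates; near-duplicates (boilerplate-heavy templates, print versions) need fuzzier comparison, but exact matches are the cheapest wins.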

This post was written by Dave, a digital marketer specialising in SEO and PPC, who can be followed on Twitter, Facebook, LinkedIn and Google+.
