Tips for Improving the Crawlability of Your Website

The best website in the world is going to have a hard time ranking if a search engine can’t navigate through it easily. Here are some tips to help you make your website more crawlable.

Sitemaps – Having a sitemap is the number one thing you can do to make your website easier for a search engine to crawl. It flattens out the architecture and exposes a large number of links. You should have both an HTML sitemap and an XML sitemap: they serve different purposes. Don’t rely on an XML sitemap alone, because search engines prefer content they discover through natural crawling rather than just through a sitemap. Google recommends no more than 100 links per page; you can technically include more, but 500 links on a page isn’t good for bots or users. If you have a lot of links, use multiple sitemaps.
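If your site is big enough to need more than one XML sitemap, the sitemaps.org protocol lets you tie them together with a sitemap index file. A minimal sketch (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sitemap index pointing search engines at several smaller sitemaps -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-articles.xml</loc>
    <lastmod>2012-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-categories.xml</loc>
    <lastmod>2012-01-10</lastmod>
  </sitemap>
</sitemapindex>
```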

Flatten Your Website – Unless you have a trusted authority website, you need to be careful about how you spend your link equity. Having 1,000 very short articles on similar topics doesn’t work as well as 100 high-quality, longer articles. Look for ways to eliminate or consolidate pages that don’t add value. If you are worried about putting too much information on a single page, look into CSS and layers to control how the information is presented while keeping it exposed to search engine spiders.
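As a rough sketch of the CSS-and-layers idea: the full text lives in the HTML, where spiders can read it, while CSS only controls which layer a visitor sees. The class names here are made up for illustration:

```html
<!-- All of the content is in the markup, so a spider sees every panel -->
<div class="tabbed-content">
  <div class="tab-panel" id="overview">
    <h2>Overview</h2>
    <p>The overview text stays in the HTML...</p>
  </div>
  <div class="tab-panel hidden" id="specs">
    <h2>Specifications</h2>
    <p>The detailed specs are still in the HTML even when not displayed...</p>
  </div>
</div>

<style>
  /* Presentation only: toggle which panel the visitor sees */
  .tab-panel.hidden { display: none; }
</style>
```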

Expose Links, Avoid Link Chains – While I personally don’t like reading paginated articles, the truth is they aren’t good for search engines either. They spread the content over multiple pages, each with similar titles and poor internal anchor text (e.g., “more”, “continued”, or a page number), so avoid pagination whenever possible. If you do need multiple pages, as with multiple sitemaps, expose all of the links at once: show the links to pages 3, 4, 5, and 6 on every page. Don’t make page 2 the only place a link to page 3 appears.
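For example, if a set of pages really does have to be split up, a pagination block that exposes every page at once (rather than just a “next” link) might look like this; the URLs are placeholders:

```html
<!-- Every page is linked from every page, so nothing is buried at the end of a link chain -->
<nav class="pagination">
  <a href="/sitemap/page-1.html">Sitemap page 1</a>
  <a href="/sitemap/page-2.html">Sitemap page 2</a>
  <a href="/sitemap/page-3.html">Sitemap page 3</a>
  <a href="/sitemap/page-4.html">Sitemap page 4</a>
</nav>
```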

Use Breadcrumbs – Breadcrumbs are an excellent way to expose links to mid-level pages. Many sites just list the hierarchy, but if you turn the words into hyperlinks you’ll get more out of the internal anchor text. The text can be small and still be useful without radically affecting the design or layout.
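A minimal breadcrumb sketch, with each level as a real link so the anchor text counts (the paths and page names are placeholders):

```html
<!-- Each level is a hyperlink, not plain text, so mid-level pages pick up internal links and anchor text -->
<p class="breadcrumbs">
  <a href="/">Home</a> &raquo;
  <a href="/widgets/">Widgets</a> &raquo;
  <a href="/widgets/blue/">Blue Widgets</a> &raquo;
  Deluxe Blue Widget
</p>
```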

Interlink Within the Main Content Area – Search engines are getting much better at breaking pages down into different elements, separating common navigation from content, and weighting those areas differently. Use this to your advantage by putting links to other pages on your site in the main content area wherever possible. Features like suggested items or related posts also make your site stickier.
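One simple way to do that is a related-posts block inside the main content element rather than off in a sidebar; the markup below is only a sketch with placeholder URLs and titles:

```html
<article>
  <h1>How to Choose a Blue Widget</h1>
  <p>Main article text...</p>

  <!-- These links sit inside the main content area, not in the boilerplate navigation -->
  <section class="related-posts">
    <h2>Related Posts</h2>
    <ul>
      <li><a href="/blue-widget-maintenance/">Blue Widget Maintenance Basics</a></li>
      <li><a href="/widget-buying-guide/">The Complete Widget Buying Guide</a></li>
    </ul>
  </section>
</article>
```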

Avoid Renegade Linking Implementations – Developers like to build cool and novel things into websites; however, that JavaScript or Ajax menu that looks really slick might not be crawlable by search engines. Search engines are getting smarter about navigating that type of content, but you can’t depend on it. Ask yourself: is it worth having an amazing Ajax flyout menu that no one will ever see because search engines can’t crawl the site? Use straight links wherever possible to ensure maximum crawlability.
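The difference between a crawlable menu and a script-only one is roughly this; the `goTo()` handler is hypothetical and stands in for whatever the JavaScript menu does:

```html
<!-- Crawlable: each item is a plain href a spider can follow -->
<ul class="nav">
  <li><a href="/products/">Products</a></li>
  <li><a href="/support/">Support</a></li>
</ul>

<!-- Risky: no real URLs, navigation only happens in JavaScript -->
<ul class="nav">
  <li onclick="goTo('products')">Products</li>
  <li onclick="goTo('support')">Support</li>
</ul>
```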

Test Crawling – Use a search engine spider or a tool like Xenu Link Checker to make sure your website can be crawled. The technology isn’t exactly the same as what a search engine uses, but it will help you spot problem areas if they exist.
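If you want a quick command-line check alongside a desktop tool, GNU wget’s spider mode can recursively follow your links and log what it finds; this is just one way to do it, with the site URL as a placeholder:

```
# Crawl the site in spider mode (nothing is saved), logging results to crawl.log
wget --spider --recursive --level=5 --output-file=crawl.log https://www.example.com/

# Then scan the log for errors such as 404s
grep -B 2 "404" crawl.log
```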
