Creating a Crawlable Regional Website



One of the challenges of running a large website with content about multiple regional areas is getting all of that content crawled on a regular basis. Because of the width and depth of sites like this, getting new content crawled when it sits 3-4 levels deep is often a problem. In this post I’ll look at some strategies to help you solve these issues.

Let’s look at the first problem: getting all of the content crawled and into the index. For the sake of simplicity I’m going to use the 50-states model, but the concept is easily adapted to other types of structures. My number one recommendation is to put a link to each of the 50 states on the home page. When I say something like that, the most common response I hear is, “Great! We’ll put it in the footer of every page.” But that’s not really the best solution.

Google and other search engines look at pages, try to break them down into sections, and in some cases weight those sections differently. For example, items listed in the top navigation have a greater likelihood of becoming sitelinks for your site. It’s not a 100% guarantee, but it does help, especially if those pages have a lot of links pointing to them. Search engines also look at things like sidebars and footers and weight them differently. They look for content that is repeated on every page, too, and again it’s often weighted differently than information and links in the main content section of a page.

My recommendation is to put a link to each of the 50 states right in the main content area of the homepage, and to avoid putting the list sitewide in the footer. I would also create a separate page (a mini sitemap, if you will) with links to each of the states, and put a link to that page in the footer or masthead, whichever makes the most sense.
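To make that concrete, here’s a minimal sketch of how you might generate the main-content state list in code. The states array, slugify helper, and URL scheme are all assumptions for illustration, not a prescription for any particular CMS:

```typescript
// Hypothetical sketch: render the 50-state link list for the homepage's
// main content area. The `states` data and `slugify` helper are
// assumptions, not part of any real CMS API.
const states: string[] = [
  "Alabama", "Alaska", "Arizona", // ...the remaining states...
  "Wyoming",
];

function slugify(name: string): string {
  return name.toLowerCase().replace(/\s+/g, "-");
}

// Emit a plain <ul> of state links meant for the page's main content
// section, not a sitewide footer.
function renderStateLinks(): string {
  const items = states
    .map((s) => `<li><a href="/${slugify(s)}/">${s}</a></li>`)
    .join("\n");
  return `<ul class="state-links">\n${items}\n</ul>`;
}
```

The same function can render the mini sitemap page, so both pages stay in sync from one data source.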

When you are working with large sites like this, using a breadcrumb is essential. Not only does it help advanced users know where they are in the structure of your website, but it also provides crawling points. So all of your pages should have something like this:

Home > State > Page
Home > State > City > Page

It’s important that the “state” and “city” elements be hyperlinked back to their mini sitemap pages. Everything you can do to expose search engines to your content is beneficial. The fewer pages a search engine has to go through to find the content, the better.
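Here’s a minimal sketch of that kind of breadcrumb, where every ancestor is a link. The Crumb shape and the example URLs are assumptions for illustration:

```typescript
// Hypothetical sketch: build a breadcrumb trail where each ancestor
// links back to its mini sitemap page.
interface Crumb {
  label: string;
  url: string;
}

function renderBreadcrumb(trail: Crumb[], currentTitle: string): string {
  const links = trail.map((c) => `<a href="${c.url}">${c.label}</a>`);
  // Join with " > " (HTML-escaped); only the current page is unlinked.
  return [...links, currentTitle].join(" &gt; ");
}

// Produces: Home > Ohio > Columbus > Example Listing
const crumb = renderBreadcrumb(
  [
    { label: "Home", url: "/" },
    { label: "Ohio", url: "/ohio/" },
    { label: "Columbus", url: "/ohio/columbus/" },
  ],
  "Example Listing"
);
```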

Another challenge with these sites is getting new pages into the index. When a page lives in a state and city directory tree, it’s going to be 3-4 levels deep under the best circumstances, and 5 or more if things are really bad. To solve that problem, list the new content as close to the homepage as possible for a short period of time to spoon-feed the search engines. If you can, list the 10, 15, or 20 newest pages right on your homepage. If you are using a CMS, have some custom programming created to take care of that automatically, so it’s a hands-off operation. I’d also create a “what’s new” page with the 100 newest pages listed on it. Another tip: on the state and city mini sitemap pages, create a “new listings” section at the top and put the 10 newest subpages there as well. Again, try to get some code in place to automate the process as much as possible.
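A minimal sketch of that automation might look like this. The Page shape and the idea of a single in-memory list of all pages are assumptions; in practice this would be a database query in your CMS:

```typescript
// Hypothetical sketch: pull the N newest pages so the homepage, the
// "what's new" page, and each state mini sitemap can show a fresh
// listings block automatically.
interface Page {
  title: string;
  url: string;
  publishedAt: Date;
  state?: string;
}

function newestPages(pages: Page[], limit: number, state?: string): Page[] {
  return pages
    .filter((p) => state === undefined || p.state === state)
    .sort((a, b) => b.publishedAt.getTime() - a.publishedAt.getTime())
    .slice(0, limit);
}

// Homepage block: the 20 newest pages sitewide.
//   newestPages(allPages, 20);
// "What's new" page: the 100 newest.
//   newestPages(allPages, 100);
// State mini sitemap: the 10 newest pages within that state.
//   newestPages(allPages, 10, "Ohio");
```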

Lastly, if your listings get updated regularly and you want to make sure the refreshed content gets indexed quickly, create a “recently updated” section that works just like the “what’s new” areas I mentioned above.
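That variant is the same query sorted on a last-modified timestamp instead of the publish date; the UpdatedPage shape here is again an assumption:

```typescript
// Hypothetical sketch: "recently updated" listings, sorted on an
// updatedAt timestamp rather than publishedAt.
interface UpdatedPage {
  title: string;
  url: string;
  updatedAt: Date;
}

function recentlyUpdated(pages: UpdatedPage[], limit: number): UpdatedPage[] {
  return [...pages]
    .sort((a, b) => b.updatedAt.getTime() - a.updatedAt.getTime())
    .slice(0, limit);
}
```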
