Effective navigation should let a user know:
- what site they are on;
- where they are in that site;
- and where they have been.
Navigation and Search Engines
Good navigation helps search engines better understand the site structure as well as helping site users. Typically your most important documents will have the greatest number of inbound links.
Often, people use tabs or images for their links, which contain minimal descriptive text. You can offset this by using descriptive text links in the page footer; you can see an example of this technique at http://www.search-marketing.info. It is common to have one set of navigation used by site visitors and another used by search engine spiders.
Proper navigation also gives you descriptive internal links. A popular technique for doing this is using bread crumb navigation.
Here is how basic breadcrumb navigation should look:
- The first link would be a link to the home page.
- The second link would be to the chapter on search engine optimization. These links would be optimized text links that help define the purpose of your pages.
- The third piece of text would not be a link, but plain text naming the page the user is currently on.
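The breadcrumb pattern above can be sketched as a small helper that turns a trail of (title, URL) pairs into links, leaving the final entry as plain text because the user is already on that page (the function name and URLs here are illustrative, not part of any standard):

```python
def breadcrumb_html(trail):
    """Render a breadcrumb trail as HTML.

    `trail` is a list of (title, url) pairs ordered from the home page
    down to the current page. Every entry except the last becomes a
    text link; the last is plain text for the page the user is on.
    """
    parts = [f'<a href="{url}">{title}</a>' for title, url in trail[:-1]]
    parts.append(trail[-1][0])  # current page: text only, no link
    return " > ".join(parts)

print(breadcrumb_html([
    ("Home", "/"),
    ("Search Engine Optimization", "/seo/"),
    ("Navigation", None),
]))
# <a href="/">Home</a> > <a href="/seo/">Search Engine Optimization</a> > Navigation
```

Because each breadcrumb link is descriptive text, the breadcrumb doubles as the kind of optimized internal linking described above.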
Setting up navigation looks professional, helps the user, and helps search engines understand the relationships between pages on your site. It also gives you better usability and higher rankings. You can’t beat that with a stick!
If you feel you must use image- or script-based navigation, make sure you add static text links to the bottom of your pages.
SiteMaps (Part 1)
It is also a good idea to have a sitemap linked to from the home page which links to all major internal pages. The idea is to give search engine spiders another route through your site and to give users a basic way to flow through your site if your navigation is broken or confusing. You can also use the sitemap to channel link authority and promote seasonal specials. The sitemap should be:
- quick loading;
- light on graphics;
- and simple in design.
Xenu Link Sleuth checks for broken links and can also help you quickly build a sitemap.
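For a rough sense of what a tool like Xenu does, here is a minimal sketch of a crawler that walks one site, collects its internal pages (raw material for a sitemap), and flags links that fail to load. It uses only the Python standard library and assumes a small, static site; a real link checker would also handle redirects, timeouts, and robots.txt:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen
from urllib.error import URLError, HTTPError

class LinkParser(HTMLParser):
    """Collect the href values of all <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=50):
    """Breadth-first crawl of one site; returns (pages found, broken links)."""
    site = urlparse(start_url).netloc
    queue, seen, broken = [start_url], set(), []
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url).read().decode("utf-8", "replace")
        except (URLError, HTTPError):
            broken.append(url)  # link did not resolve
            continue
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)
            if urlparse(absolute).netloc == site:  # stay on-site
                queue.append(absolute)
    return seen, broken
```

The set of pages returned by `crawl` is exactly the list you would feed into a sitemap page.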
SiteMaps (Part 2)
The original purpose of a sitemap was to help both users and search engines find content on your site. Near the end of 2006 the major search engines also created an XML-based sitemap protocol, which can be used to alert search engines to new content on your site and to the relative importance of each page. Please note, however, that actively linking within your main site structure and building inbound linkage data will carry far more weight in helping engines find and rank your content.
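A file in the XML sitemap protocol is simple enough to generate with a short script. The sketch below builds a minimal sitemap from a list of (URL, priority) pairs; the example URLs are hypothetical, and a production sitemap would usually also include `<lastmod>` dates:

```python
from xml.sax.saxutils import escape

def sitemap_xml(urls):
    """Build a minimal sitemaps.org-style XML sitemap.

    `urls` is a list of (loc, priority) pairs, where priority is a
    float between 0.0 and 1.0 indicating relative importance.
    """
    entries = []
    for loc, priority in urls:
        entries.append(
            "  <url>\n"
            f"    <loc>{escape(loc)}</loc>\n"
            f"    <priority>{priority:.1f}</priority>\n"
            "  </url>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>"
    )

print(sitemap_xml([
    ("http://www.example.com/", 1.0),
    ("http://www.example.com/seo/", 0.8),
]))
```

As the text notes, submitting a file like this supplements your internal linking; it does not replace it.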
This is not something most webmasters need to worry about, but some large catalog sites organize items by genre and then list choices alphabetically. If you have a vast number of related choices and are creating a navigation route that is more likely to be useful to bots than humans, you may want to link to all of the choices on one page or provide links to each of the additional pages near the bottom of the first page.
If you only have one ‘next’ link on each page, then each time a spider indexes a page you are sending it on to a page with less and less link popularity.
This may not be a big deal if spiders have other paths through your site. But if this is a primary indexing mechanism, you cannot expect them to crawl through 25 consecutive pages of items starting with the letter S when they only gain one of those links at a time as each page is indexed.
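The dilution described above can be made concrete with a toy model. Assume, purely for illustration, a simplified PageRank-style rule in which each page in a ‘next link only’ chain passes on a fixed fraction (a damping factor of 0.85) of the previous page's score:

```python
def chain_popularity(entry_score, pages, damping=0.85):
    """Score reaching each page of a 'next link only' chain.

    Toy model: each hop passes on `damping` of the previous page's
    score, so page n receives entry_score * damping**(n - 1).
    """
    scores = [entry_score]
    for _ in range(pages - 1):
        scores.append(scores[-1] * damping)
    return scores

# Under these assumptions, page 25 of the chain receives only
# 0.85 ** 24, roughly 2% of the entry page's score.
```

Real crawl scheduling is far more complicated than this, but the shape of the curve is the point: deep pages reachable only through a long chain of ‘next’ links receive a vanishing share of link popularity, which is why linking to every page in the series from the first page works better.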
Entry Pages that Convert
Poor pagination and other similar problems sometimes cause large dynamic sites to waste much of their link authority on pages that provide search spiders with little unique content or value.
If these pages rank in the search results over more focused pages on your site, then you may have a much lower conversion rate than would otherwise be attainable.