How To Make Your Website Search-Friendly
Search-friendly website architecture ensures that search engines can access your site’s content.
First, a bit of background on how search engines build the indices from which they derive the listings displayed on their results pages. Google and the other search engines don’t have teams of people who archive every page on the Web. Instead, they rely on programs called “spiders” – automated robots that follow links from page to page and store the information found in each site’s code in the engines’ databases.
Making sure these spiders can access your site’s content is extremely important for search engine optimization (SEO). However, many website architecture mistakes can make large portions of your site unavailable to the search engines’ spiders.
Here are four of the most common mistakes, as well as tips on avoiding them.
1. Too much content in image or script files.
Search engine spiders read the text in your HTML; content locked inside images, Flash animations, or script files is largely invisible to them. The solution is to duplicate the information stored in these alternative formats with text versions – for example, by putting descriptive, keyword-rich text in the alt attribute of your site’s header graphic. Try using a tool such as Webconf’s Search Engine Spider Simulator to see what the spiders see when they arrive on your site. If you notice that chunks of content are missing, provide the excluded information as text elsewhere on the page. You can also use your site’s robots.txt file – the file that gives crawling instructions to search engines – to keep spiders out of the files they can’t read, pairing it with text-based pages you’ve created to carry the same information.
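As a sketch of the alt-attribute approach (the file name and business details below are hypothetical):

```html
<!-- Descriptive alt text lets spiders "read" what the header graphic conveys -->
<img src="header-graphic.png"
     alt="Smith's Bakery - fresh sourdough bread and pastries in Ventura, CA">
```

A robots.txt file, by contrast, is simply a plain-text file at your site’s root containing lines such as `User-agent: *` followed by `Disallow:` rules naming the directories spiders should skip.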
2. Deep vs. shallow navigation.
A site’s navigation can become problematic when it runs too deep. Search engine spiders move between the pages of your site through the links you’ve created, so making their path as easy as possible is essential. If your navigation structure is deep – meaning certain pages can only be reached after a long string of sequential clicks – you run the risk that the spiders won’t crawl far enough into your site to index all of your pages.
The solution is to implement a “shallow” navigation structure, in which every page on your website can be accessed by both human visitors and search engine spiders within two to three clicks. You can accomplish this task by breaking up your navigation structure into sub-categories or incorporating additional internal links.
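As an illustration of a shallow structure (the page names below are placeholders), the goal is simply a menu that puts every major section one click from the home page:

```html
<!-- A flat navigation menu: each section page is one click away,
     and each section page then links to its own sub-pages -->
<nav>
  <ul>
    <li><a href="/products.html">Products</a></li>
    <li><a href="/services.html">Services</a></li>
    <li><a href="/about.html">About Us</a></li>
    <li><a href="/contact.html">Contact</a></li>
  </ul>
</nav>
```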
3. Inconsistent linking practices.
As you build these links, you’ll want to be careful about how you name them. This is because the search engines can’t apply human judgment to see what you meant to do. For example, their spider programs may index the URLs “www.yoursite.com/contact.html” and “yoursite.com/contact.html” as separate pages – even though both links direct visitors to the same location.
To prevent these indexing errors, be consistent in building and naming links. If you’ve made this mistake in the past, use 301 redirects to tell the search engine spiders that the “www” and “non-www” versions of your URLs point to the same pages.
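On an Apache server, for instance, one common way to set up this redirect is a few lines in your site’s .htaccess file (the domain below is a placeholder; other server platforms have their own equivalents):

```apacheconf
# Permanently (301) redirect yoursite.com/... to www.yoursite.com/...
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yoursite\.com$ [NC]
RewriteRule ^(.*)$ http://www.yoursite.com/$1 [R=301,L]
```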
4. Failure to include a site map.
As you improve the accessibility features of your website’s architecture, make sure you have a site map in place. This file provides the spiders with an accessible reference of all the pages on your site, allowing indexing to proceed correctly.
Suppose your website is built with WordPress, Joomla, or a similar CMS platform. In that case, you should be able to install a plugin that will automatically generate a site map page for you. If not, creating a site map can be as simple as building a single HTML page with links to all of your other pages and submitting it to the search engines for indexing.
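If you prefer the XML sitemap format that the major search engines share, a minimal sitemap.xml can be this simple (the URLs below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yoursite.com/</loc>
  </url>
  <url>
    <loc>http://www.yoursite.com/contact.html</loc>
  </url>
</urlset>
```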
Eager to learn more about search engine optimization? Click here for recommended SEO resources.
Would you like support for your website? Webb Weavers Consulting, based in Ventura, CA, can help! See more about Search Engine Optimization services.