Be Search Engine Friendly - be seen.

Best practices for producing and managing a search engine friendly website

The following list identifies a foundational set of best practices which, when applied consistently, should result in a search engine friendly website. While we've also thrown in some keyword placement tips and other useful suggestions, this article is not about creating optimized content — it's about setting the stage for successful search engine optimization (SEO) campaigns.

  1. Keep the site (docroot) shallow. A shallow docroot, around 3 levels deep, will likely make it easier for the spiders to access most files.
  2. Create search engine friendly URLs (Uniform Resource Locators). Keep them as short as possible. And, since search engines may have difficulty indexing dynamically served URLs consisting of complex query strings, consider using URL rewriting scripts (see the sketch after this list).
  3. Create a separate file for each type of service or product. Separate files will facilitate optimizing for each product/service, enabling you to take full advantage of the HTML (Hypertext Markup Language) elements which matter most for SEO (e.g. <title>, <h1>). Include the subject in the filename and separate the words with hyphens.
  4. Create an XML (Extensible Markup Language) sitemap to let spiders know which files should be indexed (a minimal example appears after this list).
  5. Allow Google's spider to access the website resources required for rendering the page layout, such as images, CSS (Cascading Style Sheets) and JavaScript files. This allows the spider to "see" your website as a user would see it.
  6. Identify and eliminate duplicate content. Where duplicate content is unavoidable, specify a canonical URL for the preferred file (see the example after this list). Also, be sure to set your preferred domain (www or non-www) for Google (log in to Webmaster Tools) and set up 301 redirects at the server.
  7. Separate web page design/layout and HTML. Write standards-based HTML for structure and implement external CSS for design and layout.
  8. Write unique and meaningful HTML titles. Use keywords to accurately describe the web page contents. If your branding is well established, you may want to consider HTML titles that place industry keywords before your company name. At this time, Google displays approximately the first 55 characters; keep in mind the exact number of characters displayed varies based on character width and whether the text is bold and/or capitalized. A sample <title> and meta description appear after this list.
  9. Provide unique and meaningful meta descriptions. Meta descriptions should be used for all spiderable web pages and should be approximately 150 characters or less. Think of meta descriptions as brief summaries which are sometimes displayed on search engine results pages.
  10. Ensure JavaScript is search engine friendly. Use it with caution and make sure spiders are able to crawl JavaScript-driven content. Place alternative, unscripted content in <noscript> elements. Create external files whenever appropriate and reference them just before the closing </body> tag (see the example after this list).
  11. Only use Flash as a design enhancement. Use Flash as you would other imagery — to add interest to specific pages. Avoid using Flash to create an entire website or to create site-wide navigation. And remember, some mobile devices will not display Flash.
  12. Avoid the use of frames. If you must use frames, use them with caution because content indexing agents may have difficulty spidering framesets. For example, a search engine may return a link to a single frame of a frameset (i.e. a partial page rendering).
  13. Use descriptive link text. Create link text using relevant phrases describing the link's destination. Avoid using generic text (e.g. "click here"), opting instead for descriptive text (e.g. "learn more about new widget"). Sample anchor and image markup appears after this list.
  14. Use alternative text for images. Keep it short and descriptive.
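
A note on item 2: the exact rewriting mechanism depends on your server. As a minimal sketch, assuming an Apache server with mod_rewrite enabled and a hypothetical product.php script, a dynamic query-string URL can be presented as a short, keyword-friendly path:

    # .htaccess (assumes Apache with mod_rewrite enabled; paths are hypothetical)
    RewriteEngine On
    # Serve /products/blue-widget through the underlying script
    RewriteRule ^products/([a-z0-9-]+)/?$ product.php?slug=$1 [L,QSA]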
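
For item 4, a minimal XML sitemap following the sitemaps.org protocol might look like the snippet below; the domain, path and dates are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page you want indexed (placeholder values) -->
      <url>
        <loc>http://www.example.com/products/blue-widget.html</loc>
        <lastmod>2015-01-15</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>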
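
For item 6, pages with duplicate or near-duplicate content can point search engines to the preferred version with a canonical link in the <head>; the URL is a placeholder:

    <!-- Placed in the <head> of each duplicate page -->
    <link rel="canonical" href="http://www.example.com/products/blue-widget.html">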
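
For items 8 and 9 (and the per-product files described in item 3), the <head> and lead heading of a product page might look like the following; the wording is illustrative only:

    <head>
      <!-- Keywords first, brand last; roughly 55 displayed characters -->
      <title>Blue Widgets for Home Workshops | Example Company</title>
      <!-- Brief summary, approximately 150 characters or less -->
      <meta name="description" content="Compact, energy-efficient blue widgets for home workshops: specifications, pricing and ordering information.">
    </head>
    <body>
      <h1>Blue Widgets for Home Workshops</h1>
    </body>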
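
For item 10, one common pattern is to reference external scripts just before the closing </body> tag and to provide unscripted fallback content in <noscript>; the file name is hypothetical:

    <body>
      <noscript>
        <p>Our product gallery requires JavaScript.
           <a href="/products/">Browse the full product list</a> instead.</p>
      </noscript>
      <!-- External script referenced just before the closing body tag -->
      <script src="/js/gallery.js"></script>
    </body>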
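
For items 13 and 14, descriptive link text and image alt text should describe the destination or the image in plain language; the markup below is illustrative:

    <!-- Instead of "click here" -->
    <a href="/products/blue-widget.html">Learn more about the new blue widget</a>

    <!-- Short, descriptive alternative text -->
    <img src="/images/blue-widget.jpg" alt="Blue widget mounted on a workbench">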

Other Supportive Best Practices

  • Check link integrity regularly to ensure all links resolve.
  • Create custom error pages.
  • Regularly test cross-browser/device compatibility.
  • Check coding standards adherence using an HTML validator application or service.
  • Conduct regular spell checks.
  • Regularly analyze web server access logs; retain and archive them for historical reference.
  • Use a robots.txt file for excluding specific files from search engine indexing (a sample appears below). Be sure not to block website resources required for rendering the page layout.
  • Produce web pages for your users — never create content which displays for the search engine but not the user.
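
A minimal robots.txt sketch for the exclusion point above (the directory names are placeholders); note that resources needed to render the page, such as CSS and JavaScript files, are deliberately left crawlable:

    # Placeholder directories; /css/ and /js/ remain crawlable
    User-agent: *
    Disallow: /admin/
    Disallow: /tmp/
    Sitemap: http://www.example.com/sitemap.xml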

Suggested Further Reading