A “valid” site is not always the best site for the users who visit it. Even amongst the savviest coders and developers there has long been a common misconception about the value of web standards themselves. In the world of markup, the idea of “keeping it super simple” (or its other popular variations) once revolved around spacer images and table-based, presentation-oriented markup. Whether as a beginner or as a seasoned web professional, it was easy to write standards off as overrated, since even the lesser markup of yesteryear still validated. The rest of the confusion stems from the fact that web standards are not consistently supported amongst popular user agents, so why bother working with them? Why all the fuss? Regardless, the true value of web standards lies in their role as a stepping stone, and in the leverage they contribute to a well-conceived web site inside and out.
Think Outside the Design
The value of web standards really amounts to the recommended use of markup to semantically describe content. Once that is mastered, the web developer is able to make intelligent, conscious decisions about the “right” compromises for a given project. We are constantly working towards standardization and have had ongoing dialogs about the best practices for markup in various situations; it’s the World Wide Web Consortium’s role to define the purpose of markup, and that markup is the platform for web site optimization. Web site optimization has little to do with search engine optimization or any of the W3C’s validation tools. Instead, web site optimization deals with steps taken to improve user experience by:
- reducing page weight
- re-factoring of markup, CSS and/or Client Side Scripting
- making content accessible
- making content semantic
- reusing imagery
- optimizing the weight of imagery
- caching and deferred loading
- reducing latency to reduce download or render time
In short, the goal is to use the minimum amount of code needed to achieve the desired result. Unfortunately, clients may not always afford us the time or resources required to deliver the most polished result possible.
Think it Through
Web standards in and of themselves do not necessarily contribute to reduced file sizes; what they do is endorse the healthy use of semantic markup, which gives way to reduced page weight through table-less markup and a focus on cascading style sheets for presentational material. By using document object model scripting, procedural code no longer needs to live inline in the HTML document itself. Take advantage of your page’s semantic structure to use the DOM to the fullest.
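As a minimal sketch of that separation (the ids, file paths and behavior below are invented for the example), the markup stays purely structural while an external script attaches whatever procedural code is needed through the DOM:

```html
<!-- the markup stays semantic: no inline onclick handlers anywhere -->
<ul id="main-nav">
  <li><a href="/">Home</a></li>
  <li><a href="/about/">About</a></li>
</ul>
<script type="text/javascript" src="/js/behavior.js"></script>
```

```javascript
// behavior.js -- procedural code attached through the DOM, outside the document
window.onload = function () {
  var nav = document.getElementById('main-nav');
  if (!nav) { return; }
  var links = nav.getElementsByTagName('a');
  for (var i = 0; i < links.length; i++) {
    links[i].onclick = function () {
      this.className = 'active'; // purely illustrative behavior
    };
  }
};
```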
Code becomes art when we take it to the next level by re-factoring it to maximize its accessibility and by reducing our dependency on markup for presentation and procedural user interface components. What remains to be done once all of the content in a document renders as the design calls for, is properly described by its tags, and uses imagery optimized for reuse and weight? Now we consider scale: what happens when this site we’ve worked so hard to optimize becomes highly trafficked (think: Digg Effect)? And if the site already is, let’s make sure to optimize the server’s role in the user experience.
Caching is one of the chief techniques to leverage for improving user experience, on both the client side and the server side. External objects like cascading style sheets and JavaScript files also benefit from being combined to reduce latency: it’s much less “work” to download one larger file once than it is to download (or check the freshness of) several files. Unfortunately, many of the most visited sites could still benefit greatly from even a dash of web site optimization; serving multiple CSS or JavaScript files demonstrates little regard for the benefit a combined file could provide their visitors, as well as their own bottom line.
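The difference shows up right in the document head. With hypothetical file names, the idea looks something like this:

```html
<!-- Before: five HTTP requests for presentation and behavior -->
<link rel="stylesheet" type="text/css" href="/css/layout.css" />
<link rel="stylesheet" type="text/css" href="/css/typography.css" />
<link rel="stylesheet" type="text/css" href="/css/nav.css" />
<script type="text/javascript" src="/js/effects.js"></script>
<script type="text/javascript" src="/js/forms.js"></script>

<!-- After: two requests, each cacheable as a single object -->
<link rel="stylesheet" type="text/css" href="/css/combined.css" />
<script type="text/javascript" src="/js/combined.js"></script>
```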
Move on to compression: consider pre-compressing your CSS and combined JavaScript files to reduce server load on high-traffic sites. Go a step further and create a proxy that makes sure to return the “not modified” responses to user agents checking the freshness of objects on your site after the first download.
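To make the “not modified” idea concrete, a repeat request for an unchanged stylesheet should play out roughly like this (the URL and dates are placeholders). The 304 response carries no body, so the visitor pays for a round trip rather than a second download:

```http
GET /css/combined.css HTTP/1.1
Host: www.example.com
If-Modified-Since: Tue, 15 May 2007 08:00:00 GMT

HTTP/1.1 304 Not Modified
Date: Wed, 16 May 2007 10:12:31 GMT
```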
Without getting into detailed code for each portion, let’s consider the typical components of a “well-designed” HTML document (a bare-bones skeleton follows the list):
- masthead
- navigation
- breadcrumbs
- body
- sidebar
- footer
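A skeleton of those components might read something like the following; the id values are just one possible naming convention, not a prescription:

```html
<body>
  <div id="masthead">
    <h1>Site Name</h1>
  </div>
  <ul id="navigation">
    <li><a href="/">Home</a></li>
    <li><a href="/articles/">Articles</a></li>
  </ul>
  <ul id="breadcrumbs">
    <li><a href="/">Home</a></li>
    <li>Current Page</li>
  </ul>
  <div id="body">
    <h2>Page Title</h2>
    <p>Main copy goes here.</p>
  </div>
  <div id="sidebar">
    <h3>Related</h3>
    <ul>
      <li><a href="/archive/">Archive</a></li>
    </ul>
  </div>
  <div id="footer">
    <p>Copyright notice, colophon and so on.</p>
  </div>
</body>
```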
Within each component there are myriad possible methods for semantically describing its content. Let’s have a look at a few basic cases:
- Unordered Lists for navigation, breadcrumbs and copy in list items.
- Non-tabular layout for forms, with labels and access keys used for accessibility (see the form sketch after this list)
- Use of <p>, <em>, <strong>, <dl>, <h*>, <table> tags for content
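For the form case in particular, a minimal non-tabular sketch with a label and an access key might read like this (the field names are purely illustrative):

```html
<form action="/search/" method="get">
  <p>
    <label for="query" accesskey="s">Search the site</label>
    <input type="text" id="query" name="q" />
    <input type="submit" value="Search" />
  </p>
</form>
```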
Diving into a single common challenge can show how an understanding of web standards cascades into an optimized user experience. Let’s look at an approach that combines several techniques by several authors, each of which contributes to fundamental factors of web site optimization; specifically: image reuse, semantics, presentational separation, caching, latency reduction, image optimization, and accessibility/platform independence. On to the challenge, then: image-based main navigation with hover effects. Without being distracted by pseudo-code, let’s have a look at how using what we know about web standards leads naturally to web site optimization and a very desirable result for the user (a brief sketch follows the list):
- Start with an unordered list; in the case of drop-down menus, make that a nested unordered list
- Style the unordered list as required using CSS, moving any copy out of view (by hiding overflow and indenting the text off-screen) in user agents that support CSS while leaving it accessible to screen readers and other assistive technology
- Add imagery for each of the tabs, covering the various states (hover, visited, active and so on) as necessary
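Hedged appropriately (the ids, dimensions and image paths below are made up for the example), those first steps might come together along these lines:

```html
<ul id="nav">
  <li id="nav-home"><a href="/">Home</a></li>
  <li id="nav-about"><a href="/about/">About</a></li>
</ul>
```

```css
#nav { margin: 0; padding: 0; list-style: none; }
#nav li { float: left; }
#nav a {
  display: block;
  width: 120px;            /* sized to the tab imagery */
  height: 30px;
  overflow: hidden;
  text-indent: -9999px;    /* the copy stays in the document for screen readers */
}
/* one image per tab, one per state -- the part the next steps will improve */
#nav-home a        { background: url(/img/home.png) no-repeat; }
#nav-home a:hover  { background-image: url(/img/home-hover.png); }
#nav-about a       { background: url(/img/about.png) no-repeat; }
#nav-about a:hover { background-image: url(/img/about-hover.png); }
```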
Normally this is where things would end. At this point we have the desired result, but it’s not yet an optimal experience for the user. Again, to the credit of the numerous designers and developers turned authors out there, additional techniques can be applied to optimize the menu quite a bit:
- Combine all of the images for each button in the navigation into a single file
- Combine all of the image states for the navigation into that same file and use CSS to shift the desired portion of the image into view when required (as sketched after this list)
- Put any JavaScript required for desired effects (e.g. transparency, sliding effects, or support for browsers that don’t support standards as we would like) in an external file
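Assuming a single nav.png laid out as a grid of 120×30 pixel tabs (one column per button, one row per state), the shifting might look like this:

```css
/* every anchor shares one combined image, downloaded (and cached) once */
#nav a {
  background: url(/img/nav.png) no-repeat;
}

/* horizontal offsets pick the button... */
#nav-home a  { background-position: 0 0; }
#nav-about a { background-position: -120px 0; }

/* ...vertical offsets pick the state, with no extra request on hover */
#nav-home a:hover,  #nav-home a:focus  { background-position: 0 -30px; }
#nav-about a:hover, #nav-about a:focus { background-position: -120px -30px; }
```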
In the previous three steps, we’ve:
- Reduced the latency required to load the main navigation imagery and the overall render time for a given page
- “Pre-Loaded” and cached the other anchor states for the navigation without using any client side scripting
- Cached the JavaScript for the navigation by making it external (the same is obviously true for the CSS), improving the render time for subsequent page views
Now apply a few more techniques to the site as a whole:
- Take advantage of the compression support in popular browsers and compress JavaScript and CSS so that they can be sent instead of the larger uncompressed versions
- Combine our CSS files and our JavaScript files respectively, similar to the combining technique used for the navigation imagery, to reduce latency
- Cache these compressed versions of the combined files on the server so that every page view doesn’t require the web server to prepare the same files over and over on the fly; instead, the server can send static files immediately, which it can do with tremendous ease (one possible build step is sketched below)
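How the combined, pre-compressed files are produced is a deployment detail that varies by environment. As one hypothetical build step (the file names are invented, and the server still has to be configured to serve the .gz variants with the proper Content-Encoding header), the work can be done once rather than on every request:

```sh
# concatenate the individual files into a single object apiece
cat layout.css typography.css nav.css > combined.css
cat effects.js forms.js nav.js        > combined.js

# pre-compress them so the web server can hand out static files as-is
gzip -9 -c combined.css > combined.css.gz
gzip -9 -c combined.js  > combined.js.gz
```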
With the various techniques we all already apply to our projects, adding just a few more steps of optimization greatly improves the user experience.
Make it Your Own
Standards simply help us agree on what markup is intended to do and how its elements work together to describe content; web site optimization picks up where web standards leave off. The W3C encourages us to use markup to describe content and to separate presentation and functionality from the markup as much as possible. Once we get used to that idea, our time is best spent optimizing our code to work in the real world. I’ve intentionally left out much of the “how” because that’s an ongoing debate whose conclusions are at best situational. There are quite a few frameworks out there that help developers apply many of these principles to their projects right out of the box, but it’s not too difficult to build your own framework for your own style of work.
So what’s the final word? Well, similar to the stance that Ethan Marcotte has put forward, I suggest that web standards be the baseline we use to optimize sites to perform for the targeted user agents. One day it may be easier to leverage standards to achieve a predictable user experience across all user agents, but for now it’s best to have more skill and mastery than are required to render a job well done.