Blog

Frederick Townes

Frederick is a widely published blogger covering various topics including internet marketing, social media, web hosting, and web design and development. His passions are diverse, covering the breadth of topics above, but for the last two years his focus has really been on social web applications and sites. A native of Milwaukee, Wisconsin, Frederick is a graduate of Boston University, where he studied Computer Science and Mathematics. Founded in 2003, W3 EDGE is among the many web-based businesses started by Frederick, some of which can be found in the work section of this site.

All topics by Frederick Townes

25 Tips to Increase Conversion Rates

Many site owners spend a lot of money on SEO. But once you have traffic, then what? How do you entice visitors to make a purchase? That’s where conversion optimization comes in: converting visitors into buyers. Here are 25 low- and no-cost tips used by the pros to boost conversion rates. Try them. You’ll like what they do for your bottom line.

K.I.S.S. Your Way to an Optimized Site

A “valid” site is not always the best site for the users who visit it. Even amongst the savviest coders and developers there has always been a common misconception about the value of web standards themselves. The idea of “keeping it super simple” (or other popular variations), when it came to the world of markup, once revolved around spacer images and table-based, presentation-oriented markup. It seems that for beginners and seasoned web professionals alike, the role of standards became overrated, since even the lesser markup of yesteryear still validated. The rest of the confusion over the value of standards stems from the fact that web standards are not consistently supported amongst popular user agents; if that’s the case, why should we bother working with them at all, and why all the fuss? Regardless, the true value of web standards lies in the stepping stone they provide and the leverage they contribute to a well-conceived web site, inside and out.

Think Outside the Design
The value of web standards really amounts to the recommended use of markup to semantically describe content. Once that is mastered, the web developer is able to make intelligent and conscious decisions about the “right” compromises to be made for a given project. We are constantly working towards standardization and have had dialogs about best practices for markup in various situations; it’s the World Wide Web Consortium’s role to define the purpose of markup, and that is the platform for web site optimization. Web site optimization has little to do with search engine optimization or any of the W3C’s validation tools. Instead, web site optimization deals with the steps taken to improve user experience by:

  • reducing page weight
  • re-factoring of markup, CSS and/or Client Side Scripting
  • making content accessible
  • making content semantic
  • reusing imagery
  • optimizing the weight of imagery
  • caching and deferred loading
  • reducing latency to reduce download or render time

In short, the goal is to use the minimum code to achieve the desired result. Unfortunately, clients may not always afford us the proper time or resources required to give the most polished result possible.

Think it Through
Web standards in and of themselves do not necessarily contribute to reduced file sizes; what they do is endorse healthy use of semantic markup, which gives way to reduced page weight through table-less markup and a focus on cascading style sheets for presentational material. By using document object model scripting, procedural code no longer needs to live inline in the HTML document itself. Take advantage of your page’s semantic structure to use the DOM to the fullest.
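
As a minimal sketch of that separation (the file name, ids and class names here are hypothetical), the markup stays purely semantic while the behavior is attached from an external, cacheable script rather than inline onclick attributes:

    <!-- in the HTML document: semantic markup only, behavior loaded from an external file -->
    <ul id="main-nav">
      <li><a href="/work/">Work</a></li>
      <li><a href="/blog/">Blog</a></li>
    </ul>
    <script type="text/javascript" src="/js/behavior.js"></script>

    // in /js/behavior.js: procedural code lives outside the markup itself
    window.onload = function () {
        var links = document.getElementById('main-nav').getElementsByTagName('a');
        for (var i = 0; i < links.length; i++) {
            links[i].onmouseover = function () { this.className = 'hover'; };
            links[i].onmouseout  = function () { this.className = ''; };
        }
    };

Because the script lives in its own file, the browser can cache it once and reuse it on every page that references it.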

Code becomes art when we take it to the next level by re-factoring it to maximize its accessibility and by reducing our dependency on the markup for presentation and procedural user interface components. What remains to be done when all of the content in a document renders as the design calls for, the content is properly described with your tags, and the images are optimized for reuse and weight? Now we consider scale: what happens when this site we’ve worked so hard to optimize becomes highly trafficked (think: Digg effect)? And if the site already is, let’s make sure to optimize the server’s role in the user experience.

Caching is one of the chief techniques to be leveraged to improve user experience on both the client side and the server side. Making objects like cascading style sheets and JavaScript files external also opens the door to combining files to reduce latency: it’s much less “work” to download one larger file once than it is to download (or check the freshness of) several files. Unfortunately, many of the most visited sites could benefit greatly from even a dash of web site optimization; issues like multiple CSS or JavaScript files demonstrate little regard for the benefit they could provide their visitors, as well as their own bottom line.

Move on to compression: consider pre-compressing your CSS and combined JavaScript files to reduce server load on high-traffic sites. Go a step further and create a proxy that makes sure to return “Not Modified” (304) responses to user agents checking the freshness of objects in your site after the first download, as in the exchange sketched below.
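
To illustrate the freshness check itself (headers abbreviated, host name and date are placeholders), the browser sends a conditional request and the server answers with a tiny 304 instead of resending the whole file:

    GET /css/combined.css HTTP/1.1
    Host: www.example.com
    If-Modified-Since: Mon, 01 Oct 2007 12:00:00 GMT

    HTTP/1.1 304 Not Modified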

Without getting into code for each portion, let’s consider the typical components of a “well-designed” HTML document:

  1. masthead
  2. navigation
  3. breadcrumbs
  4. body
  5. sidebar
  6. footer
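
As one minimal, hypothetical way those components might be marked up (the ids and copy are purely illustrative):

    <div id="masthead"><h1>Site Name</h1></div>
    <ul id="navigation">
      <li><a href="/">Home</a></li>
      <li><a href="/blog/">Blog</a></li>
    </ul>
    <p id="breadcrumbs"><a href="/">Home</a> &raquo; <a href="/blog/">Blog</a></p>
    <div id="body">
      <h2>Page Title</h2>
      <p>Body copy lives here, described with semantic tags.</p>
    </div>
    <div id="sidebar">
      <h3>Related</h3>
      <ul><li><a href="/work/">Work</a></li></ul>
    </div>
    <div id="footer"><p>&copy; 2007 Example Site</p></div>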

Within each there are a myriad of possible methods to semantically describe the content of the components. Let’s have a look at a few basic cases:

  • Unordered lists for navigation, breadcrumbs and copy in list items.
  • Non-tabular layout for forms, with labels and access keys for accessibility (see the sketch after this list).
  • Use of <p>, <em>, <strong>, <dl>, <h*>, <table> tags for content.
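
The second case might look something like this rough sketch: no layout table, each control tied to its <label>, and access keys for keyboard users (the action, ids and access keys are hypothetical):

    <form action="/contact/" method="post">
      <p>
        <label for="name" accesskey="n">Name</label>
        <input type="text" id="name" name="name" />
      </p>
      <p>
        <label for="email" accesskey="e">Email</label>
        <input type="text" id="email" name="email" />
      </p>
      <p><input type="submit" value="Send" /></p>
    </form>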

Diving into a single common challenge can show how an understanding of web standards cascades into an optimized user experience. Let’s look at an approach that combines several techniques by several authors, each contributing to many fundamental factors of web site optimization: specifically image reuse, semantics, presentational separation, caching, latency reduction, image optimization, and accessibility/platform independence. Anyway, on to the challenge: image-based main navigation with hover effects. Without being distracted by pseudo-code, let’s have a look at how using what we know about web standards leads naturally to web site optimization and a very desirable result for the user:

  1. Start with an unordered list; in the case of drop-down menus, make that a nested unordered list
  2. The unordered list is styled as required using CSS such that any copy is moved out of view, by hiding overflow and indenting the copy out of sight of user agents that support CSS, while still leaving it accessible to screen readers etc.
  3. Imagery is then added for each of the tabs for the various states (hover, visited, active etc.) as necessary, as in the sketch below
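
A minimal sketch of steps 1 through 3 (class names, dimensions and image paths are hypothetical) keeps the copy in the markup for screen readers while indenting it out of view and painting each state with an image:

    <ul id="nav">
      <li class="work"><a href="/work/">Work</a></li>
      <li class="blog"><a href="/blog/">Blog</a></li>
    </ul>

    #nav li a {
        display: block;
        width: 120px;
        height: 40px;
        overflow: hidden;
        text-indent: -9999px;    /* move the copy out of view; screen readers still read it */
    }
    #nav li.work a { background: url(/img/nav-work.gif) no-repeat; }
    #nav li.work a:hover { background-image: url(/img/nav-work-hover.gif); }
    #nav li.blog a { background: url(/img/nav-blog.gif) no-repeat; }
    #nav li.blog a:hover { background-image: url(/img/nav-blog-hover.gif); }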

Normally this is where things would end. At this point we have the desired result, but it’s not an optimal experience for the user. Again, to the credit of numerous designers and developers turned authors out there, additional techniques can be applied to optimize the menu quite a bit:

  1. Combine all of the images for each button in the navigation into a single file
  2. Combine all of the image states of the navigation into a single file and use CSS to shift the desired portion of the image into view when required (see the sketch below)
  3. Put any JavaScript required for desired effects (e.g. transparency, sliding effects, support for browsers that don’t support standards as we would like) in an external file
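
Continuing the hypothetical sketch above, steps 1 and 2 replace the separate images with a single sprite and simply shift it with background-position, so every state is downloaded (and cached) the moment the menu first renders:

    /* one image, e.g. /img/nav-sprite.gif, holding every button in every state:
       each button is a 120px-wide column, each state a 40px-tall row */
    #nav li a { background-image: url(/img/nav-sprite.gif); background-repeat: no-repeat; }
    #nav li.work a       { background-position: 0 0; }
    #nav li.work a:hover { background-position: 0 -40px; }       /* hover row of the sprite */
    #nav li.blog a       { background-position: -120px 0; }
    #nav li.blog a:hover { background-position: -120px -40px; }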

In the previous three steps, we’ve:

  1. Reduced the latency required to load the main navigation imagery and the overall render time for a given page
  2. “Pre-Loaded” and cached the other anchor states for the navigation without using any client side scripting
  3. Cached the JavaScript for the navigation by making it external (the same is obviously true for the CSS), improving the render time for subsequent page views

Now apply a few more techniques to the site as a whole:

  1. Take advantage of the compression support of popular browsers and compress JavaScript and CSS so that they can be sent instead of the larger uncompressed versions
  2. Combine our CSS files and JavaScript files respectively, similar to the combining technique for the navigation imagery, to reduce latency (see the sketch after this list)
  3. Cache these compressed versions of the combined files on the server so that every page view requested doesn’t require the web server to have to prepare the same files over-and-over on-the-fly. Instead the server can send static files immediately (which it can do with tremendous ease).
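
A hypothetical before-and-after of the document head shows the combining step; serving the combined files pre-compressed (e.g. a static combined.css.gz sent with Content-Encoding: gzip to browsers that advertise support for it) then becomes a server configuration detail:

    <!-- before: several round trips (and freshness checks) per page view -->
    <link rel="stylesheet" type="text/css" href="/css/layout.css" />
    <link rel="stylesheet" type="text/css" href="/css/typography.css" />
    <script type="text/javascript" src="/js/nav.js"></script>
    <script type="text/javascript" src="/js/forms.js"></script>

    <!-- after: one stylesheet and one script, each cacheable and pre-compressed on the server -->
    <link rel="stylesheet" type="text/css" href="/css/combined.css" />
    <script type="text/javascript" src="/js/combined.js"></script>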

With the various techniques we all apply to our projects, just adding a few more steps of optimization greatly improves the user experience.

Make it Your Own
Standards simply help us agree on what markup is intended to do and how its elements work together to describe content; web site optimization picks up where web standards leave off. The W3C encourages us to use markup to describe the content and to separate the presentation and functionality from the markup as much as possible. Once we get used to the idea, our time is best spent optimizing our code to work in the real world. I’ve intentionally left out the “how” because that’s an ongoing debate whose conclusions are at best situational. There are quite a few frameworks out there that help developers apply many of these principles to their projects right out of the box, but it’s not too difficult to build your own framework for your own style of work.

So what’s the final word? Well, similar to the stance that Ethan Marcotte put forward, I suggest that web standards be the baseline we use to optimize sites to perform for the targeted user agents. One day it may be easier to leverage standards to achieve a predictable user experience across all user agents, but for now it’s best to have more skills and mastery than are required to render a job well done.

Link Baiting with Tools

I was fortunate enough to be a guest for the first time on The Alternative, hosted by Jim Hedger and Dave Davies, and we explored the who, what, why, where and when of building tools and the purposes of link bait. We also let the cat out of the bag about a few tools we’ll be releasing shortly. Feel free to check it out and enjoy. Also in on the session was Jeff Quipp from Search Engine People, who is also running a very exciting contest (with a $1,000 prize) that I encourage everyone to participate in!

Google Analytics Steps Up

While the free tools that Google provides are seen as questionable at best in the “don’t be evil” debate (about Google’s ultimate intentions and uses for the data), one thing is definitely clear: webmasters appreciate the added insight into the goings-on of their sites, enjoy the interface provided into their data and, ultimately, find Google’s tools and their integration (namely Google Analytics and Google AdWords) definitely *convenient*. Some oldies, some newies, but all relevant; Google Analytics steps up with:

  • Easy Implementation
  • Keyword and Campaign Comparison
  • Create Custom Dashboards
  • AdWords Integration
  • Trend and Date Slider
  • E-commerce Tracking / Funnel Visualization
  • Email reports
  • Improved Site Overlay / Heat Mapping
  • Improved Traffic Segmentation e.g. GeoTargeting
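
The “Easy Implementation” point boils down to pasting one small snippet into each page; at the time of writing it looked roughly like this (the UA-XXXXXX-X account number is a placeholder):

    <script src="http://www.google-analytics.com/urchin.js" type="text/javascript"></script>
    <script type="text/javascript">
        _uacct = "UA-XXXXXX-X";
        urchinTracker();
    </script>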

All of these are extremely handy, especially when you have the ability to track the results on your own to make sure everything is kept honest. I’m looking forward to a hands-on test drive. Check out the tour now.

5 Principles to Maximize Conversion Rate & Usability

Dan Thies over at SEO Research Labs has pointed out a remarkable video by Andy Edmonds. He and his team have used statistical analysis to study how the eye and brain process information while interacting with web sites!

First a definition:
“Foveal view”: the area of visible space where the user is best able to focus with maximum detail. The point here is that outside of the focal area the eye (and therefore the mind) does not perceive color or as much detail. Understanding this concept cascades into the takeaways that follow.

Now some highlights from Andy’s portion of the video + my two cents:

  1. The traditional marriage to 800×600-optimized design is really on its way out (as many people have noticed looking at their site statistics). Wider-screen layouts not only bring more content above the fold, reducing the amount of scrolling required to use a page, but they also complement a user’s natural behavioral desires while using a site.
  2. Page elements should be organized in such a way that relevant blocks of information are near each other, so that the brain can make logical associations and accurately assess relevance while scanning a page.
  3. “Information blocks” should be wider than tall for easiest consumption; again, this is in step with the wider-layout point above.
  4. Typography & whitespace use (contrast) are also as important as ever; when properly used they create a guide that leads the eye through blocks of content in the body of a page or in navigational areas.
  5. Group navigation items to contain 7 +/- 2 options per group. This avoids forcing the user to stop and process the information. In other words, use this principle to create at-a-glance usability in your navigation, which is vital to conversion.

Heat mapping sites like the following are useful in understanding the result of the eye/brain interaction. Use the insight above to review your design and your heat map results to identify problem areas in your user interface design. Here are some popular tools:

Let’s not forget that Google Analytics (Urchin) is also useful here: its “Site Overlay” view shows which anchors are clicked most on your site.

However, what we’ve long called “EBO” or Eyeball Optimization is explained masterfully by Andy — Well done!

I’m not sure how long that video will stay in place, so here’s the “permalink.”

Keys to Consistent CSS

Eric Meyer has done it again (yes, I’m a cult follower). It was awesome to sit through the live walk-through of most of the principles that Eric presented in his final version.

What Eric has decided to do, with the support of many interested participants, is create a baseline for many of the HTML elements that behave inconsistently from browser to browser. The result is a fantastic snippet of code that removes the subtleties that often cause anomalies in the rendering of pages in Internet Explorer 6/7 (and in other browsers too).

For those that just want to see the code:
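
An abbreviated rendition of the reset (the canonical, full version lives on Eric’s site; treat this as a sketch rather than the exact final snippet) looks like this:

    html, body, div, span, h1, h2, h3, h4, h5, h6, p, blockquote, pre,
    a, abbr, acronym, address, cite, code, del, dfn, em, img, ins, kbd, q,
    samp, small, strong, sub, sup, var, b, i,
    dl, dt, dd, ol, ul, li,
    fieldset, form, label, legend,
    table, caption, tbody, tfoot, thead, tr, th, td {
        margin: 0;
        padding: 0;
        border: 0;
        font-size: 100%;
        vertical-align: baseline;
    }
    body { line-height: 1; }
    ol, ul { list-style: none; }
    blockquote, q { quotes: none; }
    table { border-collapse: collapse; border-spacing: 0; }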

You can see that nearly every element is considered above and is “reset” to values that provide sure bedrock for styling a document.

I suppose I should also mention another great tip from Eric while on the topic of consistency, and this one points to consistency between the CSS “functionality” of Internet Explorer 6 and Internet Explorer 7. Dean Edwards put together great JavaScript code which enables coders to focus on CSS production for IE 7 and not have to worry about support for behavior that doesn’t exist in IE 6; it’s definitely worth a look.
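
For reference, such a script is typically pulled in only for older versions of Internet Explorer via a conditional comment (the path below is a placeholder for wherever the file is hosted):

    <!--[if lt IE 7]>
    <script type="text/javascript" src="/js/ie7-fixes.js"></script>
    <![endif]-->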

Interweb Evolution

Many of you out there have seen this already, but I had to point to something at good old YouTube that’s simply well done and insightful. With all of the confusing content out there and controversial definitions, it’s great to be able to sit back and watch the story of the interweb’s evolution unfold in such a meaningful presentation (it reminds me quite fondly of the evolution web designers themselves made as we embraced web standards and CSS-based web design). Check it out below or at YouTube.