We depend on standards and conventions all the time, often without realizing it, at least until they are not followed or understood. In the U.S., a raised hand, palm facing out, means stop. In Iraq, it means “greetings,” “hi,” or “hello.” When a U.S. soldier raises his hand to stop a car full of Iraqis, the Iraqis think he is waving hello and keep on driving, often to their detriment.
In accounting there are definite conventions regarding the classification of money and resources: revenue and expense, asset and liability accounts are clearly differentiated. These conventions are essential for meaningful financial measurement and reporting. Executives who don’t follow them can, as Enron executives found out, end up in jail. While not having or not following conventions and standards in web measurement won’t get you killed or land you in jail, it will make web measurement and analysis more difficult and prevent you from maximizing its benefits.
Except for some very general measures such as page views and sessions (and even these are not always defined the same way), there are no generally accepted web measurement standards and conventions across web sites. Even within a single web site, there are more often than not no clear, well-followed standards. Looking at URL names is frequently not unlike an archeological excavation: directory “x” was used when Joe was webmaster; directory “y” was used when Jane was webmaster. What’s wrong with that? It makes some of the most meaningful analyses difficult, if not impossible.
For sites with a large number of pages, which includes most big sites, analysis on an individual page basis is not meaningful, except for pages like the home page. There are just too many pages. What is meaningful, however, is analysis by content group. An online brokerage site wants to know whether information on fees is more effective than information on trading tools in getting prospects to sign up. If all the “fee” pages are not in a single directory and all the “tool” pages in their own directory, it is difficult if not impossible to compare “fee” and “tool” performance, especially if measurement analysts are not informed when a new page is added that does not adhere to the standard.
And then there are links. If you don’t know how links are used, you haven’t a chance of knowing how well or how poorly your site’s navigation is working. More importantly, you don’t know how to improve it. I’ve seen sites where all of the links on a page were named “View Details,” impossible to differentiate, impossible to improve.
Because most major web metrics tools are based on actual names (URLs, links, etc.), it is difficult to retrofit measurement onto a site without a naming convention. On the other hand, because web 1.0 sites do have names for their pages and links, there is at least something that can be measured, even if it is not optimal.
With web 2.0 rich-internet-application (RIA) techniques such as Ajax or Flash, if nothing is done to implement measurement, there will be no measurement. A site implementing an RIA solution starts with a clean measurement slate, and that slate stays empty if nothing is done. So with web 2.0 you must take some action if you want measurement. There is no default, not even a bad one. If you have to do something, you might as well do it right.
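To make this concrete, here is a minimal sketch of why Ajax interactions are invisible by default: nothing reaches the analytics tool unless the page code explicitly records it. The `track` function and the in-page queue below are hypothetical illustrations, not any particular vendor’s API.

```javascript
// Hypothetical in-page event queue; a real measurement tool would
// send these records to a collection server instead of storing them.
const measurementQueue = [];

function track(basePage, eventName) {
  // Nothing is recorded unless this call is made explicitly.
  measurementQueue.push({ basePage: basePage, eventName: eventName, at: Date.now() });
}

// An Ajax-style handler that updates the page without navigating.
// Remove the track() call and this interaction leaves no trace at all.
function onSortColumn(column) {
  // ... re-render the report client-side ...
  track("reports/portfolio", "sort:" + column);
}

onSortColumn("price");
console.log(measurementQueue.length); // 1 event recorded
```

The point of the sketch is the asymmetry: in web 1.0 a page request was logged whether you planned for it or not; here, forgetting the call means the event never existed as far as measurement is concerned.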
What’s right? First of all, follow the KISS (keep it simple s…..) principle. Establish simple, clear standards/conventions for naming and categorizing. If they’re not simple and clear, no one will follow them. Make sure they can measure what’s needed today, but are flexible enough to accommodate future requirements that are not yet known. Make sure that everyone involved with your web site, from designers to programmers to QA, is aware of, buys into, and commits to the standards. Make sure that there is a process in place that supports their use.
While every site will have different requirements, there are a few basic things that need to be tracked:
- Base pages. In web 1.0, almost everything a viewer saw was a page. In web 2.0, pages are more like platforms on which events take place. So while there are potentially not as many pages, they are important. An event that takes place on one base page needs to be differentiated from an event that takes place on another base page;
- Events. Events are actions that take place on a base page without going to another base page. Event names should have two parts: first, the name of the base page on which the event occurred; second, a name that is descriptive and consistent across the entire site. For example, deleting a column in a report would have “delete” in the name as well as the column being deleted;
- Links. Links are actions that take viewers from one page to another (either on the site or off the site). The link name should consist of the from-page name (the page on which it originates), the to-page name (the page to which it takes a viewer), and a link name that is descriptive and consistent across the entire site. The combination of these three elements should be unique across the site;
- Content Area. All pages should belong to a content area so that they can be meaningfully combined and compared and usage statistics correctly aggregated. Content areas can be either hierarchical or flat, depending on site requirements. (For a more detailed discussion of content types, check out Functionalism at http://www.semphonic.com/functionalism.)
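The naming conventions above can be sketched as a couple of small helpers. The function names, the delimiters, and the sample page names below are illustrative assumptions, not a standard; the structure (base page + action for events, from-page + to-page + label for links, with the combination unique site-wide) is what matters.

```javascript
// Event name: the base page plus a site-wide descriptive action name.
// The ":" delimiter is an assumption for illustration only.
function eventName(basePage, action) {
  return basePage + ":" + action;
}

// Link name: from-page + to-page + a descriptive label.
// The combination of the three parts should be unique across the site.
function linkName(fromPage, toPage, label) {
  return fromPage + ">" + toPage + ":" + label;
}

// A simple QA-style check that no two links share the same full name,
// so "View Details" links on different pages remain distinguishable.
function assertUniqueLinkNames(links) {
  const seen = new Set();
  for (const link of links) {
    const name = linkName(link.from, link.to, link.label);
    if (seen.has(name)) {
      throw new Error("duplicate link name: " + name);
    }
    seen.add(name);
  }
}

console.log(eventName("reports/portfolio", "delete-column:price"));
// -> "reports/portfolio:delete-column:price"
```

A check like `assertUniqueLinkNames` is the kind of thing that belongs in the QA process mentioned above: it catches a new page or link that doesn’t adhere to the standard before it pollutes the data.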
I’m sure there are lots of things I’ve forgotten, overlooked, and omitted. I welcome all comments and suggestions. There is still a lot of thinking and work to be done to effectively measure and analyze web 2.0 sites.
Joel,
I think this is right on the money. The all-Flash sites I've worked on make almost no use of the out-of-the-box "Popular Pages" reports or "Link" reports -- or even referrers or time on site, which are warped by the Flash phenomenon -- because they're meaningless or flat-out wrong. All real measurement happens in custom variables where a rigid hierarchy of action-events had been planned well ahead of time and (in most cases) implemented and QA'd sufficiently. But I think it might be hard to establish a standard of what this hierarchy might be for every Flash site, since they are all so different. I guess KISS is probably the best we can do...
Posted by: paul legutko | October 22, 2007 at 03:29 PM