How does all this tracking happen? Every one of the services we tested places JavaScript "tags" on each Web page tracked. When a visitor's browser loads a tracked page, this JavaScript downloads from the service, runs and communicates with the ASP's tracking servers. On large sites, tags generally are placed in the header or footer of every page by content-management software, as was the case in our tests.
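In outline, such a tag assembles a "beacon" URL carrying page data back to the ASP's collection servers. The sketch below shows the idea; the host name, parameter names and account ID are our own placeholders, not any vendor's actual format:

```javascript
// Minimal sketch of what a page tag assembles before calling home.
// The account ID, parameter names and tracker host are hypothetical;
// each ASP defines its own format.
function buildBeaconUrl(accountId, pageUrl, referrer) {
  const params = [
    "acct=" + encodeURIComponent(accountId),
    "page=" + encodeURIComponent(pageUrl),
    "ref=" + encodeURIComponent(referrer || ""),
  ];
  // In the browser, the tag would typically request this URL as a
  // 1x1 image or script element, delivering the data to the ASP.
  return "https://tracker.example.com/collect?" + params.join("&");
}
```

Because the beacon is an ordinary HTTP request, the ASP needs no access to the site's own servers or logs.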
The services place cookies on visitors' machines to identify them. Cookies provide session, return-visit and similar metrics. How effective are cookies? Very. Vendors say--and we concur based on our tests--that only about 2 percent of the browsing public's workstations have cookies disabled.
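The cookie logic is simple: if the tag finds its identifier cookie, this is a return visit; if not, it mints a new ID and stores it. A sketch, with a hypothetical cookie name:

```javascript
// Sketch of how a tag might tell new visitors from return visitors.
// The cookie name "visitor_id" is our placeholder, not a vendor's.
function getOrCreateVisitorId(cookieString) {
  const match = /(?:^|;\s*)visitor_id=([^;]+)/.exec(cookieString || "");
  if (match) {
    // Cookie already present: a return visit by a known visitor.
    return { id: match[1], returnVisit: true };
  }
  // First visit: mint a new ID. A real tag would then write it back
  // via document.cookie with a far-future expiry so later sessions
  // present the same identifier.
  const id = Date.now().toString(36) + Math.random().toString(36).slice(2, 8);
  return { id: id, returnVisit: false };
}
```

If the visitor has cookies disabled, the write silently fails and every visit looks like a first visit, which is why that 2 percent matters.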
This tagging not only identifies visitors, it also enables site segmentation. When an enterprise Web presence comprises various collaborating sites--a common setup--tracking browsing behavior to and from those sites is important. For example, our testing took as a model of complex segmentation the 150 Web sites run by CMP Media LLC, which publishes Network Computing. We measure and manage our site, as do all our sister magazines, but corporate also needs the big picture of overall Web usage. And like any other organization, CMP has divisional perspectives, which might relate to different products, services or, in our case, magazine groups that are monitored and managed. Each, in analytic terms, is considered a segment. Generally, each segment needs a unique tag.
This means that a single site, such as www.nwc.com, tracked three ways--alone, as part of a division and as part of the corporate whole--needs three unique tags. Why not use a single tag for all of CMP and then parse out data for specific sites? The short answer is, each service provider manages its data process, and thus administration and reports, along the boundaries set by the tags. So in this single-tag example, a report of total visitors would reflect the total visitors for CMP. Of course, it seems reasonable to expect that a database query on a unique index, such as www.nwc.com URLs, would sort out data specific to Network Computing. However, this data first must be saved in a raw format for our publication, and then it must be processed just for us. That extra effort means higher costs for the service providers.
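Concretely, a page on www.nwc.com would carry one tag per reporting boundary, each firing its own beacon. The segment account IDs below are hypothetical placeholders:

```javascript
// Sketch of one page carrying three segment tags: the site itself,
// its division and the corporate whole. Account IDs are made up.
const SEGMENTS = ["NWC-SITE", "CMP-DIVISION", "CMP-CORPORATE"];

function beaconsForPage(pageUrl) {
  // Each segment is a separate account to the ASP, so each gets its
  // own beacon; a segment's reports cover exactly the pages tagged
  // with that segment's ID.
  return SEGMENTS.map(function (seg) {
    return "https://tracker.example.com/collect?acct=" +
      encodeURIComponent(seg) + "&page=" + encodeURIComponent(pageUrl);
  });
}
```

This is why the reporting boundaries follow the tags: each beacon lands in a different account's data set, and no cross-account query is needed.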
Multiple tags may seem excessive from a client browser's point of view, but our tests showed this isn't the case. Using the globally situated Gomez GPN and Last Mile Web performance services, which let us monitor availability and, more important, the specific download times of each tag between the Internet and user desktops (including broadband and dial-up last miles), we found tag overhead to be negligible--usually subsecond. (Read more on the Gomez service.)