When you don’t know what you’re looking for, anything can look interesting. Retailers should measure what is useful, says Forrester.
When manufacturer Crayola moved beyond hunches to professionally designed experiments to test how different e-mail campaign elements affected results, it pinpointed the yield from each combination with a new level of accuracy, according to a new report from Forrester Research Inc. Different combinations of salutation, offer, and closing produced e-mail campaign returns ranging from 9.7% to 33.7%.
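Testing every combination of campaign elements is the essence of a full-factorial design. A minimal sketch in Python, using hypothetical element variants (the report does not name the actual salutations, offers, or closings Crayola tested):

```python
from itertools import product

# Illustrative placeholders, not Crayola's actual campaign variants.
salutations = ["Dear Customer", "Hi {first_name}", "(none)"]
offers = ["10% off", "Free shipping"]
closings = ["Shop now", "See what's new"]

# A full-factorial design tests every combination of every factor level,
# so the yield of each combination can be measured directly.
design = [
    {"salutation": s, "offer": o, "closing": c}
    for s, o, c in product(salutations, offers, closings)
]

print(len(design))  # 3 * 2 * 2 = 12 test cells
```

Each of the 12 cells would be sent to a randomly assigned slice of the mailing list, and the response rate measured per cell.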
Such designed experiments and conjoint analysis, research techniques developed for industrial quality control and marketing, will allow retailers to make the best use of the increasing volumes of customer data they’re gathering from web site analytics, says Forrester analyst Bob Catham. “Too often, Forrester clients say they’re drowning in data,” he says, adding that this indicates an inefficient approach to analytics. “When you don’t know what you’re looking for, almost anything can look interesting.”
In addition to engaging research experts to design experiments that analytics can be applied to, Forrester urges web site operators to start the process with a hypothesis that can be broken down into individually measurable elements. Under that approach, a site operator’s general observation that customers perceive a site as faster when text loads before images becomes a round of structured testing. That lets site operators see the impact of multiple factors, such as server load, content position, color, and font size, all at once in a statistically valid way, says Catham.
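Seeing the impact of several factors at once typically means computing each factor’s main effect from a coded experiment. A minimal sketch, with made-up response numbers for illustration (real values would come from live site measurements):

```python
# Two-level, two-factor experiment. Factors are coded -1/+1:
# factor 0 = load text before images (off/on),
# factor 1 = larger font size (off/on).
# Third element of each run is the observed conversion rate in percent
# (illustrative values only).
runs = [
    (-1, -1, 10.0),
    (+1, -1, 18.0),
    (-1, +1, 12.0),
    (+1, +1, 30.0),
]

def main_effect(runs, factor_index):
    """Average response at the high level minus average at the low level."""
    high = [r[2] for r in runs if r[factor_index] == +1]
    low = [r[2] for r in runs if r[factor_index] == -1]
    return sum(high) / len(high) - sum(low) / len(low)

print(main_effect(runs, 0))  # effect of loading text first: 13.0
print(main_effect(runs, 1))  # effect of larger font: 7.0
```

Because every factor varies across the same set of runs, one experiment yields an estimate for each factor, which is what makes testing “all at once” statistically efficient compared with changing one thing at a time.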
Those charged with measurement for business units can help stem the tide of useless data by asking that requests for measurement be framed as hypotheses to test, Catham adds. From there, it’s a shorter leap to applying the results directly in the operation of web sites and call centers.