From Start to Finish
Spotting hurdles in shoppers’ way, performance testing holds providers accountable
In terms of site performance, it looked like FootSmart.com was firing on all cylinders. According to monitoring from the backbone (the Internet’s distribution network), availability at the e-commerce site of Benchmark Brands Inc. was 100%, with full pages loading in under 1.5 seconds.
But Gavin Galtere, Benchmark’s applications development and network operations director, had concerns about what was happening at the user end, given a recent site redesign that had loaded pages with a lot more data.
A switch to web performance monitoring services from Gomez Inc. that included so-called last-mile monitoring, a service Benchmark hadn’t been getting from its previous provider, confirmed Galtere’s suspicions within a day of implementation: initial home page availability for visitors on dial-up connections was only 82%.
With as many as 65% of FootSmart’s customers reaching the site via dial-up, that represented a significant potential loss of revenue. Last-mile monitoring also showed that visitors accessing other entry points to the site via search were successful on initial log-on only 92% of the time. The findings were enough to loosen funds from senior management to address the problem, ultimately through the services of an outside content delivery network. “I was able to take those reports, put them in front of our CEO and VP of operations, and say, ‘This is what I have suspected and here’s the proof,’” says Galtere.
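The article doesn’t describe how Gomez computes its last-mile figures, but the kind of availability number it reports can be approximated with a scripted probe that fetches a page repeatedly and counts failures. A minimal sketch (the URL and probe counts are illustrative, not FootSmart’s actual configuration):

```python
import time
import urllib.request
import urllib.error

def probe_availability(url: str, attempts: int = 20,
                       timeout: float = 30.0, interval: float = 1.0) -> float:
    """Fetch a page repeatedly and return the fraction of successful loads.

    A timeout or connection error counts as unavailable, which is roughly
    how a dial-up visitor would experience a page that never finishes.
    """
    successes = 0
    for i in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                if resp.status == 200:
                    successes += 1
        except (urllib.error.URLError, OSError):
            pass  # unreachable or timed out: counts against availability
        if i < attempts - 1:
            time.sleep(interval)  # space the probes out
    return successes / attempts

# Illustrative: an 82% availability figure would come back as roughly 0.82.
# probe_availability("https://www.example.com/", attempts=50)
```

A real last-mile service runs agents on consumer connections in many geographies; this sketch only shows the counting logic from a single vantage point.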
FootSmart.com’s experience is just one example of the evolution in how online retailers are evaluating their customers’ experience by measuring the performance of the technology that supports it. Monitoring of availability and response time of sites has moved past measuring at the backbone to measuring from the user interface. And beyond the lead metrics of page download speed and availability, applications monitoring is tracking the performance of discrete components that affect those metrics.
Within the enterprise, other types of testing, software and services pinpoint and cut resolution time on content delivery errors that show up at the user end. Some online operators are even making the data on how their technology is supporting the user experience do double duty: They’re using the benchmarked data to draft service-level agreements that set contractual standards on site performance. Those standards apply both between a company’s business and IT departments and between the company and its outside technology providers.
Revealing the opportunities
FootSmart.com turned the minus revealed by its last-mile monitoring into a plus, using additional performance monitoring. “We realized we had a huge opportunity in terms of low-hanging fruit to be able to lift our top line,” says Galtere.
It turned out that much of the page weight creating the problem for dial-up users was in new code written into the pages for tracking and reporting purposes. But eliminating too much of that code in an effort to lighten pages would mean losing that functionality. To find a solution, Galtere used Gomez to run a head-to-head test of the performance of web site pages on which code had been compressed against pages served up by an outside content delivery network.
The outside network, Akamai Technologies Inc., ultimately improved performance more than the in-house code compression efforts, he found. The data gave Galtere what he needed to secure funding approval to bring in Akamai, which FootSmart was already using for limited content caching, on an expanded basis.
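The page-weight arithmetic behind Galtere’s head-to-head test is easy to sketch. The article doesn’t say which compression technique FootSmart tried, so gzip is assumed here purely for illustration; the point is how transfer size translates into dial-up download time:

```python
import gzip

def page_weight_report(html: bytes, dialup_bps: int = 56_000) -> dict:
    """Estimate transfer size and dial-up download time, raw vs. gzip.

    dialup_bps defaults to a 56k modem's nominal line rate; real-world
    throughput is lower, so these times are optimistic lower bounds.
    """
    compressed = gzip.compress(html)
    return {
        "raw_bytes": len(html),
        "gzip_bytes": len(compressed),
        # bytes * 8 bits per byte / bits per second = seconds on the wire
        "raw_seconds_at_56k": len(html) * 8 / dialup_bps,
        "gzip_seconds_at_56k": len(compressed) * 8 / dialup_bps,
    }
```

Markup with repetitive tracking snippets compresses especially well, which is one reason compression helped; a content delivery network attacks the same problem from a different angle, by shortening the network path instead of shrinking the payload.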
Within three months of the full-time implementation of last-mile testing from Gomez and content delivery services from Akamai, FootSmart.com’s sales increased 10%. The shopping cart abandonment rate dropped by 7%, home page dial-up downloads went from about 50 seconds to 12 seconds and home page availability on dial-up climbed from 82% to 99.6%. Galtere notes that due to other initiatives such as online marketing campaigns launched during that time, the new technology implementation can’t receive credit for the entire 10% lift, but he estimates it’s responsible for about 3% of the increase.
One factor affecting web site performance and the speed with which customers can call up pages and complete online transactions is the increased complexity of sites. A transaction such as an online purchase may be composed of multiple applications and unlike e-retail’s earlier days, many of those applications may be integrated with or imported from outside providers, points out Pete Cruz, director of product management, enterprise solutions group, at performance applications testing services provider Empirix Inc.
Variety of testing
A purchase may start with a customer log-in, for example, which requires a user authentication application, Cruz notes. Another application serves up the product page the user requests. Adding an item to the shopping cart, going through checkout, requesting credit validation and executing shipping require other applications, some of which may be pulled in via web service calls to outside technology providers. The resulting process represents, Cruz says, “many points of potential failure.”
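One way to make those “points of potential failure” visible is to script the transaction as an ordered chain of checks and stop at the first one that breaks, so the alert names the failing application rather than just the failing purchase. The step names and check functions below are hypothetical stand-ins, not Empirix’s actual product:

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class StepResult:
    name: str
    ok: bool
    detail: str = ""

def run_transaction(steps: List[Tuple[str, Callable[[], bool]]]) -> List[StepResult]:
    """Run each step of a scripted transaction in order.

    Stops at the first failure, so the last entry in the returned list
    identifies which application in the chain broke.
    """
    results: List[StepResult] = []
    for name, check in steps:
        try:
            ok = check()
        except Exception as exc:  # a crash in one step also pinpoints it
            results.append(StepResult(name, False, str(exc)))
            break
        results.append(StepResult(name, ok))
        if not ok:
            break
    return results

# Hypothetical usage: each lambda would wrap a real HTTP request or
# web service call (log-in, product page, cart, credit validation...).
# run_transaction([("login", do_login), ("product_page", load_page),
#                  ("add_to_cart", add_item), ("checkout", submit_order)])
```

Because later steps often depend on state created by earlier ones (a session token from log-in, a cart ID), aborting on first failure also avoids a cascade of misleading secondary errors.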
That’s one reason e-retailers such as the Vermont Teddy Bear Co. monitor and test site performance in a variety of ways. For its four e-commerce web sites, Vermont Teddy Bear depends on vendor AlertSite to monitor uptime and load time, and also to monitor performance of a couple of key web site applications. Alerts on any problems go to IT staff’s e-mail in-boxes or pagers. “They’re able to tell you whether there is content on the page served and whether users are getting an error message,” says webmaster Tom Funk.
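The article describes the check only at the level Funk states, content present versus error message, but that style of check can be sketched. The alert callable below stands in for whatever e-mail or pager gateway the IT staff uses; everything beyond the two checks Funk mentions is an assumption:

```python
import urllib.request
import urllib.error
from typing import Callable

def check_page(url: str, must_contain: str,
               alert: Callable[[str], None], timeout: float = 10.0) -> bool:
    """Verify a page loads and actually carries expected content.

    On failure, invoke the alert hook (stand-in for an e-mail/pager
    gateway) with a message saying what went wrong.
    """
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            body = resp.read().decode("utf-8", errors="replace")
    except (urllib.error.URLError, OSError) as exc:
        alert(f"{url} unreachable: {exc}")
        return False
    if must_contain not in body:
        # Page answered, but the marker text is absent: likely an
        # error screen or an empty template was served instead.
        alert(f"{url} loaded but expected content missing")
        return False
    return True
```

Run from a scheduler every few minutes, a check like this distinguishes a site that is down from one that is up but serving error pages, the difference Funk highlights.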
Such basic monitoring may cost Vermont Teddy Bear in the range of $100 per month, but simulated, scripted load testing performed to order taps different vendors and may carry a price tag of a few thousand dollars. So Vermont Teddy Bear saves it for special circumstances, such as testing the performance of a site in development, or testing application performance on a new e-commerce platform before going live. In April, for example, the company used the services of hardware and software testing provider KeyLabs to preview how its recently acquired Calyx and Corolla floral web site would perform when it moved the site from its existing e-commerce platform to a new one, shortly before its busiest holiday, Mother’s Day.