Maintaining web site performance today means monitoring and managing content and traffic that’s outside a retailer’s control.
There was a time when a consumer’s Internet browser simply displayed data stored on a retailer’s web server. If the page took a long time to load, or didn’t render properly, there were only a few places to look for the problem.
But that’s not the case today, when a typical e-commerce site may be drawing content from several outside servers while being bombarded by requests from scores of search engines and comparison shopping sites.
That means e-retailers have to be aware of and manage a lot of activity that goes on outside of their own data centers. Failure to manage that outside activity can slow e-commerce site performance, a problem cited by 22% of web users in a Forrester Research survey last year.
Lots of bots
One issue affecting online retail sites is the stream of requests from search engines, comparison shopping engines and other automated systems that constantly ping sites to keep updated on what products retailers are offering and at what prices. Those bots account for about a third of the traffic for CSN Stores Inc., which operates more than 200 sites that sell housewares, furniture and other items, says Steve Conine, chairman.
“At times more than half of the traffic hitting our sites is from bots, not even from people,” Conine says. “It’s significant because you’ve got to have the infrastructure to handle requests, and some of that is to handle requests from bots.”
CSN’s solution was to set aside a portion of its bandwidth for the automated bots, reserving most of its data-serving capacity for human visitors. “We meter the rate at which we serve content to bots, giving them a smaller straw to suck through, while we give customers the more open pipe to hit,” Conine says.
Most of the bot traffic comes from legitimate automated systems, such as the major search engines operated by Google, Yahoo and Microsoft, and those systems identify themselves in their request headers, making it possible to segregate most of the bot traffic, he adds. Hackers, however, also use bots to probe e-commerce and other web sites for vulnerabilities.
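The approach Conine describes can be sketched in a few lines: classify each request by its User-Agent header, then meter bots through a small token bucket while customers get a much larger one. This is a minimal illustration, not CSN's actual implementation; the bot list, rates and function names are assumptions for the example.

```python
import time

# Common crawler User-Agent substrings (illustrative, not exhaustive).
KNOWN_BOTS = ("googlebot", "bingbot", "msnbot", "slurp")

def is_bot(user_agent: str) -> bool:
    """Classify a request as bot traffic by its User-Agent header."""
    ua = user_agent.lower()
    return any(token in ua for token in KNOWN_BOTS)

class TokenBucket:
    """Token-bucket rate limiter: the 'straw' a class of traffic sucks through."""
    def __init__(self, rate: float, capacity: float):
        self.rate = rate          # tokens refilled per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self, cost: float = 1.0) -> bool:
        """Refill based on elapsed time, then try to spend `cost` tokens."""
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= cost:
            self.tokens -= cost
            return True
        return False

# "Smaller straw" for bots, "more open pipe" for customers (rates are made up).
bot_bucket = TokenBucket(rate=5, capacity=10)
human_bucket = TokenBucket(rate=500, capacity=1000)

def admit(user_agent: str) -> bool:
    """Decide whether to serve a request now, by traffic class."""
    bucket = bot_bucket if is_bot(user_agent) else human_bucket
    return bucket.allow()
```

In practice this kind of metering usually lives in the load balancer or web server tier rather than application code, but the logic is the same: identify the class of traffic first, then apply a per-class budget.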
CSN’s fix improved performance for customers, says Conine, who would not quantify the improvement. He measures the experience of consumers on his site using a TrueSight IM monitoring device from Coradiant Inc., which captures site traffic, analyzes it and provides reports on how the site is performing, including by category of users. He says that device cost about $100,000.
CSN is also an early user of a Coradiant technology that addresses another problem stemming from e-commerce sites' growing reliance on outside companies. CSN, like many retailers, uses a content delivery network, in this case Akamai Inc., to store images and other site content on servers around the world, accelerating delivery of that content to consumers visiting CSN’s sites.
Gaining an edge
The problem: how does CSN know how quickly a customer sees that content when Akamai, not CSN’s own servers, is delivering it? Coradiant introduced a product this summer called TrueSight Edge that monitors traffic from Akamai servers to consumers. “It gives CSN Stores visibility into web traffic patterns and performance for sites that are offloaded by Akamai, giving us centralized system monitoring and reporting for all traffic data,” says Dan Rowe, infrastructure architect at CSN.
Another way outside companies can affect performance is by placing pixels onto an online retailer’s site. Those pixels may track activity, such as clicks on ads or sales that originated on an affiliate site, or they may be placed by an e-retailer’s analytics vendor or by an outside service that helps a retailer perform A/B tests on its web pages.
Under Armour Inc., a manufacturer of athletic gear and apparel that sells online, discovered by analyzing reports from Gomez Inc., which monitors its e-commerce site’s performance, that some of those pixels were loading slowly, increasing the time it took for visitors to see the complete page.
“At any time there may be eight or 10 pixels being tracked on a page,” says Brian McManus, director of e-commerce at Under Armour. “If a page is loading in six to eight seconds, and two seconds of that is because of a pixel, it can be detrimental to our conversion and the consumer shopping experience.”
To prevent such problems, Under Armour negotiated agreements requiring each company placing pixels on its site to deliver those pixels within specified times. The acceptable time varies case by case, says Mark Kuhns, vice president of global direct, who adds that Under Armour’s aim is to rank in the top 5% of e-retailers on performance. “We don’t want these technologies to have a cumulative effect that won’t let us be best in class,” Kuhns says.
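Enforcing agreements like these means comparing each vendor's observed pixel load time against its negotiated budget. A minimal sketch, assuming hypothetical vendor hostnames and SLA thresholds (none of these figures come from Under Armour):

```python
# Hypothetical per-vendor SLA budgets, in seconds (illustrative values).
PIXEL_SLAS = {
    "analytics.example.com": 0.5,
    "affiliate.example.net": 0.3,
    "abtest.example.org": 0.4,
}

def flag_sla_violations(observed: dict) -> dict:
    """Return the vendors whose measured pixel load time exceeded their budget.

    `observed` maps a vendor hostname to its measured load time in seconds,
    e.g. as pulled from a monitoring service's per-object timing report.
    Vendors with no negotiated budget are never flagged.
    """
    return {
        host: seconds
        for host, seconds in observed.items()
        if seconds > PIXEL_SLAS.get(host, float("inf"))
    }
```

A report like this, run against per-object timings from a monitoring service, gives a retailer concrete evidence to take back to the vendor when a pixel blows its budget.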
The service-level agreements have had an impact, McManus says. He says overall site performance was about 33% better by late summer than it was six months earlier.
Many retailers face the problem of site performance suffering because of third-party pixels, says Ben Rushlo, senior manager of web performance at performance-monitoring company Keynote Systems Inc. He advises retailers to generally put pixels at the bottom of the code for a page, even though that means an action might not be tracked if the visitor clicks off a page before it fully loads.
“The user is not coming to the site to get tracked, but to buy something,” Rushlo says. “Let’s not let the analytics or media tracking group run the business.”
Other retailers pick and choose where they put the tracking pixels, for instance placing the ones they value most, such as analytics tracking codes, higher up on the page and less important pixels further down. That way retailers can be sure to get the data they need most without slowing down page load times. It’s an example of how retailers are retaining control of web site performance, even when content is coming from the outside.
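That prioritization amounts to ordering the tag list before the page template emits it: the highest-value pixels render earliest, the expendable ones last. A small sketch with made-up pixel names and priorities:

```python
# Illustrative tag inventory; lower priority number = more valuable data.
PIXELS = [
    {"name": "affiliate_tracker", "priority": 3},
    {"name": "analytics", "priority": 1},
    {"name": "ab_test", "priority": 2},
]

def order_for_page(pixels: list) -> list:
    """Return pixel names in page order: most valuable tags highest on the page,
    so their data is captured even if the visitor leaves before the page
    finishes loading; low-value tags go to the bottom of the markup."""
    return [p["name"] for p in sorted(pixels, key=lambda p: p["priority"])]
```

A page template would then emit the tags in this order, accepting that the last few may never fire for visitors who click away early.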