One company says testing has helped increase time spent in its app by 22%.
Will app users like a home screen featuring new products, or would they be more apt to buy from an app that welcomes them with a list of items on sale? Will an app's Add to Cart button convert best in green or red? Now app operators, including online retailers, have a new tool for conducting such A/B tests, long popular with e-retail sites, to improve their apps' performance.
Launched today, Artisan Optimize lets marketers, retailers and publishers test and modify iPhone and iPad apps. Companies can run A/B tests on various aspects of their apps by delivering different experiences, such as changes in calls to action or images, to separate groups of consumers and then seeing which version of the app performs best.
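The core mechanic described here, splitting users into groups and consistently serving each group a different experience, can be sketched with deterministic hash-based assignment. This is a generic illustration, not Artisan's actual SDK; all function and experiment names below are hypothetical:

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants: list[str]) -> str:
    """Deterministically bucket a user into one variant.

    Hashing (experiment + user_id) keeps each user in the same
    group for the life of the test, so a given consumer always
    sees the same version of the screen.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# Example: the green-vs-red Add to Cart test mentioned above.
variant = assign_variant("user-42", "add-to-cart-color", ["green", "red"])
```

Because the assignment depends only on the user and experiment identifiers, no per-user state needs to be stored to keep the split stable between sessions.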
Using an Artisan feature called Test Designer, app operators can swap out or modify images, text and layouts and launch A/B tests. Artisan Optimize then collects data for each experiment, including the number of users and sessions, the percentage split of consumers who saw each feature, and the conversion rate for each group.
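The per-experiment metrics listed above, users per group, the percentage split, and each group's conversion rate, can be computed from raw event counts roughly as follows (an illustrative sketch, not Artisan's implementation):

```python
def summarize(groups: dict[str, dict[str, int]]) -> dict[str, dict[str, float]]:
    """Compute split percentage and conversion rate per variant.

    `groups` maps a variant name to {"users": n, "conversions": k}.
    """
    total_users = sum(g["users"] for g in groups.values())
    summary = {}
    for name, g in groups.items():
        summary[name] = {
            # Share of all test participants who saw this variant.
            "split_pct": 100.0 * g["users"] / total_users,
            # Fraction of this variant's users who converted.
            "conversion_rate": g["conversions"] / g["users"],
        }
    return summary

# Example: a 50/50 test where green converts at 12% and red at 9%.
stats = summarize({
    "green": {"users": 1000, "conversions": 120},
    "red": {"users": 1000, "conversions": 90},
})
```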
Companies implement the testing service through a software development kit, or SDK, that integrates with the app, the vendor says. They can also target tests by device type and model. For example, they can set up one test for consumers using an app on an iPhone 4 and another for those using an iPad 2. The service for now works only with Apple devices, but the vendor says it plans to extend the offering to devices running Google Inc.’s Android operating system soon. The fee ranges from $1,000 per month for one app, 25 tests and 25,000 monthly users to $10,000 per month for 100 apps, 5 million monthly users and an unlimited number of tests.
The rollout of Artisan Optimize follows a private beta launched in September and a public beta announced in December. More than 100 companies have tried the A/B app testing service, Artisan says.
One of those companies is A View From My Seat, which operates a web site and mobile app that enable visitors to browse consumer-submitted photos displaying the view from various audience sections at sports and entertainment venues around the world. Consumers capture and upload photos from events, which the company then categorizes and shares across its advertising-supported web site and through its app.
While A View From My Seat understood consumer behavior on its web site, it had very little insight into how consumers used its mobile apps, says CEO Frank Panko.
Because A View From My Seat relies on user photos, it places a high priority on making the uploading and sharing process as easy as possible, he adds. For most consumers that means being able to shoot an image from a smartphone and upload it to the web from the same device. While the company’s web site is a popular place for browsing photos, its mobile app offers the best channel for sharing images, Panko says.
However, A View From My Seat had trouble understanding whether uploading and sharing via the app was easy for consumers to figure out. On the web site, the company could see how many visitors returned regularly, which venues were most popular, what times of year were most active, and more. Through the mobile apps, however, it could see only the total number of downloads and any feedback posted to the app stores, the company says.
A View From My Seat deployed Artisan Optimize last year and says average time spent in the app is about 22% higher than before it began A/B testing. Additionally, photo submissions had increased by 3% to 5% as of November, compared with before it began using Artisan.
The company now runs app A/B tests continuously, using the information it gains to determine the features that are most valuable and how to increase app engagement. Often, the company runs several tests concurrently, evaluating text, images and layout across multiple screens, it says. It measures such things as social sharing, search activity and the effectiveness of different tab labels.
For example, during A/B testing, A View From My Seat found that starting users on the “share” tab results in more photo uploads and that including images doesn’t always lead to better results—sometimes simple text is more effective.
The company is currently using the tests to determine the best way to greet app users. When A View From My Seat runs a test using Artisan Optimize, one group might open the app to see a home screen with the image of a popular venue, while another might find the home screen showing a list of stadiums and arenas with links to seating sections. After analyzing the results of a test, A View From My Seat can finalize the layout version with the best results and push it out to all of its users.
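Deciding which home-screen version "wins" before pushing it to all users typically involves more than eyeballing raw rates. One common check (a standard statistical test, not something the article attributes to Artisan) is a two-proportion z-test on the groups' conversion counts:

```python
from math import sqrt

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """z-statistic for the difference between two conversion rates.

    Uses the pooled-proportion standard error, the usual form for
    an A/B test comparing two independent samples.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# A |z| above roughly 1.96 suggests the difference is significant at
# the 5% level, supporting a rollout of the winner to all users.
z = two_proportion_z(120, 1000, 90, 1000)
```

With 12% versus 9% conversion on 1,000 users each, the statistic comes out a bit above 2, which would justify finalizing the better-performing layout.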
Another benefit of the service, Panko says, is that when the company makes changes it doesn't need to recompile code, resubmit a new version of the app to the app store for approval and then persuade users to download it. "It takes one to two weeks after submitting an update to Apple before an app makes it to the store and becomes available to users," Panko says. "We want to be able to enhance the iOS experience now. If you had to re-submit the app after tests and changes, and you consider development time, this would be really time-consuming. Because we don't have to resubmit, we're running more tests, making more changes. The design look and feel is better. Tests are telling us what's right and wrong. We can send updates over the air right away; flip the switch, and make it permanent."