Before developing apps for mobile devices, companies should do their homework with security checks. That warning from experts comes after Google Inc.'s announcement last month that it had removed several potentially dangerous apps that were available for download on the Android Market.
Matt Bishop, a web and mobile security specialist and professor of computer science at the University of California, Davis, says a common way criminals release infected apps is by copying the code of a legitimate app, adding malware, and then putting the revised app up for sale with a different look and under a new name. That means mobile operating systems such as Apple Inc.'s iOS or Google's Android need rigorous security checks in place to ensure an app is safe before making it available to consumers.
"The app developers themselves usually can't do much about malware, because the malware is added later," Bishop says. One thing they can do, however, is try and keep the app as closed as possible so that hackers never have a chance to copy the app, he adds.
Closed here means keeping to a minimum the degree to which an app opens itself up to web sites or mobile networks. For example, adding Facebook integration to an app opens up a web channel to the social network. Retailers must weigh the value of Facebook integration against the need for security.
"App developers should be sure the app runs with the least amount of privileges it needs to get the job done," Bishop says. "If the app doesn't need to access the cellular network because it doesn't need to use the network, the app should not have the rights to access that network. Think of it like a need-to-know basis. With apps, if the app doesn't need access to a resource file or network, it should not have it."