App Privacy Concerns and ASO

July 19, 2019


Over the past few days, social media quickly filled with posts from an app called FaceApp, which uses AI to edit photos of users' faces. Just as quickly, users began raising privacy concerns about the app, such as cloud photo storage and unauthorized access to the user's photo library. Whether or not these fears are founded, the situation is a solid example of why apps should be upfront about privacy and security.

FaceApp

FaceApp is not a new app, but recent updates – wherein users can edit their faces to look older – have brought it back into popularity. As an app that can access user photos, it has drawn concern about privacy and security. The Verge notes that there's no evidence the app downloads a user's entire camera roll, although it does upload single images to apply filters on the app's server before the device downloads the edited photo again.

According to TechCrunch, there are also concerns about the app accessing the photo library without permission. However, an Apple API introduced in iOS 11 lets users select a single photo from a system dialog for the app to work on without granting it full access to the photo library; the app can only access the selected photo unless it's given permission for more.
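For developers, the practical upshot is that a photo-editing app can receive a single user-chosen image without ever requesting library-wide access. A minimal Swift sketch of that iOS 11-era pattern (the class and method names here are hypothetical, for illustration only):

```swift
import UIKit

// Sketch of the out-of-process photo picker available since iOS 11.
// UIImagePickerController's library UI runs in a separate system process,
// so the app receives only the photo the user selects and never needs
// full Photos-library permission.
class FilterViewController: UIViewController,
    UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    func presentPhotoPicker() {
        let picker = UIImagePickerController()
        picker.sourceType = .photoLibrary  // system-hosted picker UI
        picker.delegate = self
        present(picker, animated: true)
    }

    func imagePickerController(_ picker: UIImagePickerController,
        didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        // Only the single chosen image is exposed to the app.
        if let image = info[.originalImage] as? UIImage {
            applyFilter(to: image)  // hypothetical app-specific step
        }
        picker.dismiss(animated: true)
    }

    func applyFilter(to image: UIImage) {
        // Filtering or upload logic would go here.
    }
}
```

Because the picker is system-hosted, no Photos permission prompt is shown for this flow; a prompt only appears if the app asks for broader library access.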

Users can request to remove photos from the server after they’re done editing, although it requires sending a specific request through the support menu.

While FaceApp states that it doesn’t sell user data to third parties, the privacy policy still uses “broad language that allows it to use people’s usernames, names, and likeness for commercial purposes,” which is not GDPR-compliant.

FaceApp released a statement responding to privacy concerns, but the U.S. Senate is already calling for investigations into the app.

App Store Security Policies

Apple takes security very seriously, and the App Store Review Guidelines frequently touch on what data apps can and cannot use, and for what purposes. According to the Privacy guidelines:

“Apps should only request access to data relevant to the core functionality of the app and should only collect and use data that is required to accomplish the relevant task. Where possible, use the out-of-process picker or a share sheet rather than requesting full access to protected resources like Photos or Contacts.”

Additionally, the guidelines state:

“Unless otherwise permitted by law, you may not use, transmit, or share someone’s personal data without first obtaining their permission. You must provide access to information about how and where the data will be used.”

Google Play has similar policies. The Play Store’s privacy guidelines say that apps handling sensitive user data must:

"- Limit your collection and use of this data to purposes directly related to providing and improving the features of the app (e.g. user anticipated functionality that is documented and promoted in the app's description).

- Post a privacy policy in both the designated field in the Play Console and within the app itself. The privacy policy must, together with any in-app disclosures, comprehensively disclose how your app collects, uses, and shares user data. Your privacy policy must disclose the type of parties to which any personal or sensitive user data is shared.

- Handle all personal or sensitive user data securely, including transmitting it using modern cryptography (for example, over HTTPS)."

If an app violates either store's policies, it will be removed. Given the concerns raised around FaceApp, the app will likely be placed under intense scrutiny to ensure it is in full compliance with App Store and Play Store guidelines.

Privacy and ASO

The case around FaceApp serves as a stark reminder of the importance of apps being upfront about privacy and security. The app stores require developers to disclose their privacy policies in their apps, although that can be as simple as including a link to the privacy policy in the description.

For App Store Optimization purposes, being upfront about security is essential for maintaining a good reputation. Users worry about how their data is being used, and the more popular an app becomes, the more concerns will be raised. One of the last things a developer wants is a flood of user complaints claiming the app stored and sold their personal data. Being transparent about what data is collected and how it's collected helps here: making the process clear shows users that the developer is keeping them informed and has their privacy in mind.

In this case, being upfront about security may have helped prevent the app from receiving national attention. Any app that wants to edit photos must first request permission; explaining to users why this is necessary and how it works can put their minds at ease.

Not being upfront about data usage can also lead to an app’s rejection from the store. In cases like this, where an app is drawing enough attention to warrant government investigation, it’s especially essential to remain compliant.

If an app is not compliant with a store's policies, it will likely be removed. Should that happen, all the keyword rankings and positioning the app has accumulated will be lost. Even if it's restored to the store, it will have to begin indexing and ranking for keywords from scratch, making it harder to find while its competitors continue to grow.

Strong app security and visible privacy policies are not just recommended for an app's reputation; they're essential. While it remains to be seen what will happen with FaceApp, compliance with App Store and Play Store policies – as well as GDPR requirements – is a must. Informing users how their information is gathered and used builds trust and can prevent a surge of concerns.

Want more information regarding App Store Optimization? Contact Gummicube and we’ll help get your strategy started.
