OWASP Data storage & privacy standards

Use MASVS to secure your data as an Android developer


The protection of data is core to secure app development, as your users expect that any data they capture or provide is kept safe. On mobile platforms, there are risks to data privacy from improper coding (such as storing data in a location that is accessible to other applications) or via ‘leakage’ to malicious actors and applications, for example insecure IPC that allows malicious applications to request files that should be secret.

In this article, we’ll talk through the data storage requirements from the Mobile Application Security Verification Standard, and demonstrate how we can ensure applications store data securely.


Find out more about MASVS here

The Mobile Application Security Verification Standard (MASVS) lays out the key controls required for secure data storage.

L1 (all applications) requirements

  • System credential storage facilities need to be used to store sensitive data, such as PII, user credentials or cryptographic keys

On Android, this means the Keystore. Storing keys here means that your encryption keys are held in a hardware-backed security module (where available) on the device; when the Keystore is used correctly, key material never enters the application’s memory, protecting against all but the most extreme types of attack.
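As an illustration, here is a minimal sketch of generating an AES key inside the Android Keystore (the key alias is illustrative); the app only ever holds a handle to the key, never the raw material:

```kotlin
import android.security.keystore.KeyGenParameterSpec
import android.security.keystore.KeyProperties
import javax.crypto.KeyGenerator
import javax.crypto.SecretKey

// Generate an AES/GCM key whose material lives only inside the
// Android Keystore (hardware-backed where the device supports it).
fun generateKeystoreKey(alias: String): SecretKey {
    val keyGenerator = KeyGenerator.getInstance(
        KeyProperties.KEY_ALGORITHM_AES, "AndroidKeyStore"
    )
    keyGenerator.init(
        KeyGenParameterSpec.Builder(
            alias,
            KeyProperties.PURPOSE_ENCRYPT or KeyProperties.PURPOSE_DECRYPT
        )
            .setBlockModes(KeyProperties.BLOCK_MODE_GCM)
            .setEncryptionPaddings(KeyProperties.ENCRYPTION_PADDING_NONE)
            .build()
    )
    return keyGenerator.generateKey()
}
```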

While using the API directly is possible, a far easier way of ensuring your data is properly secured on the device is to use the Android Security library.

Android Developers: “The Security library provides an implementation of the security best practices related to reading and writing data…” (developer.android.com)

Using this library, we can use EncryptedSharedPreferences to store tokens and other pieces of information, and use EncryptedFile to store files — both using keystore to handle the encryption keys.
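As a sketch, storing a token with EncryptedSharedPreferences from that library might look like the following (a `Context` is assumed, and the preference file name and key are illustrative):

```kotlin
import androidx.security.crypto.EncryptedSharedPreferences
import androidx.security.crypto.MasterKey

// The master key is created in, and protected by, the Android Keystore
val masterKey = MasterKey.Builder(context)
    .setKeyScheme(MasterKey.KeyScheme.AES256_GCM)
    .build()

val prefs = EncryptedSharedPreferences.create(
    context,
    "secret_prefs",
    masterKey,
    EncryptedSharedPreferences.PrefKeyEncryptionScheme.AES256_SIV,
    EncryptedSharedPreferences.PrefValueEncryptionScheme.AES256_GCM
)

// Values are transparently encrypted before being written to disk
prefs.edit().putString("auth_token", token).apply()
```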

  • No sensitive data should be stored outside of the app container or system credential storage facilities.

While this is more a case of not doing something, it can be more difficult than you think. We’ll cover this more in a later article, but files should generally always be stored inside the application sandbox. You can make sure of this using Safe to Run, for example:

val isFileSafeToOpen = file.verifyFile(this) {
    // verification configuration goes here
}
  • No sensitive data is written to application logs.

This should be fairly easy to solve by stripping all application logging at build time with ProGuard:

release {
    minifyEnabled true
    proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
}

-assumenosideeffects class android.util.Log {
    public static *** d(...);
    public static *** v(...);
    public static *** i(...);
    public static *** w(...);
    public static *** e(...);
}

In addition to this, some applications log data and stream it remotely for debugging and monitoring. This is challenging to guard against, as identifying ‘sensitive data’ is difficult to automate and therefore requires careful review of code.

  • No sensitive data is shared with third parties unless it is a necessary part of the architecture.

For this control in particular, it’s important to be aware of notifications: any data in notifications could be read by third-party applications that have the read-notifications permission.

In addition, it is important to be aware of which third-party libraries you are importing into your application and whether or not they are collecting information. This is a difficult thing to track, but bear in mind that any library you add to the application will execute inside the application’s sandbox, with full access to any files and data you generate.

  • The keyboard cache is disabled on text inputs that process sensitive data

When users type in input fields, the software automatically suggests data. This feature can be very useful for messaging apps. However, the keyboard cache may disclose sensitive information when the user selects an input field that takes this type of information.

The most straightforward way of avoiding this in XML on Android is to disable suggestions on the input field:

android:inputType="textNoSuggestions"
Or in jetpack compose:

TextField(
    value = text,
    keyboardOptions = KeyboardOptions(
        keyboardType = KeyboardType.Email,
        autoCorrect = false
    ),
    onValueChange = { text = it }
)
  • No sensitive data is exposed via IPC mechanisms.

This part comes in two strands. First of all, ensure that any intents or URIs you accept don’t inadvertently lead to some kind of execution or file leakage. For instance, check this vulnerability in the Brave Android application:

Brave Software disclosed on HackerOne: “Cookie steal through content…” (hackerone.com). Write-up available here: https://infosecwriteups.com/brave-stealing-your-cookies-remotely-1e09d1184675

Secondly, ensure that content providers and file providers are sufficiently secured; in particular, ensure that you are only exposing the files you intend to through file providers. For example, check this common vulnerability:

<paths xmlns:android="http://schemas.android.com/apk/res/android">
    <files-path name="root" path="/"/>
</paths>

Here, we have exposed all files in our private directory, including databases and shared preferences.
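A safer configuration exposes only a dedicated subdirectory rather than the sandbox root (the directory and path names here are illustrative):

```xml
<paths xmlns:android="http://schemas.android.com/apk/res/android">
    <!-- Only files placed under files/shared/ can be shared out -->
    <files-path name="shared" path="shared/"/>
</paths>
```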

In both cases, the key is to verify intents, wherever they come from. A tool like Safe to Run can make this a relatively easy implementation which is secure by default:

intent.verify {
    actionOnSuccess = {
        // Do something
    }
    actionOnFailure = {
        // Report failure
    }
}
  • No sensitive data, such as passwords or pins, is exposed through the user interface.

Similar to disabling the keyboard cache, ensure that sensitive data is not visible on the screen, for example by masking password fields:

android:inputType="textPassword"
  • The app educates the user about the types of personally identifiable information processed, as well as security best practices the user should follow in using the app.

This is a huge opportunity for improvement in almost every application: if you are capturing a user’s personal data, ensure that you are regularly educating users about how they can use your application securely to protect that data. Users are often the weakest link in the security chain, and educating them about best practices can go a long way towards protecting them.

L2 (Sensitive app) requirements

  • No sensitive data is included in backups generated by the mobile operating system.

Android offers a number of ways to back up application data: backup over USB (using adb backup), APIs that developers can use, and Google’s “Back up my data” feature.

Be aware that when backing up application data in this way, you lose a lot of control over it. One simple step is to turn off backups by default, which can be done in the Android manifest:

android:allowBackup="false"
If you’re going to allow backups, ensure you understand what data is backed up automatically and that this is appropriate for your application:

Back up user data with Auto Backup | Android Developers: “Auto Backup for Apps automatically backs up a user’s data from apps that target and run on Android 6.0 (API level 23)…” (developer.android.com)

  • The app removes sensitive data from views when moved to the background.

When an app moves into the background, a screenshot of the application is taken and displayed in the recent apps view. This is a really simple thing to disable:

window.setFlags(
    WindowManager.LayoutParams.FLAG_SECURE,
    WindowManager.LayoutParams.FLAG_SECURE
)
  • The app does not hold sensitive data in memory longer than necessary, and memory is cleared explicitly after use.

This can be an incredibly time-consuming process. In unmanaged code (e.g. C++), the key is to zero all memory before freeing it, and to do this as often as possible.

For managed code (e.g. Kotlin and Java) it is more difficult, as garbage collection is done for you. Some tips to avoid sensitive data being held in memory: ensure that any ‘singleton’ objects (even singletons defined in dependency injection) do not hold sensitive data; load data into a class, use it, and then make sure that class is no longer referenced and is free to be collected by the garbage collector.
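One pattern worth sketching here (the helper name is illustrative, not a library function): keep secrets in a CharArray rather than a String, since a String is immutable and may linger on the heap until collected, whereas an array can be zeroed explicitly after use.

```kotlin
// Run a block with a secret, then zero the backing array so the
// plaintext does not sit in memory waiting for garbage collection.
fun <T> useSensitive(secret: CharArray, block: (CharArray) -> T): T {
    try {
        return block(secret)
    } finally {
        secret.fill('\u0000') // explicit clear after use
    }
}

fun main() {
    val password = charArrayOf('h', 'u', 'n', 't', 'e', 'r', '2')
    val length = useSensitive(password) { it.size }
    println(length)                          // 7
    println(password.all { it == '\u0000' }) // true: array was wiped
}
```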

  • The app enforces a minimum device-access-security policy, such as requiring the user to set a device passcode.

Checks like this are a key part of the Safe to run — resilience library.

private inline fun canIRun(actionOnFailure: () -> Unit) {
    if (safeToRun(
            // checks configured here
        ).not()
    ) {
        actionOnFailure()
    }
}
  • No sensitive data should be stored locally on the mobile device. Instead, data should be retrieved from a remote endpoint when needed and only be kept in memory.

Naturally, this is simply a case of not storing data. In most cases, however, this is unacceptable for user experience, and a pragmatic decision should be made on whether or not to store data on the device.

  • If sensitive data is still required to be stored locally, it should be encrypted using a key derived from hardware backed storage which requires authentication.

In addition to the Android Security library and the use of EncryptedSharedPreferences and EncryptedFile mentioned already, another mechanism for encrypting your data is SQLCipher: https://www.zetetic.net/sqlcipher/

It has hooks for many popular SQL libraries, including SQLDelight and Room.
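With Room, for example, the integration is essentially a factory swap. A sketch, assuming the android-database-sqlcipher artifact and a hypothetical `AppDatabase` class (a real passphrase should be derived from user authentication, not a literal):

```kotlin
import androidx.room.Room
import net.sqlcipher.database.SQLiteDatabase
import net.sqlcipher.database.SupportFactory

// Convert the passphrase and hand Room a SQLCipher-backed open helper
val passphrase = SQLiteDatabase.getBytes("user-passphrase".toCharArray())
val factory = SupportFactory(passphrase)

val db = Room.databaseBuilder(context, AppDatabase::class.java, "app.db")
    .openHelperFactory(factory) // all reads and writes are now encrypted
    .build()
```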

  • The app’s local storage should be wiped after an excessive number of failed authentication attempts.

Whether implementing any local authentication (e.g. PIN screens) or remotely authenticating — it is important to protect against an automated attack on the device, first with exponential back-offs and then by wiping data.
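A minimal sketch of that policy in plain Kotlin (the class and parameter names are illustrative, not from any library): each failure doubles the enforced delay, and once a threshold is reached the wipe callback fires.

```kotlin
// Track failed authentication attempts: exponential back-off first,
// then wipe local storage after too many failures.
class FailedAttemptPolicy(
    private val baseDelayMs: Long = 1_000,
    private val wipeThreshold: Int = 10,
    private val onWipe: () -> Unit
) {
    private var failures = 0

    // Returns the delay (ms) to enforce before the next attempt;
    // wipes storage and resets once the threshold is hit.
    fun recordFailure(): Long {
        failures++
        if (failures >= wipeThreshold) {
            onWipe() // e.g. clear encrypted prefs and files
            failures = 0
            return 0
        }
        // 1s, 2s, 4s, ... capped at 5 minutes
        return (baseDelayMs shl (failures - 1)).coerceAtMost(300_000)
    }

    fun recordSuccess() {
        failures = 0
    }
}

fun main() {
    var wiped = false
    val policy = FailedAttemptPolicy(wipeThreshold = 3, onWipe = { wiped = true })
    println(policy.recordFailure()) // 1000
    println(policy.recordFailure()) // 2000
    println(policy.recordFailure()) // 0 (storage wiped)
    println(wiped)                  // true
}
```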


Data is a huge part of app development, and one of the things that users are most concerned with remaining secure — MASVS provides a framework to follow which is extensive and, if followed, provides a great base-level of security to your application.

Thanks for reading! For more information or to connect:

Join us on Slack
