Let’s be real: Nobody does that.
So late last year, Apple introduced a new requirement for all software developers that publish apps through its App Store.
Apps must now include so-called privacy labels, which list the types of data being collected in an easily scannable format. The labels resemble the nutrition labels on food packaging.
These labels, which began appearing in the App Store in December, are the latest attempt by tech designers to make data security and digital privacy, which are linked, easier for all of us to understand.
You might be familiar with earlier iterations, like the padlock symbol in a web browser. A locked padlock tells us that a website encrypts its connection, while an unlocked one warns that the site may be more susceptible to attack.
The question is whether Apple’s new labels will influence the choices people make. “After they read it or look at it, does it change how they use the app or stop them from downloading the app?” asked Stephanie Nguyen, a research scientist who has studied user experience design and data privacy.
To find out, we used the labels to compare similar apps. We learned plenty. The privacy labels showed that apps that appear identical in function can vastly differ in how they handle our information. We also found that lots of data gathering is happening when you least expect it, including inside products you pay for.
But while the labels were often illuminating, they sometimes created more confusion.
How to Read Apple’s Privacy Labels
To find the new labels, iPhone and iPad users with the latest operating system (iOS and iPadOS 14.3) can open the App Store and search for an app. Inside the app’s description, look for “App Privacy.” That’s where a box appears with the label.
Apple has divided the privacy label into three categories so we can get a full picture of the kinds of information that an app collects. They are:
- Data used to track you: This information is used to follow your activities across apps and websites. For example, your email address can help identify that you were also the person using another app where you entered the same email address.
- Data linked to you: This information is tied to your identity, such as your purchase history or contact information. Using this data, a music app can see that your account bought a certain song.
- Data not linked to you: This information is not directly tied to you or your account. A mapping app might collect data from motion sensors to provide turn-by-turn directions for everyone, for instance. It doesn’t save that information in your account.
Now let’s see what these labels revealed about specific apps.
WhatsApp vs. Signal
Both offer encrypted messaging, which scrambles your messages so that only the recipient can decipher them. Both also rely on your phone number to create an account and receive messages.
But their privacy labels reveal how different they are under the hood. The first label below is WhatsApp's; the second is Signal's.
The labels immediately made it clear that WhatsApp taps far more of our data than Signal does.
For group chats, the WhatsApp privacy label showed that the app has access to user content, which includes group chat names and group profile photos.
Signal, which does not do this, said it had designed a complex group chat system that encrypts the contents of a conversation, including the people participating in the chat and their avatars.
For people’s contacts, the WhatsApp privacy label showed that the app can get access to our contacts list; Signal does not.
With WhatsApp, you have the option to upload your address book to the company’s servers so it can help you find your friends and family who are also using the app.
But on Signal, the contacts list is stored on your phone, and the company cannot tap it.
“In some instances, it’s more difficult to not collect data,” Moxie Marlinspike, the founder of Signal, said. “We have gone to greater lengths to design and build technology that doesn’t have access.”
A WhatsApp spokeswoman referred to the company’s website explaining its privacy label. The website said WhatsApp could gain access to user content to prevent abuse and to bar people who might have violated laws.
Spotify vs. Apple Music
Finally, let's compare the privacy labels for two streaming music apps: Spotify and Apple Music. This experiment unfortunately took us down a rabbit hole of confusion.
Just look at the labels. First is the one for Spotify. Next is the one for Apple Music.
These look different from the other labels featured in this article because they are just previews; Spotify's label was so long that we could not display it in its entirety.
When we dug into the labels, both contained such confusing or misleading terminology that we could not immediately connect the dots on what our data was used for.
One piece of jargon stood out in Spotify's label: the app collects people's "coarse location" for advertising. What does that mean?
Spotify said this applied to people with free accounts who received ads. The app pulls device information to get approximate locations so it can play ads relevant to where those users are. But most people are unlikely to comprehend this from reading the label.
Apple Music’s privacy label suggested that it linked data to you for advertising purposes — even though the app doesn’t show or play ads.
Only on Apple’s website did we find out that Apple Music looks at what you listen to so it can provide information about upcoming releases and new artists who are relevant to your interests.
The privacy labels are especially confusing when it comes to Apple’s own apps. That’s because while some Apple apps appeared in the App Store with privacy labels, others did not.
Apple said only some of its apps, like FaceTime, Mail and Apple Maps, could be deleted and downloaded again in the App Store, so those can be found there with privacy labels.
But its Phone and Messages apps cannot be deleted from devices and so do not have privacy labels in the App Store. Instead, the privacy labels for those apps are in hard-to-find support documents.
The result is that the data practices of Apple's apps are less upfront. If Apple wants to lead the privacy conversation, it can set a better example by making its language clearer and its labeling program less self-serving.
Ms. Nguyen, the researcher, said a lot had to happen for the privacy labels to succeed. Beyond changing people's behavior, she said, companies have to be honest about describing their data collection. Most important, people have to be able to understand the information.