The rush of excitement was real. Nine other legal interns and I hastily signed the required release form before being whisked into the legal office of a Silicon Valley tech giant. Soon, some of the world’s top lawyers would share their hard-won wisdom with us. Who knew, we might even make career-defining connections that day.
Then the partner in charge asked us, “Who read the release they signed?”
Silence. Read the release? With the shining brilliance of our lifelong dreams beckoning from just beyond the door? Nobody read the release.
This is what’s known as the myth of informed consent. And it’s one example of how society’s relationship with law is due for an upgrade. It also shows why lawyers and technology leaders must work together to improve the clarity and readability of privacy policies.
Apple made a start, rolling out its “App Privacy Details” program recently. Apple designed its “privacy nutrition labels,” as they’re more widely known, to help users better understand an app’s data usage practices before they download it. Developers for each of the more than 1.8 million apps on the App Store must follow strict privacy guidelines and report how an app uses the data it collects.
Privacy labels provide easy-to-read summaries that show how an app collects and uses data and whether it tracks an individual. The language is plain and straightforward. Bold headings announce categories such as “Data Used to Track You” and “Data Linked to You,” followed by a succinct description of the heading’s meaning and a list of the data types collected such as “Contact Info,” “Purchases,” and “Location.”
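As an illustration only, the contents of such a label can be thought of as a small structured record. This sketch is invented for this article; it is not Apple's actual schema, which developers fill in through a questionnaire in App Store Connect.

```python
from dataclasses import dataclass, field

# Hypothetical model of a privacy label's contents -- the category
# names mirror the bold headings users see, but the structure here
# is an assumption made for illustration, not Apple's format.
@dataclass
class PrivacyLabel:
    app_name: str
    data_used_to_track_you: list[str] = field(default_factory=list)
    data_linked_to_you: list[str] = field(default_factory=list)
    data_not_linked_to_you: list[str] = field(default_factory=list)

    def summary(self) -> str:
        """Render the plain-language summary a user would scan."""
        def row(heading: str, items: list[str]) -> str:
            return f"  {heading}: " + (", ".join(items) or "None")
        return "\n".join([
            f"Privacy label for {self.app_name}:",
            row("Data Used to Track You", self.data_used_to_track_you),
            row("Data Linked to You", self.data_linked_to_you),
            row("Data Not Linked to You", self.data_not_linked_to_you),
        ])

label = PrivacyLabel(
    app_name="ExampleApp",
    data_used_to_track_you=["Location"],
    data_linked_to_you=["Contact Info", "Purchases"],
)
print(label.summary())
```

The point of the structure is the point of the label itself: a handful of fixed headings and short lists, readable at a glance.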
It’s a clear and direct presentation that improves comprehension by leaps and bounds. Users see exactly what they need to know to decide whether the services they get from the app are worth the trade-off in data privacy.
Apple once again heralds a new trend. This time, it stems from a larger effort to create policies that serve consumers. The goal is to make legal information more transparent and legal services more accessible to everyone. Apple has made a fantastic start, but we still have a long road ahead.
That’s why I’m calling for the defenders of user privacy — the market leaders who want to make law more accessible — to adopt the following three practices within the next five years.
Share user testing results and feedback to help everyone learn and improve.
As more privacy labels roll out to more users, we’ll start to discover aspects that work well and those we need to change. Important questions include the following.
Do developers understand how to accurately report the data their apps collect?
An app developer can be more like a construction site foreman than an architect. They know how to construct the building. But they don’t necessarily understand the intricacies of the entire building’s structural flow and design.
Similarly, developers may not have a complete picture of how data flows in and out of an app or how the app manages that data. Developers often build new apps on top of existing open source code without knowing whether that code tracks or collects data.
Will developers tell the complete truth?
Privacy labels give developers a great opportunity to boast about how little data an app collects. Could that prove too alluring for bad actors? Even well-intentioned developers may lack the motivation to track down all the information they need to report accurately.
Will consumers consider privacy labels valuable?
People may stop downloading and using apps that collect what they believe is too much or inappropriate data. Or, they may ignore the new privacy labels entirely, much like they do other policies they “agree” to when installing new software.
Answers to all these questions and more will help technology and legal teams fine-tune their approaches and develop useful standards for privacy labels, assuming, of course, that the information is shared. Making the user testing data and feedback available to everyone serves the greater good.
Develop an open-source visual framework and common vocabulary for discussing data choices.
As lawyers and technologists continue to develop more privacy labels and policies, they should adopt a common vocabulary and a consistent visual presentation to display privacy options to the public. A common vocabulary builds shared comprehension.
Standard visual icons (e.g., the recycle symbol) become universally recognizable at a glance. Adopt a system of standard icons to depict data usage practices and make them available under a liberal open source license such as Creative Commons for others to use freely.
Familiarity with language and imagery helps people recognize and understand what’s at stake. Increasing both end-users’ and developers’ comprehension brings about meaningful and intentional consent to data choices.
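A shared vocabulary is easiest to reuse when it lives in code that any tool can import. The sketch below pairs each standardized data category with a plain-language term and an icon identifier; the category names echo the label headings above, but the icon names and the vocabulary itself are hypothetical.

```python
from enum import Enum

# A hypothetical open source vocabulary: each standardized data
# category carries a plain-language term and a (made-up) icon id,
# much as the recycle symbol stands for recyclability everywhere.
class DataCategory(Enum):
    CONTACT_INFO = ("Contact Info", "icon-contact")
    PURCHASES = ("Purchases", "icon-purchases")
    LOCATION = ("Location", "icon-location")
    BROWSING_HISTORY = ("Browsing History", "icon-browsing")

    @property
    def label(self) -> str:
        return self.value[0]

    @property
    def icon(self) -> str:
        return self.value[1]

# Any label generator, comparison chart, or audit tool that imports
# this vocabulary shows the same term and icon for the same concept.
for category in DataCategory:
    print(f"{category.icon:16} {category.label}")
```

Publishing such a vocabulary under a liberal license means every privacy label, in every store and on every platform, can speak the same visual language.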
Create a publicly available app comparison tool.
To make truly informed choices, people need data collection practices benchmarked against peer apps, with the findings publicly available. Today, people typically choose apps based on price and features. Future buyers will also want to compare apps' data collection practices and easily access context-based recommendations.
A publicly available “recommendation index” that ranks apps, rewarding minimal and cautious use of personal data and penalizing reckless overcollection, would be an extremely useful resource. Using standardized language and icons in privacy labels would aid in developing automated comparison charts, especially for peer apps that achieve similar goals (e.g., charting a list of smart home apps to indicate which types of data each collects).
The tool should also explicitly describe the purpose for data collection. We’re so often left wondering why an app needs to collect the data it asks for. It makes sense for an exercise app to want to know your birth date. But why does Facebook want to collect data on where you are every time you log in and what device you're using? Would knowing the answer make a difference to you?
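One way such a recommendation index could work is to weight each category of collected data by how invasive it is and rank peer apps by the total. Everything in this sketch is invented for illustration: the weights, the app names, and their labels are assumptions, not real measurements.

```python
# Hypothetical weights: data used to track you counts most heavily,
# data not linked to you least. A real index would need agreed,
# published weights -- another argument for shared standards.
WEIGHTS = {"tracked": 3, "linked": 2, "not_linked": 1}

def privacy_score(label: dict) -> int:
    """Lower is better: fewer and less invasive data types collected."""
    return sum(WEIGHTS[kind] * len(label.get(kind, []))
               for kind in WEIGHTS)

# Invented labels for three peer smart home apps.
apps = {
    "HomeApp A": {"tracked": ["Location"],
                  "linked": ["Contact Info", "Purchases"]},
    "HomeApp B": {"linked": ["Contact Info"]},
    "HomeApp C": {"tracked": ["Location", "Browsing History"],
                  "linked": ["Contact Info"]},
}

# Rank from most to least privacy-friendly (lowest score first).
ranking = sorted(apps, key=lambda name: privacy_score(apps[name]))
print(ranking)  # HomeApp B collects the least, so it ranks first
```

Because the scoring runs entirely off standardized label data, the same comparison could be generated automatically for any set of peer apps.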
I have no doubt that we’ll be using an automated app comparison tool in the near future. Developing standard rating and communication systems and sharing information to help all companies serve the public better are important ways the business of law is evolving to become more consumer friendly. The increased use of analytics, virtualization, and automation in the legal industry drives much of our progress.
Finally, a growing number of people will more easily access the law that was always meant to serve them. And defenders of user privacy are leading the charge.