Today, Apple’s Tim Cook critiqued the data-industrial complex, and called for GDPR-like regulations to be tempered with serious thought about how to design those regulations in a way that prevents data monopolies.
As the privacy frenzy grows, regulators will be driven to 'do something' and may be tempted to write generic rules rather than specific prescriptions, because general rules are easier to draft than explicit recommendations. This approach, however, risks producing regulations so broad that only the most well-resourced companies can comply with them and, as research from the London School of Economics suggests, those same companies can then lobby for self-serving changes, potentially producing the very 'data monopolies' that Cook critiques.
Instead, we recommend that more specific regulations:
- Be tiered according to the resources of the firm, following the approach of the Basel II/Capital Requirements Directive.
- Incentivize the adoption of industry best practices for data protection by design, including privacy-enhancing technologies that make it cheaper for organizations to adopt differential privacy (which Apple itself uses) and homomorphic encryption. These tools help organizations better balance the tradeoff between the usability of data and consumer privacy and security.
- Encourage the sharing and pooling of data to foster competition against data monopolies.
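To make the differential-privacy recommendation above concrete, here is a minimal sketch of the Laplace mechanism, the textbook building block behind deployments such as Apple's. The function name, dataset, and parameters below are illustrative assumptions, not a production implementation:

```python
import math
import random

def dp_count(values, predicate, epsilon=1.0):
    """Return a differentially private count of items matching `predicate`.

    A counting query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so adding Laplace noise with
    scale 1/epsilon yields epsilon-differential privacy.
    """
    true_count = sum(1 for v in values if predicate(v))
    scale = 1.0 / epsilon
    # Inverse-CDF sampling from a zero-mean Laplace(scale) distribution.
    u = random.random() - 0.5
    while u == -0.5:  # avoid log(0) in the rare case random() == 0.0
        u = random.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

# Example: a noisy answer to "how many users are over 40?" on synthetic data.
ages = [23, 45, 67, 31, 52, 38, 61, 29]
noisy = dp_count(ages, lambda age: age > 40, epsilon=0.5)
```

A smaller epsilon means more noise and stronger privacy; the noisy count can be published without revealing whether any single individual is in the dataset.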
Aside from specificity, regulators need to focus their efforts on how data is used, not just on how it is collected. We can't rely on upfront consent: in the age of the Internet of Things and machine learning, there is simply too much data and too many unknowable downstream consequences of its collection. As a result, we must also give data subjects more control. Like GDPR, U.S. regulations should give data subjects opportunities to alter their data and its processing after they consent to collection. Furthermore, focusing on how data is used can help prevent data from being weaponized. Concepts like purpose-based restrictions enforced at the algorithm level and differential privacy need serious consideration in any future regulation.
Tim’s comments couldn’t be timelier.
Just yesterday, Harvard Business Review published a piece written by Andrew Burt, Immuta’s Chief Privacy Officer, about “Why Privacy Regulations Don’t Always Do What They’re Meant To.”
We’ll be sure to update you on the evolution of this topic.