PETs and Data Sharing in the Financial Sector: Lessons Learned from the 2019 FCA Tech Sprint

I had the pleasure of being invited to participate in the 7th annual Tech Sprint, organised by one of the most innovative regulators in the world: the U.K. Financial Conduct Authority (FCA). Immuta last participated in the 2017 FCA Tech Sprint, and produced a whitepaper you can read here.

This year’s event (29 July – 2 August) gathered 10 teams comprising a variety of financial institutions, technology providers, lawyers, and anti-money laundering specialists. The teams were supported by “floaters” (experts in their respective areas) and by regulators from the U.K. Information Commissioner’s Office, the U.K. National Crime Agency, the U.S. Department of the Treasury, and the U.S. Federal Reserve, who observed and judged the competition and made sure the most complex questions were answered. As a representative of Immuta’s Legal Engineering team, I was invited to give a presentation on Privacy by Design, data analytics, and compliant data sharing, and to serve as a “floating” expert throughout the competition.

The objective of this year’s Tech Sprint was to explore the potential of Privacy Enhancing Technologies (PETs) for the sharing of information among financial institutions, regulators, and law enforcement. PETs have received a lot of attention lately. For example, the Royal Society’s recent report on PETs provides a useful overview of the trends in this area, such as trusted execution environments, homomorphic encryption, secure multi-party computation, and differential privacy.

Could data sharing be made easier if PETs were implemented by financial institutions, regulators, and/or law enforcement?

This was the main question the groups were asked to tackle. To find the answer, participants were required to 1) consider the difficulties of data sharing within finance, and 2) identify a comprehensive list of controls for compliance with privacy and data protection law, as well as money laundering regulations and criminal finance legislation.

It was a fascinating week. Here are my top three takeaways:

PETs Are Not a Silver Bullet: Privacy Enhancing Technologies must be part of the controls implemented by all stakeholders if data is to be shared within finance. However, it is easy to fall into the trap (what I call the “PET illusion”) of thinking that including a PET, such as secure multi-party computation or homomorphic encryption, will immediately satisfy complex privacy requirements. As I explained in my talk, PETs can help meet security requirements (in particular confidentiality), support data minimisation, and drastically reduce the amount of information that needs to be shared. What they cannot do on their own is satisfy the full set of data protection requirements under regulations such as the GDPR: lawfulness, transparency, and fairness (which should lead to an assessment of the impact of the decision pipeline), purpose specification (i.e., specifying a legitimate purpose before being able to preserve it), accuracy, and accountability.
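To make the distinction concrete, here is a minimal sketch (not drawn from the Tech Sprint itself) of one PET mentioned above, differential privacy: adding calibrated Laplace noise to a count query. The data and parameter values are illustrative. The technique hides any individual’s contribution to the result (data minimisation), but notice that nothing in the code addresses lawfulness, purpose specification, or accountability — those requirements live outside the PET.

```python
import math
import random


def laplace_noise(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))


def dp_count(records, predicate, epsilon: float = 0.5) -> float:
    """Differentially private count: the true count plus Laplace noise.

    A counting query has sensitivity 1 (one person changes the count
    by at most 1), so the noise scale is 1/epsilon.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)


# Hypothetical transactions: the noisy count protects any single customer,
# but says nothing about whether the query had a legitimate purpose.
transactions = [{"amount": a} for a in (50, 12000, 300, 9800, 15000)]
noisy = dp_count(transactions, lambda t: t["amount"] > 10000, epsilon=0.5)
```

A smaller epsilon means more noise and stronger protection; choosing it — and justifying the query at all — remains a governance decision, not a cryptographic one.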

This is why a Data Protection by Design (DPbD) workflow should 1) be integrated within an impact assessment, and 2) include at least eight different requirements (as illustrated in this white paper). Data minimisation and security are only two of these requirements.

Addressing concerns of model inversion, for instance, requires strong access controls no matter which PET is applied. As a result, controlling the data environment is still, and will continue to be, mandatory – whether or not a PET is implemented.

PETs Must Be Integrated Within a DPbD Workflow: When the limits of PETs started to become obvious to the teams, things got very interesting. Some of the proposed solutions, depending upon their actual implementation, had the potential to lead to less intrusive data sharing and to improve the relevance of the data to be shared.

For example: Section 339ZB of the U.K. Proceeds of Crime Act 2002 introduced a voluntary data sharing mechanism between regulated institutions, which may be used by banks before filing a suspicious activity report with the National Crime Agency. Despite its existence, the provision isn’t yet well understood – banks remain reluctant to share data across organisations.

The way section 339ZB is worded, it appears to be very much a binary process, triggered by some degree of suspicion: a bank may disclose information to another and request further information if it believes that the bank receiving the request has suspicious data in its possession. If the bank receiving the request believes that data will assist in detecting suspicious activity, it can disclose it.

Yet, in order to increase the level of trust between actors, financial institutions need more than a binary approach – they should be able to implement a gradual process of selective disclosure that does not initially reveal the customer data collected by the bank receiving the request. Even if the input data (i.e., the customer data collected by the bank receiving the request) is not revealed to the bank sending the request, the output data (i.e., the result of the query) will still be personal data in the hands of the requesting bank seeking to identify suspicious activity. In other words, the power of PETs can only be fully applied once they’re nested within a Data Protection by Design workflow.
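One way to read “gradual selective disclosure” is as a staged protocol: the requesting bank first learns only whether an identifier matches at the other institution, and underlying records move only after a confirmed match and a documented legal basis. The sketch below is a simplified illustration, using keyed hashes as a stand-in for a real PET such as private set intersection; all names, identifiers, and data are hypothetical.

```python
import hashlib
import hmac

# Illustrative per-request secret both banks agree on, so raw customer
# identifiers are never exchanged directly.
SHARED_SALT = b"session-specific-secret"


def blind(customer_id: str) -> str:
    """Blind an identifier with a keyed hash before it leaves the bank."""
    return hmac.new(SHARED_SALT, customer_id.encode(), hashlib.sha256).hexdigest()


def stage_one_match(requesting_ids, receiving_ids) -> set:
    """Stage 1: compare blinded identifiers; reveals only the overlap,
    not any underlying records."""
    blinded_receiving = {blind(cid) for cid in receiving_ids}
    return {cid for cid in requesting_ids if blind(cid) in blinded_receiving}


def stage_two_disclose(matches, records, legal_basis_confirmed: bool) -> dict:
    """Stage 2: records move only after a match AND a documented legal basis."""
    if not legal_basis_confirmed:
        return {}
    return {cid: records[cid] for cid in matches if cid in records}


# Hypothetical data held by each institution.
bank_a_suspects = ["CUST-001", "CUST-042"]
bank_b_customers = {"CUST-042": {"alerts": 3}, "CUST-777": {"alerts": 0}}

matches = stage_one_match(bank_a_suspects, bank_b_customers.keys())
disclosed = stage_two_disclose(matches, bank_b_customers, legal_basis_confirmed=True)
```

Note that even here the output of stage two is personal data in the requesting bank’s hands, which is exactly why the staging must sit inside a Data Protection by Design workflow rather than replace it.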

Legal and Technical Expertise Must Be Combined: Providers of technology solutions must be able to engage with the nuances of legal requirements, and vice versa. Throughout the week there were several instances in which technical personnel looked to the legal experts for clear guidance, but the legal experts didn’t understand the technical requirements. As I just wrote about for Dark Reading, and as recently covered by Forbes and Fast Company, this requires Legal Engineering skills — a blend of technical and legal expertise — that traditional lawyers do not always have and technology providers do not always expect, and therefore don’t necessarily know how to ask for. In fact, legal assessment is usually treated as an add-on, bolted on after the technical idea has already been conceived or developed. This relatively rigid division of tasks explains why several of the teams at Tech Sprint had to go back to the drawing board after the first official feedback session.


If you’re interested in learning more about the FCA Tech Sprint, visit:

To learn more about how technical, financial, legal, and compliance roles can closely collaborate to produce effective compliance strategies within your organization, download these two useful white papers:

Privacy by Design: The Key to Unlock Your Compliance Strategy
Immuta & Data Protection by Design: Making the GDPR Work for Your Data Analytics Environments