The past few months have been particularly hectic for lawmakers across the European Union (EU). With Ursula von der Leyen’s leadership of the European Commission set to conclude after the 2024 elections, lawmakers have felt the pressure to advance critical files and policies as quickly as possible.
Amid this legal shuffle, there are three especially important EU-based data regulations that are worth tracking, as they will influence data practices across a range of critical sectors both in the EU and around the world. These laws include the AI Act, the Data Act, and the Health Data Space Regulation. In this blog, we’ll review these emerging regulations, and share what each will mean for how your organization collects, processes, and uses data resources.
Data Security Law #1: The AI Act
The continuing global surge of artificial intelligence advancements has been met with equal parts user excitement and legal contemplation. Governments around the world are being forced to reckon with the legal ramifications of emerging AI technologies, seeking to balance innovation with necessary security and privacy standards.
In early December 2023, the European Parliament and the Council reached a political agreement on the Artificial Intelligence Act (AI Act). While the details of the act will be further discussed and refined in a series of technical meetings, its overall structure and ambition are already clear (see the European Commission’s FAQs here).
In essence, the act will apply to both public and private entities – inside or outside the EU – each time an AI system is placed on the Union market or is used in a way that affects people located in the EU.
Obligations are imposed upon both the providers and deployers of high-risk AI systems. Additional obligations are also imposed on providers of general-purpose AI models, including the large-scale generative models behind services such as OpenAI’s ChatGPT. Providers of free and open-source models are exempted from most of these obligations, except when their general-purpose AI models pose systemic risks – risks that could have a significant detrimental impact on the health, safety, and fundamental rights of individuals.
EU institutions present the AI Act as the world’s first comprehensive AI law. It follows AI regulations rolled out by China starting in 2021, as well as the executive order on AI issued by President Biden at the end of October 2023.
Once the technical details of the act are ironed out, the political agreement will be followed by formal adoption by the European Parliament and the Council, and the act will enter into force 20 days after its publication in the Official Journal.
Data Security Law #2: The Data Act
Although the Data Act has garnered less attention than the AI Act, it is still heralded as a game changer for the single market for data that many EU lawmakers envision. The Data Act is also further along in the legislative process than the AI Act: it was published in the Official Journal just before Christmas.
One core goal of the Data Act is to make sure that “connected products [i.e., IoT devices] are designed and manufactured … in such a manner that product data [i.e., usage data]… are always easily and securely accessible to a user, free of charge, in a comprehensive, structured, commonly used, and machine-readable format” (see Article 3(1)). This means that users of IoT (internet of things) devices are granted a right to access readily available usage data – be it personal or non-personal – and holders of this usage data are under an obligation to share it with users, and, under certain conditions, with third parties.
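To make the “structured, commonly used, and machine-readable format” requirement concrete, here is a minimal sketch for a hypothetical connected device that logs usage events. The device name, fields, and values are illustrative assumptions, and JSON is used only as one example of a commonly used, machine-readable format – the act does not mandate a specific one.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class UsageRecord:
    """One usage event from a hypothetical connected (IoT) device."""
    device_id: str
    timestamp: str   # ISO 8601, e.g. "2024-01-15T09:30:00Z"
    metric: str      # what was measured
    value: float

def export_usage_data(records: list[UsageRecord]) -> str:
    """Serialize usage data into a structured, commonly used,
    machine-readable format (JSON here), in the spirit of Article 3(1)."""
    return json.dumps([asdict(r) for r in records], indent=2)

# Hypothetical usage data from a smart thermostat.
records = [
    UsageRecord("thermostat-42", "2024-01-15T09:30:00Z", "temperature_c", 21.5),
    UsageRecord("thermostat-42", "2024-01-15T10:30:00Z", "temperature_c", 22.0),
]
print(export_usage_data(records))
```

Any equivalent structured format (e.g., CSV or XML) would satisfy the same idea; the key point is that the user can parse the data programmatically rather than receiving it as an opaque blob.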
This requirement will impact a range of different sectors and devices, including private, civil or commercial infrastructures, vehicles, health and lifestyle equipment, ships, aircraft, home equipment and consumer goods, medical and health devices, and agricultural and industrial machinery, all of which are listed in Recital 14 of the act.
In practical terms, data holders will need to classify usage data (distinguishing personal from non-personal data, and personal data relating to the user from that relating to other data subjects), identify trade secrets and agree, “prior to the disclosure…on necessary measures to preserve their confidentiality,” and build secure data exchange channels or environments.
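The classification step above might be sketched as follows. The categories mirror the distinctions the act draws, but the field names and per-field rules are hypothetical – in a real system the mapping would come from legal review, not from field names alone.

```python
from enum import Enum

class DataCategory(Enum):
    PERSONAL_USER = "personal data relating to the user"
    PERSONAL_OTHER = "personal data relating to other data subjects"
    NON_PERSONAL = "non-personal data"
    TRADE_SECRET = "trade secret (confidentiality measures required)"

# Hypothetical per-field rules; in practice these are set by legal review.
FIELD_RULES = {
    "owner_name": DataCategory.PERSONAL_USER,
    "passenger_names": DataCategory.PERSONAL_OTHER,
    "engine_temperature": DataCategory.NON_PERSONAL,
    "proprietary_calibration": DataCategory.TRADE_SECRET,
}

def classify_record(record: dict) -> dict:
    """Tag every field of a usage record before any disclosure;
    unknown fields default to non-personal here only for illustration."""
    return {field: FIELD_RULES.get(field, DataCategory.NON_PERSONAL)
            for field in record}

tags = classify_record({
    "owner_name": "A. Driver",
    "engine_temperature": 92.4,
    "proprietary_calibration": [0.1, 0.7],
})
```

Fields tagged as trade secrets would then be routed through the confidentiality measures agreed with the user, while personal data would additionally fall under GDPR obligations.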
Purpose-based prohibitions are also envisaged and are relevant for non-personal data: Article 4(13), for example, makes it clear that the holder of the usage data should not use non-personal data to derive insights about the user’s economic situation, assets, or production methods, nor should they use it to undermine the user’s commercial position in the markets in which the user is active.
Data Security Law #3: The Health Data Space Regulation
In its European Data Strategy of 2020, the European Commission envisioned building common European data spaces to ensure “that more data becomes available for use in the economy, society and research, while keeping the companies and individuals who generate the data in control.”
This led to the development of data spaces in strategic economic sectors and domains of public interest; in 2022, the Commission funded a Data Spaces Support Centre (DSSC), set up to “contribute to the creation of common data spaces, that collectively create a data sovereign…to enable data reuse within and across sectors…supporting the European economy and society.”
The first legislative proposal embodying such a vision is the European Health Data Space Regulation, released by the European Commission in May 2022 and amended by the European Parliament (EP) on December 13, 2023.
The Commission expects the European health data space to promote safe exchange of patient data for both primary and secondary uses, including research and innovation. The proposal thus aims to incentivize health data holders to make different categories of electronic health data available for secondary use and data sharing, provided that data security is achieved.
The categories of electronic health data that can be processed for secondary use are broadly defined, and include:
- Relevant data from the health system, including electronic health records, claims data, disease registries, and genomic data.
- Data with an impact on health, including consumption of different substances, socio-economic status, and behavior.
- Environmental factors, including pollution, radiation, and the use of certain chemical substances.
- Automatically generated data from medical devices.
- Person-generated data, such as data from wellness applications.
This proposal envisions setting up health data access bodies with the aim of granting permits to data users. Each data permit specifies both the electronic health data a data user may process and the secondary-use purposes for which that data may be processed.
The use of anonymization tools is strongly encouraged (Article 44). Data users are obligated not to re-identify the electronic health data provided to them in pseudonymized format. Furthermore, failing to respect the health data access body’s measures that aim to ensure pseudonymization will lead to penalties. All data users are required to implement robust data security measures and describe them within their applications (Articles 44 and 45).
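As an illustration of pseudonymization before secondary use, the sketch below replaces a direct identifier with a keyed hash (HMAC), so that records remain linkable across datasets without exposing the identifier itself. The field names, the sample identifier, and the key handling are simplified assumptions, not something the regulation prescribes.

```python
import hmac
import hashlib

# The key must stay with the data holder / health data access body and
# never reach data users -- otherwise re-identification becomes trivial.
PSEUDONYM_KEY = b"replace-with-a-secret-key-held-by-the-data-holder"

def pseudonymize(identifier: str) -> str:
    """Deterministic keyed hash: the same input always yields the same
    pseudonym (records stay linkable), but the original identifier
    cannot be recovered without the key."""
    return hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256).hexdigest()

# Hypothetical health record with a hypothetical patient identifier.
record = {"patient_id": "PAT-1234567", "diagnosis_code": "E11.9"}
safe_record = {**record, "patient_id": pseudonymize(record["patient_id"])}
```

A keyed hash is preferable here to a plain hash because, without the key, a data user cannot rebuild the pseudonyms from a list of known identifiers – which is exactly the re-identification behavior the proposal penalizes.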
Preparing for New EU Data Security Laws
To comply with these evolving EU regulations, organizations must be able to dynamically update and refine their data security and privacy policies, as well as the data access and sharing methods available to their data users. This calls for data security platforms whose policies can be dynamically adjusted and applied across an organization’s data users to maintain consistent, informed compliance.
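As a toy illustration of dynamically adjustable access policies, the sketch below evaluates access requests against a policy table held as data, so rules can be updated at runtime without redeploying code. The roles, purposes, and data categories are hypothetical and not drawn from any specific platform.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    role: str           # e.g. "researcher"
    purpose: str        # declared purpose of processing
    data_category: str  # e.g. "pseudonymized_health"

# Policies kept as data so they can be edited at runtime.
POLICIES = [
    {"role": "researcher", "purpose": "research",
     "data_category": "pseudonymized_health", "allow": True},
    {"role": "marketer", "purpose": "advertising",
     "data_category": "pseudonymized_health", "allow": False},
]

def is_allowed(req: AccessRequest) -> bool:
    """Deny by default; allow only when an explicit policy matches
    the request's role, purpose, and data category."""
    for p in POLICIES:
        if (p["role"] == req.role and p["purpose"] == req.purpose
                and p["data_category"] == req.data_category):
            return p["allow"]
    return False
```

The deny-by-default stance matters: a request whose purpose falls outside every recorded policy (for example, a purpose prohibited under the Data Act or absent from a health data permit) is rejected automatically.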
To learn more about kicking off your comprehensive data protection journey, check out our Data Classification 101 white paper.