What Does 2023 Hold for Privacy in the EU?


The year 2023 will bring many changes to privacy in the EU, with several regulations either entering fully into force or gradually preparing businesses for changes in the years ahead. Europe’s main focus this year is on technology and artificial intelligence, the broader digital environment, including communication services and platforms, and healthcare. Below, we sum up the key developments that will shape the privacy landscape in 2023:

AI Act (Artificial Intelligence Act)

The Artificial Intelligence Act (AI Act) was first introduced in April 2021 by the European Commission. Given the complexity of the subject, the vote planned for the first quarter of 2023 may be delayed to ensure all issues are covered before the Act fully enters into force.

The AI Act concerns organizations that operate within the EU and use AI technologies in their business activities. In this way, Europe ensures that AI systems on the European market respect the Union’s values and privacy laws. Moreover, the rules on the development, use, and placing on the market of AI products and services will be harmonized throughout the EU. The Act does not only affect companies in the technology sector: organizations in other sectors (healthcare, employment, or finance, to name a few) that use AI in their business activities are also covered.

The AI Act will be the first act to comprehensively regulate artificial intelligence, and its reach extends beyond the EU’s borders: it applies not only to businesses based inside the EU but to any organization with customers or users in the European Member States. The ambition is to turn the Act into a global standard for AI, not just an EU privacy law. The Act will not replace the GDPR but will overlap with it, focusing on specific issues the GDPR does not currently cover.

The Act divides AI applications into four risk categories: unacceptable risk, high risk, limited risk, and minimal or no risk. The European Commission defines these categories as follows:

  • “Unacceptable risk”: AI applications that pose a clear danger to the safety of individuals and will therefore be banned.
  • “High risk”: AI systems used in sensitive social sectors, for example transport, education, or employment, that can be harmful to individuals. The use of remote biometric identification in publicly accessible places also falls into this category.
  • “Limited risk”: AI applications subject to specific transparency obligations. Users must be made aware that they are interacting with a technology product and must have the choice to use it or not (e.g. chatbots).
  • “Minimal or no risk”: technologies and applications that pose no risk of harming users, such as video games.

But what exactly does this mean for businesses using AI?

Businesses are required to evaluate the impact of their operations, keep records of their processing activities, and be transparent about the processing of their customers’ and users’ personal and sensitive data. Specific to the technology sector, AI businesses must not develop systems or practices that manipulate users’ behavior or result in mental or physical harm. Nor may they develop practices that harm or exploit vulnerable groups, for example children or individuals with mental or physical disabilities. As mentioned above, biometric data are also within the scope of the AI Act, which prohibits systems that process real-time biometric data in public spaces.


Data Governance Act (DGA)

The Data Governance Act (DGA) was first introduced by the Council in the last quarter of 2020. After lengthy negotiations, the Act entered into force on June 23, 2022. However, there is a grace period of 15 months from entry into force, so the Act only becomes fully applicable on September 24, 2023.

The initiative aims to make more data available throughout Europe and open the way for data sharing across several sectors (for example, mobility, healthcare, environment, or public administration data) and different EU member states.

The regulation focuses on four main points that will facilitate the development of data-sharing systems:

  1. Specific conditions are set for the re-use of certain categories of data held by public sector organizations within the EU (for example, healthcare data can be re-used for research purposes).
  2. The regulation will act as a framework to ensure that data intermediaries are trustworthy regarding the provision of data-sharing services.
  3. It will be easier for businesses and citizens to make their data available for the benefit of society.
  4. Data sharing is facilitated to make sure data can be transferred and used for the right purposes throughout the EU.


In addition to existing regulations (cf. the GDPR) that aim to protect people’s fundamental rights, the DGA’s primary goal is to foster the emergence of a European data-driven economy. The Act will benefit the EU by driving data-driven innovation, creating new job opportunities, enabling more effective responses to environmental or healthcare crises such as COVID-19, and lowering the cost of processes and services.

ePrivacy Regulation

The ePrivacy Regulation was initially planned to come into force together with the GDPR back in 2018, but the process has been stuck in negotiations for almost five years. The first draft of the regulation was completed in the first quarter of 2022, and 2023 is expected to bring the regulation into force, with a transition period of at least 24 months. This means that organizations do not have to demonstrate compliance until 2025 but are encouraged to prepare in advance.

The ePrivacy Regulation (ePR) will replace the ePrivacy Directive of 2002. The Regulation aims to create privacy rules for communications services, and thus for organizations, that are not covered at all, or not sufficiently, by previous privacy laws. Examples include services such as Twitter, Instagram, TikTok, Facebook, WhatsApp, and Messenger.

The core focus of the ePrivacy Regulation is to apply stricter privacy rules to communications organizations. In particular, stricter rules apply to the collection and processing of metadata, the category of data that describes other data: users must give consent before communications organizations can process the metadata generated by their use of the platforms. Rules on cookies will also become clearer. Users will be able to accept or deny tracking cookies at the browser level, while websites will no longer be required to ask permission for cookies that do not infringe privacy laws.

NIS 2 Directive

On November 28, 2022, the Council adopted the Network and Information Systems 2 Directive (NIS 2, Directive (EU) 2022/2555). All EU member states must transpose NIS 2 into national law within 21 months of the Directive entering into force, so full implementation of the new Directive is not expected before late 2024.

NIS 2 aims to enhance cybersecurity risk management and introduces reporting obligations across industries in both the public and private sectors, for example transport, healthcare, energy, chemicals, food, and digital infrastructure. The goal of the Directive is to harmonize cybersecurity requirements across the member states by setting a minimum set of requirements for European countries, which will in turn facilitate cooperation between the authorities of the different states.

NIS 2 replaces the original NIS Directive (2016/1148), and there are significant differences between the two. Under the NIS Directive, the member states were responsible for determining which entities met the criteria to qualify as operators of essential services. NIS 2 introduces a one-size-fits-all approach under which general rules apply to these entities: medium and large organizations that provide services covered by the Directive will be treated the same.

Organizations affected by NIS 2 are expected to start preparing early, even before the Directive formally applies. In particular, these entities must plan the technical, organizational, and operational measures they will need, and be prepared to manage security risks and incidents related to the services or products they provide and to their business activities in general.

European Health Data Space

The European Health Data Space project was introduced in 2022, but given the challenges ahead, it will be a few years before it is fully in operation. The changes are happening gradually, with completion expected by the end of 2025.

The European Health Data Space focuses on individuals’ data in the healthcare sector. First, individuals will take digital control of their own health data: they will be able to freely access and move their health data at both national and EU level. The idea is to let European citizens access healthcare when traveling or living abroad as if they were in their home country. The initiative will also promote the use of data for better healthcare delivery, research, innovation, and policy making, unleashing the potential of health data through safer and more secure exchange and re-use.

However, adapting to the European Health Data Space poses many challenges. An important one is the inconsistency of healthcare digitalization across the member states: many countries may not be prepared for radical changes and will have difficulties adapting.

Digital Services Act (DSA) & Digital Markets Act (DMA)

In recent years, the European Commission introduced two Acts to protect users’ fundamental rights in the digital environment and to prevent unfair competition between very large and small businesses: the Digital Services Act (DSA) and the Digital Markets Act (DMA).

Digital Services Act (DSA)

After a few years of negotiations, the Digital Services Act (DSA) entered into force on November 16, 2022, and the process is expected to be completed by February 17, 2024. Over the course of 2023, the different provisions of the law will be gradually implemented across the European member states.

The DSA covers all services operating online, from small websites to very large online platforms. The focus of the Act, and thus of the businesses affected, is mainly on intermediary and hosting services, social networking and content-sharing platforms, and app stores. Platforms that link sellers with consumers (e.g. marketplaces, accommodation, or travel services) and very large platforms reaching more than 10% of European consumers are also affected.

The Act applies wherever there is a “substantial connection” between a business and the EU: the business has an establishment in the EU, has a significant number of users in the EU, or targets its business activities at one or more member states.

The DSA targets illegal or harmful content published on large communication platforms, for example Google or Facebook. These platforms must remove such content, following the principle that whatever is illegal offline should also be illegal online.

Organizations must be transparent, be in a position to demonstrate that transparency if asked, and cooperate with national authorities. They must also establish points of contact with these authorities and, where necessary, appoint legal representatives. Finally, their terms of service must be updated to take fundamental rights into account.

In general, hosting services and online platforms of all sizes must provide mechanisms that enable users to notify the organization of illegal content they encounter, so that it can take action. Among other obligations, online platforms must offer a mechanism for submitting and handling complaints, be transparent about how content is recommended, and be transparent about their advertising practices. Targeted advertising aimed at children, or based on users’ special characteristics, is also prohibited.

A business found to violate the DSA requirements can be fined up to 6% of its annual global revenue for the previous financial year. Violations of information obligations can be penalized with up to 1% of the previous year’s revenue or global turnover.

Specifically for very large platforms, the DSA requires them to prepare risk management strategies and be ready to respond to crises, including cooperating with the authorities during a crisis and sharing data with authorities and researchers.

Digital Markets Act (DMA)

The European institutions reached political agreement on the Digital Markets Act (DMA) in the first half of 2022. The Act entered into force on November 1, 2022, and its rules start to apply on May 2, 2023. Businesses affected by the Act must be able to demonstrate compliance by March 6, 2024 at the latest.

The Digital Markets Act (DMA) focuses on very large digital players, including businesses such as Facebook, Apple, Google, Spotify, and Microsoft. Under the DMA, the biggest, “gatekeeper” companies will be prevented from engaging in unfair competition. A “gatekeeper” is any company that is economically powerful, operates within the EU, targets markets in multiple member states, and has a strong impact on the European market. Gatekeepers also act as links between large numbers of users and businesses, and are expected to maintain a strong market position in the future.

The DMA imposes many requirements on these large and complex businesses. Among them: gatekeepers may not reuse data collected for specific purposes without the user’s consent, nor track users outside the gatekeeper’s platform for advertising purposes without consent. Gatekeepers are also prohibited from promoting their own products and services over equivalent third-party products or services on their platforms.

Pricing and fee transparency must be guaranteed in advertising intermediation services, and users will gain access to the marketing or advertising performance data collected on the platform. Users are guaranteed data portability to other systems and must be able to easily change their settings or uninstall the service at any time. Additionally, users must not be discouraged from lodging complaints with the authorities.

Gatekeeper businesses that do not comply with the DMA risk a fine of up to 10% of their annual global revenue, rising to 20% for repeated violations. In severe cases of repeated violations, the consequences may go beyond fines, with forced divestitures imposed or bans on acquiring other companies that provide digital services.

Upgrade your privacy approach

Book a free demo with one of our experts today! Don’t worry, they won’t bite.


Written by

Jessica Deneet

Privacy Consultant @ CRANIUM


Copyright © RESPONSUM BV
