EDPB Adopts Guidelines on Calculation of GDPR Fines and on Facial Recognition Technology in Law Enforcement

WilmerHale Privacy and Cybersecurity Law Blog

On May 16, 2022, the European Data Protection Board (EDPB), the independent body of data protection supervisors that promotes consistent data protection rules and application thereof throughout the European Union (EU), announced the adoption of new guidelines on the calculation of administrative fines under the General Data Protection Regulation (GDPR) and new guidelines on the use of facial recognition technology in the area of law enforcement. Both sets of guidelines are subject to public consultation until June 27, 2022.

Guidelines on the Calculation of Administrative Fines Under GDPR

The EDPB’s Guidelines 04/2022 on the calculation of administrative fines under the GDPR aim to establish a single methodology for European data protection authorities (DPAs) to use in calculating a fine under the GDPR. Identifying the “starting point” for the calculation of a fine requires considering three elements: (1) the “categori[z]ation of infringements by nature under Articles 83(4)–(6) GDPR”; (2) the “seriousness of the infringement pursuant to Article 83(2) GDPR”; and (3) the “turnover of the undertaking,” i.e., the revenue of the business.

From this starting point, the guidelines set out the calculation methodology in five steps:

  • Step 1: DPAs establish whether the case concerns one or more instances of sanctionable conduct, and whether they have led to one or multiple infringements (see Chapter 3 of the guidelines);
  • Step 2: DPAs find the “starting point” for the calculation of fines, based on the methodology noted above (see Chapter 4 of the guidelines);
  • Step 3: DPAs evaluate aggravating and mitigating factors that can increase or decrease the amount of the fine (see Chapter 5 of the guidelines);
  • Step 4: DPAs identify the relevant legal maximums for the different processing operations as set out in Article 83(4)–(6) GDPR; increases applied in other steps cannot exceed these maximums (see Chapter 6 of the guidelines); and
  • Step 5: DPAs analyze whether the final amount of the calculated fine meets the requirements of effectiveness, dissuasiveness, and proportionality, as required by Article 83(1) GDPR, and whether further adjustments are necessary (see Chapter 7 of the guidelines).

This methodology aims to establish a uniform approach to enforcement penalties across DPAs, to increase transparency and consistency in the calculation of fines, and to move towards convergence of average penalties. The EDPB’s approach as expressed in the guidelines will be tested before the Court of Justice of the European Union, which is currently dealing with several referrals regarding fines.

Guidelines on the Use of Facial Recognition Technology in Law Enforcement

The EDPB’s guidelines on the use of facial recognition technology in law enforcement provide guidance to EU and national lawmakers and law enforcement authorities (LEAs) on implementing and using facial recognition technology (FRT) systems. The guidelines note that “[FRT] has direct or indirect impact on a number of fundamental rights and freedoms . . . that may go beyond privacy and data protection, such as human dignity, freedom of movement, freedom of assembly, and others” (¶ 101). To safeguard the right to the protection of personal data and the right to privacy, the guidelines invoke the principles outlined in the EU Charter of Fundamental Rights (the “Charter”), which requires such tools to be used only if necessary and proportionate, and the Law Enforcement Directive (LED), which governs the processing of personal data in the law enforcement context (see Chapter 3.2 of the guidelines).

The guidelines also repeat the EDPB’s prior call for a ban on the use of facial recognition technology in certain cases, specifically:

  • Remote biometric identification of individuals in publicly accessible spaces;
  • Facial recognition systems categorizing individuals, based on their biometrics, into clusters according to ethnicity, gender, political or sexual orientation, or other grounds for discrimination;
  • Facial recognition or similar technologies used to infer the emotions of a natural person; and
  • Processing of personal data in a law enforcement context that would rely on a database populated by collection of personal data on a mass scale and in an indiscriminate way, e.g., by “scraping” photographs and facial pictures accessible online.

If adopted, these guidelines will affect the requests for data, software, and other technology that the EU and LEAs within it can make of private companies.

In the context of the European Commission’s proposal for a Regulation laying down harmonized rules on artificial intelligence (Artificial Intelligence Act), the EU is also currently debating whether to prohibit certain forms of ‘real time’ remote biometric identification systems (Article 5 of the proposal). The EDPB and the European Data Protection Supervisor have published Joint Opinion 5/2021 on this proposal.

