DEKRA to conduct conformity assessments on high-risk biometric systems under AI Act

The EU AI Act is approaching its August 2026 deadline, by which providers of high-risk AI systems, including biometric systems, must ensure compliance. Assessment and certification firm Dekra has announced that it is the first officially authorized organization to conduct conformity assessments of sensitive AI biometric systems under the rulebook.
The multinational company now offers assessments of remote biometric identification systems, which identify individuals at a distance; emotion recognition systems, which analyze biometric data to infer a person’s emotions or intentions; and biometric categorization systems, which classify individuals based on physical traits or behavioral attributes. All three technologies are classified as high-risk under the AI Act.
Dekra received the accreditation from the Dutch Accreditation Council (RvA). The company is also accredited to issue eIDAS compliance certificates for trusted service providers.
“Being the first laboratory accredited under the EU AI Act means that manufacturers of AI Biometric Systems can rely on us to navigate the most demanding regulatory requirements – with confidence that their products meet the bar for security, reliability, and digital trust,” says Fernando Hardasmal, Dekra executive vice president and head of Digital & Product Solutions.
August 2026 is a significant milestone for the AI Act, as the majority of its rules come into force and enforcement begins. Another important deadline is August 2027, when developers will have to comply with additional obligations for high-risk AI embedded in regulated products.
Companies can expect hefty fines for certain prohibited AI practices: up to 35 million euros (US$40.7 million) or seven percent of worldwide annual turnover, whichever is higher.
Dekra is providing the AI biometric system assessment as part of its portfolio of Digital Trust Services.