Researcher argues for more data to tap ECG biometrics’ potential
Heartbeat biometrics have come a long way over the past two decades, as explained to attendees of an event hosted by the European Association for Biometrics, but more data will be needed to bring the modality’s theoretical inherent advantages to practical applications.
The latest EAB lunch talk was titled ‘I know you by heart: The past and future of ECG biometrics,’ and presented by João Ribeiro Pinto of the University of Porto.
Pinto is a leading researcher in electrocardiogram (ECG) biometrics for the identification of drivers and passengers inside vehicles.
Pinto introduced the concepts behind ECG biometrics, including the five waveforms that trace the electrical currents that make the heart contract and relax. The current that drives the heart in this way spreads through other cells and can therefore be measured elsewhere on the body with electrodes.
Because the heartbeat is affected by a range of factors, including emotions, fatigue and stress, it can be used to detect these characteristics.
Pinto also compared ECG biometrics to other modalities: because the heartbeat is completely universal, difficult to circumvent, and relatively distinctive and permanent, it is seen as effective for identification purposes.
Scenarios that require near-constant contact between the subject and a surface that can contain an electrode, like health watches and steering wheels, are the most popular for ECG biometrics.
ECG biometrics were initially based on fiducial (landmark-based) measurements of time, amplitude and angles, Pinto notes.
The first step in ECG biometrics is noise reduction, as the signal can be obscured by a range of elements, including the subject’s breathing. Feature extraction follows signal preparation.
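The pipeline described here, denoising followed by fiducial feature extraction, can be illustrated with a toy sketch in Python. The moving-average filter, threshold peak detector, and synthetic spike-train "ECG" below are simplifying assumptions for illustration, not the methods from the talk:

```python
import numpy as np

def denoise(signal, window=5):
    """Suppress high-frequency noise with a simple moving average
    (a stand-in for the filtering used in real ECG pipelines)."""
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="same")

def extract_fiducial_features(signal, fs=250.0, threshold=0.1):
    """Detect peaks by thresholding and return the intervals between
    them in seconds: one example of a fiducial (time-based) feature."""
    above = signal > threshold
    # Rising edges mark the start of each candidate peak.
    edges = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    return np.diff(edges) / fs

# Synthetic "ECG": one spike per second plus noise, sampled at 250 Hz.
fs = 250
t = np.arange(5 * fs)
clean = (t % fs == 0).astype(float)
rng = np.random.default_rng(0)
noisy = clean + rng.normal(0.0, 0.03, t.size)

rr = extract_fiducial_features(denoise(noisy), fs=fs)
print(rr)  # intervals close to 1.0 s
```

In a real system the extracted intervals, amplitudes and angles would then be compared against an enrolled template.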
ECG data captured through the traditional configuration of electrodes on the body in medical settings is now known as ‘on-the-person’ signals, Pinto says. Beyond the intrusive electrode placement, this method is impractical for real-world biometric applications because the subject must lie down and remain still for the duration of the capture.
The common method for collecting ECG biometrics today involves more comfortable configurations with fewer metallic electrodes, allowing relatively free movement. These are referred to, somewhat misleadingly, as ‘off-the-person.’
This less intrusive method, however, introduces more noise and variability. The change prompted researchers to seek ways of measuring whole samples of ECG data, rather than individual heartbeats.
Efficient noise suppression and continuous authentication are regular features of ECG biometrics, and the latest work in the field applies deep learning and end-to-end models to replace the traditional ECG signal pipeline. Convolutional neural networks (CNNs) have shown higher biometric accuracy when applied to the whole ECG process than when applied after separate denoising, preparation, and feature extraction steps.
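The contrast with the traditional pipeline can be sketched as a toy forward pass: the raw segment goes straight into convolutional filters, with no separate denoising or fiducial extraction stage. The random "learned" filters and tiny network below are illustrative assumptions, not a real trained CNN:

```python
import numpy as np

def conv1d(x, kernels):
    """Valid-mode 1-D convolution of signal x with each kernel row."""
    return np.stack([np.convolve(x, k, mode="valid") for k in kernels])

def embed(segment, kernels):
    """Toy end-to-end forward pass: raw ECG segment in, identity
    embedding out, with no hand-crafted features in between."""
    feat = np.maximum(conv1d(segment, kernels), 0.0)  # conv + ReLU
    return feat.mean(axis=1)                          # global average pool

rng = np.random.default_rng(1)
kernels = rng.normal(size=(4, 16))   # stand-ins for learned filters
segment = rng.normal(size=250)       # one second of "raw" ECG at 250 Hz
embedding = embed(segment, kernels)
print(embedding.shape)  # (4,)
```

A trained network would learn filters that suppress noise and extract identity cues jointly, which is why end-to-end models can outperform the staged pipeline.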
The field was constrained at first, however, by the lack of pre-trained models that could fill the role ImageNet or FaceNet play in face biometrics. Researchers found that ECG signals rendered as two-dimensional representations could be matched, but not with the same degree of accuracy as the holistic CNN process.
Pinto also reviewed efforts toward explainability, and recent studies showing that ECG biometrics can in theory be spoofed, with signals injected into commercial solutions as a presentation attack. Heartbeat signals remain one of the hardest personal traits to steal, however: capturing a usable ECG signal would require contact with both of the subject’s hands for at least a couple of seconds, he says.
The ‘secure triplet loss’ model improves template security by introducing cancelability and unlinkability without additional processes, according to Pinto.
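As a rough illustration of the idea (not Pinto's actual formulation), a standard triplet loss can be combined with a user-specific random projection serving as the cancelable transform: revoking the key and issuing a new one produces a fresh, unlinkable template.

```python
import numpy as np

def cancelable_template(embedding, key_matrix):
    """Project the embedding with a user-specific random key. A leaked
    template can be revoked by replacing the key, and templates made
    with different keys are unlinkable."""
    return key_matrix @ embedding

def triplet_loss(anchor, positive, negative, margin=0.5):
    """Standard triplet loss: pull same-person templates together,
    push different-person templates at least `margin` apart."""
    d_pos = np.linalg.norm(anchor - positive)
    d_neg = np.linalg.norm(anchor - negative)
    return max(0.0, d_pos - d_neg + margin)

rng = np.random.default_rng(2)
key = rng.normal(size=(8, 16)) / np.sqrt(16)   # per-user cancelable key
person_a = rng.normal(size=16)                 # hypothetical ECG embedding
anchor   = cancelable_template(person_a, key)
positive = cancelable_template(person_a + rng.normal(0, 0.01, 16), key)
negative = cancelable_template(rng.normal(size=16), key)
loss = triplet_loss(anchor, positive, negative)
print(loss)
```

The actual secure triplet loss trains the network and the key binding jointly so that security does not require a separate protection step; this sketch only shows the cancelable-projection and triplet-loss ingredients.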
Golden age ahead
Despite these improvements, and the potential of ECG biometrics as a modality that includes inherent liveness and is resistant to spoof attacks, heartbeats are hardly revolutionizing the biometrics field.
Practical performance is underwhelming, lab results are hard to replicate, and commercial applications remain rare.
Pinto suggests that the main problem at this point is a lack of data. Larger and more realistic datasets (from off-the-person setups), uniform benchmarks, and pretrained models would help, as would realistic evaluation setups and multimodal databases.