Biometrics
Computer science
Artificial intelligence
Computer vision
Speech recognition
Pattern recognition (psychology)
Authors
Yitao Qiao, Wenxiong Kang, Dacan Luo, Junduan Huang
Identifier
DOI: 10.1109/tpami.2025.3564514
Abstract
Hand-based multimodal biometrics have attracted significant attention due to their high security and performance. However, existing methods fail to adequately decouple the various hand biometric traits, limiting the extraction of unique features. Moreover, effective feature extraction for multiple hand traits remains a challenge. To address these issues, we propose a novel method for the precise decoupling of hand multimodal features, called 'Normalized-Full-Palmar-Hand', and construct an authentication system based on it. First, we propose HSANet, which accurately segments the various hand regions under diverse backgrounds based on low-level details and high-level semantic information. Next, we use HSANet to establish two hand multimodal biometric databases: SCUT Normalized-Full-Palmar-Hand Database Version 1 (SCUT_NFPH_v1) and Version 2 (SCUT_NFPH_v2). These databases include full hand images, semantic masks, and images of various hand biometric traits obtained from the same individual at the same scale, totaling 157,500 images. Third, we propose the Full Palmar Hand Authentication Network framework (FPHandNet) to extract unique features of multiple hand biometric traits. Finally, extensive experiments on the publicly available CASIA, IITD, and COEP databases, as well as our proposed databases, validate the effectiveness of our methods. The SCUT_NFPH_v1 and SCUT_NFPH_v2 databases are available at https://github.com/SCUT-BIP-Lab/NFPH.
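The abstract describes HSANet as segmenting hand regions by combining low-level details with high-level semantic information. The sketch below illustrates that general two-branch fusion idea in PyTorch; it is a minimal, hypothetical example, and the class name TwoBranchSegNet, the layer widths, and the four-class labeling are assumptions for illustration, not the authors' actual HSANet architecture.

```python
# Hypothetical sketch (not the authors' HSANet): a two-branch segmentation
# network that fuses low-level detail features with high-level semantic
# features to predict per-pixel hand-region labels.
import torch
import torch.nn as nn
import torch.nn.functional as F

def conv_bn_relu(in_ch, out_ch, stride=1):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )

class TwoBranchSegNet(nn.Module):
    def __init__(self, num_classes=4):
        super().__init__()
        # Detail branch: shallow and high resolution, keeps low-level cues.
        self.detail = nn.Sequential(
            conv_bn_relu(3, 32, stride=2),
            conv_bn_relu(32, 64),
        )
        # Semantic branch: deeper and strongly downsampled, captures context.
        self.semantic = nn.Sequential(
            conv_bn_relu(3, 32, stride=2),
            conv_bn_relu(32, 64, stride=2),
            conv_bn_relu(64, 128, stride=2),
        )
        self.fuse = conv_bn_relu(64 + 128, 128)
        self.classifier = nn.Conv2d(128, num_classes, 1)

    def forward(self, x):
        d = self.detail(x)    # 1/2 resolution, low-level detail features
        s = self.semantic(x)  # 1/8 resolution, high-level semantic features
        s = F.interpolate(s, size=d.shape[2:], mode="bilinear", align_corners=False)
        out = self.classifier(self.fuse(torch.cat([d, s], dim=1)))
        # Upsample logits back to input resolution for per-pixel region labels.
        return F.interpolate(out, size=x.shape[2:], mode="bilinear", align_corners=False)

if __name__ == "__main__":
    net = TwoBranchSegNet(num_classes=4)   # e.g. background plus three hand regions (assumed)
    masks = net(torch.randn(1, 3, 256, 256))
    print(masks.shape)                     # torch.Size([1, 4, 256, 256])
```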