Computer science
Artificial intelligence
Pattern recognition (psychology)
Feature (linguistics)
Face (sociological concept)
Near-infrared spectroscopy
Feature extraction
Extractor
Translation (biology)
Pixel
Computer vision
Engineering
Process engineering
Sociology
Quantum mechanics
Physics
Social science
Gene
Messenger RNA
Chemistry
Biochemistry
Linguistics
Philosophy
Authors
Huijiao Wang, Haijian Zhang, Lei Yu, Li Wang, Xulei Yang
Identifier
DOI:10.1109/icassp40776.2020.9054007
Abstract
Visible and near-infrared (VIS-NIR) face recognition remains a challenging task due to the distinctions between the spectral components of the two modalities. Inspired by CycleGAN, this paper presents a method for translating between VIS and NIR face images. To achieve this, we propose a new facial-feature-embedded CycleGAN. Firstly, to learn modality-specific features while preserving the common facial representation shared by the VIS and NIR domains, we employ a general facial feature extractor (FFE) to extract effective features; here, MobileFaceNet is pre-trained on a VIS face database and serves as the FFE. Secondly, domain-invariant feature learning is enhanced by a new pixel consistency loss. Lastly, we establish a new WHU VIS-NIR database, which includes variations in face rotation and expression, to enrich the training data. Experimental results on the Oulu-CASIA and our WHU VIS-NIR databases show that the proposed FFE-based CycleGAN (FFE-CycleGAN) outperforms some state-of-the-art methods and achieves 96.5% accuracy.
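The abstract does not give the exact loss formulations, so the following is a minimal PyTorch-style sketch of how a facial-feature-embedded CycleGAN objective of this kind might be assembled. The generators `G_vis2nir` and `G_nir2vis`, the frozen extractor `ffe` (standing in for the VIS-pretrained MobileFaceNet), the use of L1 distances, and all loss weights are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

l1 = nn.L1Loss()

def generator_losses(vis, nir, G_vis2nir, G_nir2vis, ffe,
                     lambda_cyc=10.0, lambda_pix=1.0, lambda_feat=1.0):
    """Generator-side losses for one batch of VIS/NIR face images (sketch)."""
    fake_nir = G_vis2nir(vis)       # VIS -> NIR translation
    fake_vis = G_nir2vis(nir)       # NIR -> VIS translation
    rec_vis = G_nir2vis(fake_nir)   # VIS -> NIR -> VIS cycle
    rec_nir = G_vis2nir(fake_vis)   # NIR -> VIS -> NIR cycle

    # Standard CycleGAN cycle-consistency term.
    loss_cyc = l1(rec_vis, vis) + l1(rec_nir, nir)

    # Assumed "pixel consistency" term: translated images stay close to their
    # inputs at the pixel level so that spatial structure is preserved.
    loss_pix = l1(fake_nir, vis) + l1(fake_vis, nir)

    # Facial-feature embedding term: the frozen FFE should produce similar
    # identity features for a face before and after translation.
    with torch.no_grad():
        feat_vis = ffe(vis)         # targets only, no gradient needed
        feat_nir = ffe(nir)
    loss_feat = l1(ffe(fake_nir), feat_vis) + l1(ffe(fake_vis), feat_nir)

    # The usual CycleGAN adversarial terms for the VIS and NIR discriminators
    # would be added to this total; they are omitted here for brevity.
    return lambda_cyc * loss_cyc + lambda_pix * loss_pix + lambda_feat * loss_feat
```

Keeping the FFE frozen and comparing its embeddings of real and translated faces is one plausible way to preserve identity across the VIS-NIR translation, which is the role the abstract assigns to the facial feature extractor.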