EG-Net: Appearance-based eye gaze estimation using an efficient gaze network with attention mechanism

Keywords: Computer science, Gaze, Artificial intelligence, Convolutional neural network, Computer vision, Face (sociological concept), Eye movement, Feature (linguistics), Set (abstract data type), Pose, Task (project management), Pattern recognition (psychology), Social science, Linguistics, Philosophy, Management, Sociology, Economics, Programming language
Authors
Xinmei Wu, Lin Li, Haihong Zhu, Gang Zhou, Linfeng Li, Fei Su, Shen He, Yang‐Gang Wang, Xue Long
Source
Journal: Expert Systems With Applications [Elsevier BV]
Volume 238, Article 122363 · Cited by: 7
Identifier
DOI: 10.1016/j.eswa.2023.122363
Abstract

Gaze estimation, which has a wide range of applications in many scenarios, is a challenging task due to various unconstrained conditions. As information from both full-face and eye images is instrumental in improving gaze estimation, many multiregion gaze estimation models have been proposed in recent studies. However, most of them simply use the same regression method on both eye and face images, overlooking that the eye region may contribute more fine-grained features than the full-face region, and that the variation between an individual's left and right eyes caused by head pose, illumination, and partial eye occlusion may lead to inconsistent estimates. To address these issues, we propose an appearance-based end-to-end learning network architecture with an attention mechanism, named efficient gaze network (EG-Net), which employs a two-branch network for gaze estimation. Specifically, a base CNN is utilized for full-face images, while an efficient eye network (EE-Net), which is scaled up from the base CNN, is used for left- and right-eye images. EE-Net uniformly scales up the depth, width and resolution of the base CNN with a set of constant coefficients for eye feature extraction and adaptively weights the left- and right-eye images via an attention network according to their "image quality". Finally, features from the full-face image, the two individual eye images and head pose vectors are fused to regress the eye gaze vectors. We evaluate our approach on three public datasets, and the proposed EG-Net model achieves much better performance. In particular, our EG-Net-v4 model outperforms state-of-the-art approaches on the MPIIFaceGaze dataset, with prediction errors of 2.41 cm and 2.76 degrees in 2D and 3D gaze estimation, respectively. It also improves performance to 1.58 cm on the GazeCapture dataset and 4.55 degrees on the EyeDIAP dataset, representing 23.4% and 14.2% improvements over prior art on the two datasets, respectively. The code related to this project is open-source and available at https://github.com/wuxinmei/EE_Net.git.
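The two-branch design described in the abstract (a face CNN, a shared eye network whose left- and right-eye features are re-weighted by an attention module, and a fusion with head pose for gaze regression) can be illustrated with a minimal PyTorch sketch. This is not the authors' released code (see the repository linked above): the backbone modules, feature dimensions, and names such as `EGNetSketch`, `EyeAttention`, and `tiny_backbone` are placeholder assumptions for illustration only.

```python
# Minimal sketch of the two-branch idea in the abstract, assuming PyTorch.
# Not the authors' implementation: backbones, feature sizes, and the attention
# head below are placeholders chosen only to make the structure concrete.
import torch
import torch.nn as nn


def tiny_backbone(out_dim: int) -> nn.Module:
    """Placeholder stand-in for the base CNN (face branch) or the scaled EE-Net (eye branch)."""
    return nn.Sequential(
        nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(64, out_dim),
    )


class EyeAttention(nn.Module):
    """Scores the left- and right-eye features so a low-quality eye can be down-weighted."""

    def __init__(self, feat_dim: int):
        super().__init__()
        self.score = nn.Linear(feat_dim, 1)

    def forward(self, left: torch.Tensor, right: torch.Tensor) -> torch.Tensor:
        # One scalar score per eye, normalized across the two eyes.
        weights = torch.softmax(
            torch.cat([self.score(left), self.score(right)], dim=1), dim=1
        )  # shape (B, 2)
        return weights[:, :1] * left + weights[:, 1:] * right


class EGNetSketch(nn.Module):
    """Face branch + shared eye branch + head pose, fused and regressed to a gaze vector."""

    def __init__(self, face_dim: int = 512, eye_dim: int = 512, pose_dim: int = 2):
        super().__init__()
        self.face_net = tiny_backbone(face_dim)   # base CNN on the full-face crop
        self.eye_net = tiny_backbone(eye_dim)     # shared by the left- and right-eye crops
        self.eye_attn = EyeAttention(eye_dim)
        self.regressor = nn.Sequential(
            nn.Linear(face_dim + eye_dim + pose_dim, 256),
            nn.ReLU(inplace=True),
            nn.Linear(256, 2),                    # e.g. gaze yaw and pitch
        )

    def forward(self, face, left_eye, right_eye, head_pose):
        f_face = self.face_net(face)
        f_eye = self.eye_attn(self.eye_net(left_eye), self.eye_net(right_eye))
        return self.regressor(torch.cat([f_face, f_eye, head_pose], dim=1))


if __name__ == "__main__":
    model = EGNetSketch()
    gaze = model(
        torch.randn(4, 3, 224, 224),   # full-face crops
        torch.randn(4, 3, 112, 112),   # left-eye crops
        torch.randn(4, 3, 112, 112),   # right-eye crops
        torch.randn(4, 2),             # head pose vectors
    )
    print(gaze.shape)                  # torch.Size([4, 2])
```

Note that the abstract also states that EE-Net is obtained by uniformly scaling the base CNN's depth, width and resolution with constant coefficients; that compound scaling is not modeled in this sketch, where both branches reuse the same placeholder backbone.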