Robotics
Facial expression
Computer science
Inverse kinematics
Artificial intelligence
Kinematics
Expression (computer science)
Image (mathematics)
Computer vision
Action (physics)
Human-robot interaction
Artificial neural network
Classical mechanics
Physics
Quantum mechanics
Programming language
Authors
Tao Shen,Man Zhang,Ting Wu
Identifier
DOI:10.1080/09544828.2023.2301231
Abstract
This study introduces a facial expression-driven interaction method for small quadruped robots, organized into three layers: Perception (L1), Analysis (L2), and Expression (L3). L1 handles data acquisition and image stabilization, L2 handles model training and emotion classification, and L3 handles control feedback and interactive movements. The core NXEIK network model comprises NAFNet for image stabilization, Mini-Xception for facial recognition, EANet for action mapping, and an inverse kinematics model. Validation on a self-designed robot platform demonstrated the method's ability to stabilize jittery image data and to outperform deep learning networks such as ViT and ResNet-101 in facial expression classification. The NXEIK model enables the robot to adapt its movements to various expressions using only three parameter types. This research provides a feasible solution for enhancing human-robot interaction and movement design for flexible quadruped robots through facial expressions.
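The abstract's L3 layer maps recognized expressions to robot movements through an inverse kinematics model. As an illustration only (not the paper's NXEIK implementation), the sketch below solves the standard two-link planar inverse kinematics problem for a single quadruped leg; the function name, link lengths, and foot target are hypothetical values chosen for the example.

```python
import math

def two_link_leg_ik(x, y, l1, l2):
    """Planar two-link inverse kinematics for one leg.

    Returns (hip, knee) joint angles in radians that place the foot at
    (x, y) in the hip frame, using the elbow-down solution.
    Hypothetical sketch; not the NXEIK model from the paper.
    """
    # Squared distance from hip to foot target.
    d2 = x * x + y * y
    # Law of cosines gives the knee angle; reject unreachable targets.
    cos_knee = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if not -1.0 <= cos_knee <= 1.0:
        raise ValueError("foot target is outside the leg's workspace")
    knee = math.acos(cos_knee)
    # Hip angle: direction to the target minus the offset introduced by the shank.
    hip = math.atan2(y, x) - math.atan2(l2 * math.sin(knee),
                                        l1 + l2 * math.cos(knee))
    return hip, knee

# Hypothetical leg geometry: 0.10 m thigh, 0.10 m shank; foot target 0.05 m
# ahead of and 0.12 m below the hip.
hip, knee = two_link_leg_ik(0.05, -0.12, 0.10, 0.10)
print(f"hip = {math.degrees(hip):.1f} deg, knee = {math.degrees(knee):.1f} deg")
```

In a pipeline like the one described, such a solver would sit at the end of the L3 stage, converting expression-conditioned foot targets into joint commands.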