Grasping
Artificial intelligence
Human–computer interaction
Haptic perception
Robot
Computer vision
Computer science
Perception
Robotics
Object (grammar)
Robot end effector
Psychology
Neuroscience
Programming language
Identification
DOI:10.31979/etd.34cf-wyx3
Abstract
Humans are highly skilled object manipulators, and their capability relies on adaptability. They can change their grasp while interacting with an object, use touch sensation to gauge the amount of pressure required to grasp it, and use vision to decide where to place their fingers for an effective grasp. In some situations, humans can even perform manipulation blindly. Although manipulation comes naturally to humans, it remains a challenging task in robotics. The challenges include the design of the robot's end effector, grasp planning, capturing useful data from sensors, and formulating an effective control strategy. End-effector design has progressed toward anthropomorphic designs that aim to match the dexterity of human hands. Following the recent surge of developments in image processing and computer vision, many researchers have exploited robots' visual perception to perform object manipulation. The use of tactile perception to derive a control strategy, however, remains comparatively unexplored. A few researchers have suggested that combining tactile information with visual perception can improve the performance of object manipulation. In this thesis, we explore ways to merge tactile perception data with proprioception data to derive object manipulation control strategies. We sought to determine whether partial proprioception information can be complemented with tactile data for manipulation. Our results indicate that tactile data can classify objects with distinct geometric shapes and estimate relative object position during interaction between the object and the robot's end effector.
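To illustrate the kind of shape classification from tactile data that the abstract describes, the sketch below builds a nearest-centroid classifier over hand-crafted contact features (total force, contact area, peakiness) extracted from simulated 4×4 tactile pressure frames. All sensor dimensions, noise levels, and the `synthetic_frame` data generator are hypothetical assumptions for illustration, not the thesis's actual sensors or method.

```python
import numpy as np

rng = np.random.default_rng(0)

def synthetic_frame(shape, noise=0.02):
    """Generate a hypothetical 4x4 tactile pressure frame.
    A 'box' contacts the pad over a broad flat band; a 'sphere'
    produces a localized central peak. Entirely illustrative data."""
    frame = np.zeros((4, 4))
    if shape == "box":
        frame[1:3, :] = 0.8  # broad flat contact band
    else:  # sphere
        frame[1:3, 1:3] = np.array([[0.5, 0.9], [0.9, 0.5]])  # central peak
    return frame + rng.normal(0, noise, frame.shape)

def features(frame):
    """Simple contact descriptors: total force, contact area, peakiness."""
    active = frame > 0.1
    return np.array([frame.sum(), active.sum(),
                     frame.max() / (frame.mean() + 1e-9)])

# Fit per-class centroids from a few noisy samples (nearest-centroid classifier).
classes = ["box", "sphere"]
centroids = {c: np.mean([features(synthetic_frame(c)) for _ in range(20)], axis=0)
             for c in classes}

def classify(frame):
    """Assign the class whose feature centroid is closest in Euclidean distance."""
    f = features(frame)
    return min(classes, key=lambda c: np.linalg.norm(f - centroids[c]))
```

A flat box contact spreads force over many cells with low peakiness, while a sphere concentrates it, so even these three coarse features separate the two shapes cleanly; real tactile arrays would need richer features and a learned model.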