Field
AI-Enabled Solutions
Date
4-23-2026
Abstract
This research develops a customized robot arm capable of performing household tasks by pairing a human-like robot hand with visual and tactile sensors. An AI-powered vision-tactile-action model allows the robot to use these sensors to interact with objects, with the ultimate goal of performing object-based household tasks through vision and tactile sensing, using the hand as the end-effector of a humanoid robot while communicating interactively with humans.
The novelty of our proposed robot hand lies in its two thumbs, which pan and tilt on opposite sides of the palm, and its two fixed tendon-driven fingers, allowing it to grasp handheld objects efficiently and effectively, unlike typical robot hands with a single thumb. A variety of sensors work in unison to provide the robot with information: the main vision sensor is a fixed depth camera powered by a Jetson Orin Nano, smaller cameras in the wrist mitigate the effects of camera occlusion, and tactile sensors cover the fingers and palm.
The hand's artificial intelligence model is ViTAL (visual-tactile-action-language), a learning model that allows the different sensor modalities to work together. The machine learning framework lets the robot identify and grasp objects by building on tactile sensing systems such as FeelSight and GelSight, vision models such as DINO, and capacitive and pressure sensors.
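To illustrate the multimodal fusion idea behind a vision-tactile-action model, the sketch below projects each sensor stream into a shared embedding space, concatenates the embeddings, and maps the result to an action vector. This is a minimal illustration only: the abstract does not specify ViTAL's architecture, so every dimension, weight, and function name here is a hypothetical stand-in, with random projections in place of trained encoders.

```python
import numpy as np

# Hypothetical dimensions; the real ViTAL model's sizes are not given in
# the abstract, so these values are purely illustrative.
VISION_DIM, TACTILE_DIM, LANG_DIM = 64, 16, 32
EMBED_DIM, ACTION_DIM = 48, 7  # e.g., a 7-DoF arm command

rng = np.random.default_rng(0)

def linear(in_dim, out_dim):
    """Random projection standing in for a trained encoder layer."""
    return rng.standard_normal((in_dim, out_dim)) / np.sqrt(in_dim)

# One projection per modality maps each sensor stream into a shared space.
W_vision, W_tactile, W_lang = (linear(d, EMBED_DIM)
                               for d in (VISION_DIM, TACTILE_DIM, LANG_DIM))
W_action = linear(3 * EMBED_DIM, ACTION_DIM)

def vital_step(vision_feat, tactile_feat, lang_feat):
    """Fuse the three modality embeddings and emit an action vector."""
    fused = np.concatenate([
        np.tanh(vision_feat @ W_vision),
        np.tanh(tactile_feat @ W_tactile),
        np.tanh(lang_feat @ W_lang),
    ])
    return fused @ W_action

# One control step: raw features stand in for camera, tactile, and
# language-instruction encodings.
action = vital_step(rng.standard_normal(VISION_DIM),
                    rng.standard_normal(TACTILE_DIM),
                    rng.standard_normal(LANG_DIM))
print(action.shape)  # (7,)
```

In a real system each random projection would be replaced by a trained encoder (e.g., a vision backbone, a tactile-image network, a language model), with the fusion and action head learned end-to-end from demonstrations.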
Recommended Citation
Casey, James; Mallappa, Harshitha; and Lee, Dongbin, "Artificial Intelligence-powered Novel ViTAL* Hand Robot for Domestic Tasks" (2026). Pacific Innovation and Entrepreneurship Summit (PIES). 33.
https://scholarlycommons.pacific.edu/pies/33