Hands-Free Human-Computer Interface
Format
SOECS Senior Project Demonstration
Faculty Mentor Name
Dr. Rahim Khoie
Faculty Mentor Department
Electrical and Computer Engineering
Abstract/Artist Statement
Our project aims to improve accessibility to computers for people who are unable to use traditional human interface devices (HIDs), such as the keyboard and mouse, due to impairments such as tetraplegia. To achieve this goal, we designed and built a device that uses a camera to detect eye movement and classify specific facial expressions. The resulting classifications are converted into USB HID signals, which are then sent to a host device wirelessly via Bluetooth Low Energy (BLE).
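The abstract does not specify the exact report format behind these "USB HID signals"; the following is a minimal sketch, in Rust to match the project's firmware language, of the standard USB HID boot-protocol mouse and keyboard report layouts such a device would typically emit. The layouts come from the USB HID specification, not from the project itself.

```rust
/// Standard USB HID boot-protocol mouse report (3 bytes). This layout
/// is defined by the USB HID specification, not project-specific.
#[repr(C, packed)]
pub struct MouseReport {
    pub buttons: u8, // bit 0 = left, bit 1 = right, bit 2 = middle
    pub dx: i8,      // relative horizontal movement
    pub dy: i8,      // relative vertical movement
}

/// Standard USB HID boot-protocol keyboard report (8 bytes): a modifier
/// bitmask, a reserved byte, and up to six concurrently pressed keycodes.
#[repr(C, packed)]
pub struct KeyboardReport {
    pub modifiers: u8,
    pub reserved: u8,
    pub keycodes: [u8; 6],
}
```

Over BLE, these same report bytes are carried by the HID-over-GATT profile, which is what allows a host to treat the device as an ordinary Bluetooth keyboard and mouse.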
We implemented our project using YOLOv5 models exported to the ONNX format to perform the expression and gaze detection, together with firmware written in the Rust programming language. The firmware parses the output of the detection models, generates the appropriate keyboard and mouse signals based on the order in which expressions appear, and acts as a BLE server that appears to the host device as a generic Bluetooth keyboard and mouse. The models and firmware run on the RISC-V-based Kendryte K510 single-board computer, which is equipped with a neural processing unit. The K510 is powered by a 10,000 mAh battery charged from either a 10 W solar panel or a USB power supply. The resulting device met all design requirements and provided an efficient and effective computing interface for those with accessibility needs.
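The firmware source is not included with this abstract; the sketch below only illustrates, under assumed expression classes and key bindings, how an ordered sequence of detections might be translated into HID actions. The names (Expression, HidAction, action_for) and the specific expression-to-action mappings are hypothetical, not the project's actual ones.

```rust
/// Hypothetical expression classes standing in for the project's
/// actual YOLOv5 output labels.
#[derive(Clone, Copy, Debug)]
enum Expression {
    GazeLeft,
    GazeRight,
    Blink,
}

/// Simplified HID actions the firmware could encode as BLE reports.
#[derive(Debug)]
enum HidAction {
    MouseMove { dx: i8, dy: i8 },
    MouseClick { button: u8 }, // 0x01 = left button in the boot report
}

/// Map the most recent classified expressions to an action, modeling
/// the "order in which expressions appear" logic described above.
fn action_for(history: &[Expression]) -> Option<HidAction> {
    match history {
        // Two consecutive blinks -> left click (assumed binding).
        [.., Expression::Blink, Expression::Blink] => {
            Some(HidAction::MouseClick { button: 0x01 })
        }
        [.., Expression::GazeLeft] => Some(HidAction::MouseMove { dx: -5, dy: 0 }),
        [.., Expression::GazeRight] => Some(HidAction::MouseMove { dx: 5, dy: 0 }),
        _ => None,
    }
}

fn main() {
    let history = [Expression::GazeRight, Expression::Blink, Expression::Blink];
    // Prints: Some(MouseClick { button: 1 })
    println!("{:?}", action_for(&history));
}
```

Matching on the tail of the detection history, as shown here, is one simple way to make the emitted signal depend on the order in which expressions appear rather than on any single frame.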
Location
Chambers Technology Center, 3601 Pacific Ave, Stockton, CA 95211, USA
Start Date
6-5-2023 2:30 PM
End Date
6-5-2023 4:30 PM