Abstract: Sight and touch are the basic sensory systems through which humans interact with their environment. For blind amputees, a key challenge is how to convey environmental information quickly and intuitively so that they can regain their daily living abilities. Inspired by human auditory localization, we constructed a virtual scene that closely mirrors the real one and attached a virtual sound source to the interactive object. Using a spatial audio rendering (SAR) method, the three-dimensional movement of the virtual sound source can be simulated in real time. On this basis, a myoelectric prosthetic control system was developed to assist blind amputees in their daily activities. A Fitts' law target-localization test was performed with both the SAR and voice prompt (VP) guidance methods; the results indicate that SAR significantly improves the information transfer rate. Prosthetic control tests show that SAR halves the completion time compared with VP while restoring a natural grasping path. With its intuitive and rich perceptual feedback, SAR shows potential for helping blind amputees rebuild both the control and sensory loops.
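For illustration only, the following is a minimal sketch (not the authors' implementation) of spatializing a moving virtual sound source using simple interaural time and level difference (ITD/ILD) cues in Python/NumPy. The sample rate, head radius, and the render_block helper are assumptions made for this example; a full SAR system would typically use HRTF-based binaural rendering rather than these simplified cues.

```python
# Minimal ITD/ILD spatialization sketch (assumed parameters, not the paper's SAR renderer).
import numpy as np

FS = 44100                # sample rate in Hz (assumed)
HEAD_RADIUS = 0.0875      # average head radius in metres
SPEED_OF_SOUND = 343.0    # m/s

def render_block(mono, azimuth_rad):
    """Spatialize one block of mono audio for a source at the given azimuth
    (0 = straight ahead, +pi/2 = listener's right). Per-block processing is a
    simplification; a real-time renderer would interpolate delays smoothly."""
    # Woodworth-style ITD: extra propagation time to the far ear
    itd = (HEAD_RADIUS / SPEED_OF_SOUND) * (abs(azimuth_rad) + np.sin(abs(azimuth_rad)))
    delay = int(round(itd * FS))
    delayed = np.concatenate([np.zeros(delay), mono])[:len(mono)]
    # Constant-power pan law provides the level difference between ears
    pan = np.clip(azimuth_rad / (np.pi / 2), -1.0, 1.0)   # -1 = left, +1 = right
    g_left = np.cos((pan + 1) * np.pi / 4)
    g_right = np.sin((pan + 1) * np.pi / 4)
    if azimuth_rad >= 0:   # source on the right: the left (far) ear is delayed
        left, right = delayed * g_left, mono * g_right
    else:                  # source on the left: the right (far) ear is delayed
        left, right = mono * g_left, delayed * g_right
    return np.stack([left, right], axis=1)

# Example: sweep a 440 Hz tone from left to right in short blocks,
# mimicking block-wise real-time rendering of a moving source.
t = np.arange(FS) / FS
tone = 0.2 * np.sin(2 * np.pi * 440 * t)
blocks = []
for i, chunk in enumerate(np.split(tone, 100)):
    az = np.deg2rad(-80 + 1.6 * i)       # azimuth moves from -80 to about +78 degrees
    blocks.append(render_block(chunk, az))
stereo = np.concatenate(blocks)          # shape (FS, 2), ready for stereo playback
```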
Keywords: Object and Obstacle detection, Convolutional Neural Network (CNN)
| DOI: 10.17148/IJARCCE.2023.12572