Summary: Researchers pioneered “acoustic touch” technology, allowing individuals with blindness or low vision to “see” using unique sound icons. These smart glasses translate visual data into distinct auditory cues.
Trials revealed that the glasses notably improved users’ abilities to detect and reach objects.
This groundbreaking tech offers a new avenue of sensory augmentation, promoting greater independence and life quality for the visually impaired.
Key Facts:
- “Acoustic touch” translates visual cues into unique sound representations, such as rustling for plants or buzzing for mobile phones.
- A study in PLOS ONE showed that the tech significantly boosted blind or low-vision users’ object recognition without excess mental strain.
- An estimated 39 million people worldwide are blind, with another 246 million having low vision.
Source: University of Technology Sydney
Australian researchers have developed cutting-edge technology known as “acoustic touch” that helps people “see” using sound. The technology has the potential to transform the lives of those who are blind or have low vision.
Around 39 million people worldwide are blind, according to the World Health Organisation, and an additional 246 million people live with low vision, impacting their ability to participate in everyday life activities.
The next-generation smart glasses, which translate visual information into distinct sound icons, were developed by researchers from the University of Technology Sydney and the University of Sydney, together with Sydney start-up ARIA Research.
“Smart glasses typically use computer vision and other sensory information to translate the wearer’s surroundings into computer-synthesized speech,” said Distinguished Professor Chin-Teng Lin, a global leader in brain-computer interface research from the University of Technology Sydney.
“However, acoustic touch technology sonifies objects, creating unique sound representations as they enter the device’s field of view. For example, the sound of rustling leaves might signify a plant, or a buzzing sound might represent a mobile phone,” he said.
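Conceptually, the approach amounts to a simple mapping: when head-mounted computer vision detects an object inside the glasses’ field of view, the system triggers the sound icon assigned to that object class. The Python sketch below illustrates the idea only; the object labels, icon files, and field-of-view threshold are placeholders and do not reflect the study’s actual implementation.

```python
from dataclasses import dataclass

# Illustrative mapping from detected object classes to auditory icons.
# These labels and sound files are assumptions, not the study's actual set.
AUDITORY_ICONS = {
    "plant": "rustling_leaves.wav",
    "mobile_phone": "buzzing.wav",
    "cup": "clinking.wav",
}

@dataclass
class Detection:
    label: str           # object class reported by the vision model
    azimuth_deg: float   # horizontal angle relative to where the head is pointing

FOV_HALF_ANGLE = 30.0    # hypothetical field-of-view half-width, in degrees

def icons_to_play(detections):
    """Return the auditory icons to trigger for objects inside the field of view."""
    cues = []
    for det in detections:
        in_view = abs(det.azimuth_deg) <= FOV_HALF_ANGLE
        if in_view and det.label in AUDITORY_ICONS:
            cues.append((AUDITORY_ICONS[det.label], det.azimuth_deg))
    return cues

if __name__ == "__main__":
    frame = [Detection("plant", -12.0), Detection("mobile_phone", 55.0)]
    for icon, azimuth in icons_to_play(frame):
        # A real device would render this as spatialized audio at the object's bearing;
        # here we only print which cue would fire. The phone at 55 degrees is outside
        # the assumed field of view, so it produces no cue.
        print(f"play {icon} at {azimuth:+.0f} degrees")
```

In the actual device the cues are delivered as head-tracked spatial audio, so the sound appears to come from the object’s direction rather than simply being played back.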
A study into the efficacy and usability of acoustic touch technology to assist people who are blind, led by Dr Howe Zhu from the University of Technology Sydney, has just been published in the journal PLOS ONE.
The researchers tested the device with 14 participants: seven individuals with blindness or low vision and seven blindfolded sighted individuals who served as a control group.
They found that the wearable device, equipped with acoustic touch technology, significantly enhanced the ability of blind or low-vision individuals to recognise and reach for objects, without a significant increase in cognitive workload.
“The auditory feedback empowers users to identify and reach for objects with remarkable accuracy,” said Dr Zhu. “Our findings indicate that acoustic touch has the potential to offer a wearable and effective method of sensory augmentation for the visually impaired community.”
The research underscores the importance of developing assistive technology for overcoming everyday challenges such as locating specific household items and personal belongings.
By addressing these day-to-day challenges, the acoustic touch technology opens new doors for individuals who are blind or have low vision, enhancing their independence and quality of life.
With ongoing advancements, the acoustic touch technology could become an integral part of assistive technologies, supporting individuals to access their environment more efficiently and effectively than ever before.
About this visual neuroscience and neurotech research news
Author: Leilah Schubert
Source: University of Technology Sydney
Contact: Leilah Schubert – University of Technology Sydney
Image: The image is credited to Neuroscience News
Original Research: Open access.
“An investigation into the effectiveness of using acoustic touch to assist people who are blind” by Chin-Teng Lin et al. PLOS ONE
Abstract
An investigation into the effectiveness of using acoustic touch to assist people who are blind
Wearable smart glasses are an emerging technology gaining popularity in the assistive technologies industry. Smart glasses aids typically leverage computer vision and other sensory information to translate the wearer’s surroundings into computer-synthesized speech.
In this work, we explored the potential of a new technique known as “acoustic touch” to provide a wearable spatial audio solution for assisting people who are blind in finding objects. In contrast to traditional systems, this technique uses smart glasses to sonify objects into distinct auditory icons when the object enters the device’s field of view.
We developed a wearable Foveated Audio Device to study the efficacy and usability of using acoustic touch to search, memorize, and reach items. Our evaluation study involved 14 participants: 7 blind or low-vision participants and 7 blindfolded sighted participants who served as a control group.
We compared the wearable device to two idealized conditions: a verbal clock-face description and a sequential audio presentation through external speakers.
We found that the wearable device can effectively aid the recognition and reaching of an object. We also observed that the device does not significantly increase the user’s cognitive workload.
These promising results suggest that acoustic touch can provide a wearable and effective method of sensory augmentation.