Navigating daily tasks can be a daunting challenge for individuals with visual impairments, who often struggle with object identification – a fundamental aspect of decision-making and independence. Recognising this pressing need, a team of researchers from the National University of Singapore’s School of Computing (NUS Computing) has unveiled a groundbreaking solution: AiSee, an innovative wearable device powered by artificial intelligence (AI) that helps visually impaired individuals ‘see’ and interact with their surroundings.
For many, activities like grocery shopping are routine and effortless. However, for those living with visual impairments, such tasks can pose significant obstacles. The development of AiSee stems from a deep-seated commitment to harnessing technology for social good and addressing the challenges faced by individuals with disabilities.
Led by Associate Professor Suranga Nanayakkara from NUS Computing’s Department of Information Systems and Analytics, the research team embarked on a multi-year journey to conceptualise, design, and refine AiSee – a wearable assistive device tailored to the specific needs and preferences of visually impaired users.
At the heart of AiSee’s development lies a human-centred design approach, ensuring that the device seamlessly integrates into users’ lives while prioritising comfort, functionality, and discretion.
The concept of AiSee emerged from the team’s recognition of the limitations and stigmatisation associated with traditional assistive devices, such as glasses augmented with cameras. Understanding that many visually impaired individuals may be reluctant to use conspicuous devices, the researchers opted for a more discreet solution: a bone-conduction headphone that enables auditory interaction without drawing unwanted attention.
AiSee operates through three primary components: the ‘eye,’ the ‘brain,’ and the ‘speaker.’ The ‘eye’ comprises a micro-camera capturing the user’s field of view, while the ‘brain’ leverages state-of-the-art AI algorithms to process images in real time and identify objects. Through advanced speech recognition and text-to-speech technology, AiSee facilitates interactive question-and-answer exchanges, allowing users to seek additional information about their surroundings effortlessly.
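To make the three-component pipeline concrete, the sketch below shows how a capture-identify-speak loop might be wired together. This is not AiSee’s actual software, which has not been published: the `identify_object` function is a hypothetical placeholder for the device’s AI models, and the sketch assumes the OpenCV (`cv2`) and `pyttsx3` libraries for camera capture and speech output.

```python
# Illustrative sketch of a capture -> identify -> speak loop.
# NOTE: this is NOT AiSee's actual software; it is a hypothetical
# outline of the three components described above, using OpenCV as
# the 'eye' and pyttsx3 as the 'speaker'.

import cv2      # camera capture ('eye')
import pyttsx3  # offline text-to-speech ('speaker')


def identify_object(frame) -> str:
    """Hypothetical stand-in for the 'brain'.

    AiSee's real object-identification models are not public; a real
    implementation would run an image-recognition model on the frame.
    """
    return "a can of chicken soup"  # placeholder label


def main() -> None:
    camera = cv2.VideoCapture(0)  # default camera
    engine = pyttsx3.init()       # text-to-speech engine

    ok, frame = camera.read()     # grab one frame of the user's view
    if ok:
        label = identify_object(frame)  # 'brain' step
        engine.say(f"I see {label}.")   # spoken via bone conduction on the device
        engine.runAndWait()

    camera.release()


if __name__ == "__main__":
    main()
```

In the actual device, this loop would run continuously and be paired with speech recognition so users can ask follow-up questions about what the camera sees, as the article describes.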
Unlike many wearable assistive devices that rely on smartphone pairing for functionality, AiSee operates as a self-contained system, offering users unparalleled independence and convenience. Moreover, AiSee’s bone-conduction sound system ensures effective auditory feedback while preserving users’ awareness of surrounding sounds – an essential consideration for safety and situational awareness.
The development of AiSee represents a collaborative effort between NUS Computing researchers, corporate partners, and organisations dedicated to empowering individuals with disabilities. Through partnerships with SG Enable – an agency focused on promoting disability inclusion in Singapore – and the support of corporate benefactors, the research team aims to refine and enhance AiSee’s features, making it accessible to a broader audience.
Ms Ku Geok Boon, Chief Executive Officer of SG Enable, emphasised the transformative potential of AiSee and the importance of leveraging technology to empower persons with disabilities. By providing innovative solutions that enhance independence and quality of life, AiSee embodies the spirit of inclusivity and accessibility, aligning with SG Enable’s mission to create a more inclusive society.
As AiSee undergoes user testing and further refinements, it epitomises the power of interdisciplinary research and collaborative partnerships in driving positive social change. With its potential to revolutionise the lives of visually impaired individuals, AiSee stands as a testament to the profound impact of technology in fostering inclusivity, independence, and empowerment.