myNCBI Smart Hub
Nominated Award: Best Application of AI to achieve Social Good
Website of Company: https://www.ncbi.ie/smarthub/

NCBI, the National Council for the Blind of Ireland, is the country’s leading sight loss charity, supporting the approximately 55,000 people who are blind or vision impaired to live confidently and independently. Our suite of programmes provides practical and emotional support and rehabilitation, and opens pathways to education, employment, and full participation in community and public life. It also means raising awareness to ensure children and adults with significant sight loss have the same opportunities, rights, and choices as anyone else in society. NCBI works every day across the country with people of all ages, from young babies through to older members of our community. We provide services to approximately 7,000 people each year.
Cation Consulting
Cation Consulting is an AWS Consulting Partner that provides systems engineering, software services, and Conversational AI services. With seasoned experience across the Telecoms, Finance, Insurance, Edtech, and Travel industries, we aim to disrupt the end-user conversation, bringing the user closer to any data or service across multiple channels and in multiple languages.
Reason for Nomination
Societal Issue
What is the societal issue/problem to be addressed?
NCBI, a charity that works in conjunction with the HSE to provide services to blind and visually impaired Service Users, is aware that a portion of our demographic are often isolated and may not currently have the access or the skills (orientation and mobility, white cane skills) to visit our offices and avail of assessments, training, referrals, donations, and other services such as detailed medical information on sight loss and its conditions. This, coupled with the initial Covid lockdowns, further increased the need for NCBI to provide alternative avenues to our services.
We also run a large braille and digital library, for which we would post USB keys containing digital newspapers and magazines to our Service Users. This service requires considerable manpower and clerical staff to check each Service User’s preferences, update USB keys, and post them.
How does it manifest?
In our years of experience training people who are blind or who have sight loss and low vision conditions, we noted that while long-term Service Users are technically adept and can use screen readers and magnification to access digital materials, most of our demographic are new to sight loss, are typically in their 60s, and live with Macular Degeneration. This demographic would also be new to technology, and while they could learn to use Inclusive Technology, we needed a “path of least resistance” to technology for people in this scenario. Therefore, we opted to design and build the myNCBI Smart Hub on Alexa and Google Smart Speakers, as we knew from research that most of our Service Users already had one of these inexpensive devices in their homes, and Smart Speakers are a fraction of the cost of bespoke Inclusive Technology.
What is the scale of issue/problem?
The use of disruptive technology is paramount for organisations like NCBI to expand in the charity sector and to further reach and help people who are blind or live with visual impairments. As we have mentioned, there are 55,000 people registered as blind or living with visual impairments in Ireland. We know this figure is actually much higher, as many people will not log their personal disabilities in a census. By providing a mainstream solution, NCBI can greatly improve everyone’s access to services and information.
How does it affect people’s lives?
Living with a disability often leaves people isolated and removed from people and services. Digital solutions are often not fully accessible or compliant with WCAG 2.1 standards, so even people with disabilities who are well trained and use the correct assistive technology equipment can often be blocked from information and services. This can feel very disempowering. It is for this reason we sought a universal solution for anyone with any disability and any level of technical proficiency.
Solution
How will this issue/problem be addressed?
The myNCBI Smart Hub was designed to remove the barriers to entry for people with a disability and provide easy access to information on sight loss and engagement with NCBI services.
The myNCBI Smart Hub is a voice assistant AI-based application that runs on Amazon Alexa and Google Home. It allows Service Users and their families to access the latest information on sight loss by simply saying: “Alexa, launch myNCBI”, “Alexa, tell me about the symptoms of Macular Degeneration”, “Alexa, what are the symptoms of Nystagmus”, “Alexa, make an appointment with NCBI”, “Alexa, listen to the latest NCBI podcast”, “Alexa, play my library book” or “Alexa, donate to NCBI”. It is the first time an innovative voice AI-based platform such as this has been developed in the charity sector, and it offers a completely new medium for accessing charity services.
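To give a flavour of how such a voice request is answered on Alexa, the following is a minimal sketch of a skill request handler using the Alexa Skills Kit SDK for Python. The intent name, slot name, and response text are illustrative assumptions, not the production myNCBI code.

```python
# Illustrative sketch only: a minimal Alexa request handler for a hypothetical
# "ConditionSymptomsIntent" that answers questions such as
# "what are the symptoms of Nystagmus".
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name
from ask_sdk_core.handler_input import HandlerInput
from ask_sdk_model import Response

# Hypothetical lookup table; the real service draws on NCBI's content library.
SYMPTOMS = {
    "macular degeneration": "Early symptoms include blurred or distorted central vision.",
    "nystagmus": "The main symptom is involuntary, repetitive eye movement.",
}

class ConditionSymptomsHandler(AbstractRequestHandler):
    """Handles utterances asking about the symptoms of an eye condition."""

    def can_handle(self, handler_input: HandlerInput) -> bool:
        return is_intent_name("ConditionSymptomsIntent")(handler_input)

    def handle(self, handler_input: HandlerInput) -> Response:
        slots = handler_input.request_envelope.request.intent.slots
        condition = (slots["condition"].value or "").lower()
        speech = SYMPTOMS.get(condition, "I don't have information on that condition yet.")
        return handler_input.response_builder.speak(speech).response

sb = SkillBuilder()
sb.add_request_handler(ConditionSymptomsHandler())
handler = sb.lambda_handler()  # Entry point when deployed as an AWS Lambda function
```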
Note that the myNCBI Smart Hub is not built to be used solely by NCBI. The NCBI Labs team developed this concept to be completely white-labelled and implemented across multiple charities cheaply and quickly. This project will have a profound impact in supporting people with disabilities worldwide. We are already working with other charities to deploy the app to support their specific disability services.
Barriers to entry are significant for people with disabilities, such as lack of access to transport and low employment. In a virtual world, those with disabilities sometimes cannot afford laptops and computers. The myNCBI Smart Hub runs on a device that costs 40 euro, removing the financial and location barriers that restrict service and support for people with a disability.
Additional Information:
What role does AI play in the solution?
The myNCBI Smart Hub is built upon components of Cation’s Parly Conversational AI native AWS platform (parly.ai), including Amazon AWS Cognito IDP for user authentication, Salesforce integration for user helpdesk access, and Parly’s audio CMS overlay integration for Alexa and Google Assistant. Alexa and Google Assistant are both very real applications of artificial intelligence that are increasingly integral to our daily lives. Both rely on natural language generation and processing and machine learning, forms of artificial intelligence, in order to operate effectively and perform better over time.
How does AI enable the solution?
The myNCBI Smart Hub utilises the concept of Conversational Artificial Intelligence (CAI) embedded in Alexa and Google Assistant. CAI can be defined as an intelligent mechanism for imitating real-life human conversations. This game-changing technology is made possible because it is built on the sturdy foundation of machine learning (ML) and natural language processing (NLP). By feeding in extensive volumes of data, the Smart Hub can be trained to learn human interactions, giving it the power to recognise speech and text inputs while also translating their meaning into desired outputs. The Parly native AWS architecture is the infrastructure that integrates all the components of the solution, handling training and correction, monitoring usage, and generating reports.
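The “data feeding” described above happens largely through sample utterances defined in the assistant’s interaction model, which the platform’s ML/NLP uses to learn how spoken requests map to intents. Below is a small illustrative fragment of such a model, expressed here as a Python dictionary (it is normally authored as JSON); the intent names, slot type, and wording are assumptions for illustration only.

```python
# Illustrative sketch only: a fragment of an Alexa-style interaction model.
# The "samples" act as training data for the assistant's natural language understanding.
import json

interaction_model = {
    "interactionModel": {
        "languageModel": {
            "invocationName": "my ncbi",
            "intents": [
                {
                    "name": "ConditionSymptomsIntent",
                    "slots": [{"name": "condition", "type": "EyeCondition"}],
                    "samples": [
                        "what are the symptoms of {condition}",
                        "tell me about the symptoms of {condition}",
                        "describe {condition}",
                    ],
                },
                {"name": "MakeAppointmentIntent", "samples": ["make an appointment with NCBI"]},
                {"name": "PlayPodcastIntent", "samples": ["listen to the latest NCBI podcast"]},
            ],
            "types": [
                {
                    "name": "EyeCondition",
                    "values": [
                        {"name": {"value": "macular degeneration"}},
                        {"name": {"value": "nystagmus"}},
                    ],
                }
            ],
        }
    }
}

print(json.dumps(interaction_model, indent=2))  # The JSON that would be uploaded to the skill
```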
Is the approach novel/unique?
It is very difficult to describe the innovation in a technology without saying over and over that this is the first time it has been done and how important it will be to those with disabilities across the world. From our research, no other charity organisation is offering a platform that allows people with disabilities to access these types of services without expensive bespoke hardware and software. From this same research we have learned that private organisations also do not offer this level of support through a voice application, and because of this we have decided to white-label our app and make it available to other charities and NGOs.
Voice technology is a new and emerging field, and we found it could be adapted to our needs in NCBI. We sought developers who are at the cutting edge of this exciting new technology, and they helped us build the best possible voice app for people with disabilities. NCBI Labs then integrated the Amazon AWS Cognito IDP into all our existing platforms, which created a single user login for all our services.
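As a rough illustration of how a linked account can be resolved to a single Cognito identity, the sketch below uses the AWS SDK for Python to look up the user behind the access token that Alexa account linking passes to a skill. The region and the field choices are assumptions, not the production integration.

```python
# Illustrative sketch only: resolving a Service User from an Alexa account-linking
# access token via the Amazon Cognito user pool API.
import boto3

cognito = boto3.client("cognito-idp", region_name="eu-west-1")  # region is an assumption

def resolve_user(handler_input):
    """Return basic identity details for the linked Cognito account, if any."""
    token = handler_input.request_envelope.context.system.user.access_token
    if not token:
        return None  # Account not linked yet; the skill could prompt the user to link it.
    user = cognito.get_user(AccessToken=token)
    attributes = {attr["Name"]: attr["Value"] for attr in user["UserAttributes"]}
    return {"username": user["Username"], "email": attributes.get("email")}
```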
NCBI Labs has legacy software such as Salesforce and our Helpdesk. We had to ensure that all data from the app and the IDP would sync to Salesforce and that all interactions with our Service Users would be recorded.
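A minimal sketch of that sync, using the simple-salesforce library, is shown below. The custom object and field names (for example "Smart_Hub_Interaction__c") and the credentials are hypothetical placeholders for illustration; the production sync is handled by the Parly/NCBI Labs integration.

```python
# Illustrative sketch only: recording a Smart Hub interaction in Salesforce.
from simple_salesforce import Salesforce

sf = Salesforce(
    username="integration@ncbi.example",  # placeholder credentials
    password="********",
    security_token="********",
)

def log_interaction(cognito_username: str, intent_name: str, utterance: str) -> None:
    """Create a record of what the Service User asked the Smart Hub."""
    sf.Smart_Hub_Interaction__c.create({
        "Cognito_Username__c": cognito_username,
        "Intent__c": intent_name,
        "Utterance__c": utterance,
    })
```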