Accessible technology and the role of Artificial Intelligence

Post date: Wednesday, 20 December 2017
Robin Spinks

A recent technology conference showcased the latest work tech companies are doing to make accessibility a top priority for the next generation of digital services. Robin Spinks reports. 

From smartphones and tablets to interactive public kiosks and online banking, digital technology is becoming embedded in every aspect of our lives. 
But for the almost one billion people worldwide who have a disability, there are challenges in gaining access to our increasingly digital, connected world. And even with tech companies like Apple and Google building more accessibility functionality into their products, the majority of apps created remain inaccessible to many disabled people. 
Fortunately, a growing range of mainstream accessibility solutions is emerging, and the importance of inclusive design is becoming more widely understood across the tech industry and among policymakers. Legislation such as the Equality Act 2010, which brought together existing anti-discrimination legislation to help simplify the law, also serves as a catalyst for positive societal change. 
This was the focus of a joint AbilityNet and RNIB conference, TechShare Pro, held at IBM Southbank in central London last month. 

Artificial intelligence 

We are now seeing the mainstream adoption of voice-driven artificial intelligence (AI) computing devices from companies like Amazon and Google, for example, Amazon Echo and Google Home.
For many disabled people, having a simple, affordable, voice-activated means of operating a digital device is lowering the barrier to entry when it comes to technology. But it is fundamentally important that the needs of disabled citizens are understood and incorporated into service design. 


Speaking at the conference, Jeremy Waite from IBM, the American multinational technology company, described at length the company's remarkable work on IBM Watson. 
IBM Watson, named after IBM’s first Chief Executive Thomas J Watson, uses AI to help companies crunch and make sense of enormous quantities of data, and to use the resulting insights to support intelligent decision-making. The premise of Watson is that no human alone can read, see, feel, hear and make sense of all the data transforming their work and profession.
Watson was originally developed to answer questions on the US quiz show ‘Jeopardy!’, where contestants have to come up with the right questions to answers that are already supplied. In 2011 it won against former champions of the show. Watson had access to 200 million web pages, including the full text of Wikipedia, although it was not connected to the internet during the game. 
Real-life implementations of Watson are mind-blowing. Right now it is in use around the world in insurance and banking. It is also used by a leading lift manufacturer, where Watson analyses data from millions of lifts around the world and helps to keep people moving safely and smoothly.  

Understanding and anticipating what the user wants 

Delegates at the conference also heard from accessibility experts from the BBC and Barclays, both of whom spoke about the impact that AI is having on their respective businesses. 
Both mentioned the opportunity AI offers to create a user interface which itself understands and anticipates the needs of the user. For example, imagine a banking app which not only tracks your spending but also anticipates your outgoings based on your typical spending patterns. The app could tell you when to ease off on your purchasing because of imminent direct debits or standing orders. Functionality like this was once the stuff of Star Trek; today it is coming to fruition in a way that will enrich all of our lives. 
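The anticipation described above can be sketched in a few lines of code. This is a minimal illustration, not any bank's actual system: it assumes the app already knows your balance and a list of upcoming direct debits, and it simply warns you when payments due in the next week would exceed what you have.

```python
from datetime import date, timedelta

def spending_alert(balance, upcoming_payments, today, horizon_days=7):
    """Warn when known direct debits due within the horizon would
    exceed the current balance. All names and figures are illustrative."""
    cutoff = today + timedelta(days=horizon_days)
    due_soon = sum(amount for due, amount in upcoming_payments
                   if today <= due <= cutoff)
    if due_soon > balance:
        return (f"Ease off: £{due_soon:.2f} of payments are due by "
                f"{cutoff.isoformat()}, but your balance is £{balance:.2f}.")
    return "You're on track."

# Hypothetical example: £120 balance, £150 of direct debits due this week
payments = [(date(2017, 12, 22), 100.0), (date(2017, 12, 26), 50.0)]
print(spending_alert(120.0, payments, today=date(2017, 12, 20)))
```

A real app would draw the payment list from transaction history and learned spending patterns; the principle, comparing known outgoings against the available balance, is the same.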

Voice assistants

Conversational interfaces are digital products like the Amazon Echo or Google Home that mimic chatting to a real human. Right now, there are two types of conversational interface: voice assistants, which you talk to, and chatbots, which you type to.
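At their simplest, both kinds of conversational interface work the same way underneath: take a message in, work out the intent, send a reply back. The sketch below, which is purely illustrative, shows a tiny rule-based chatbot; a voice assistant adds speech-to-text on the way in and text-to-speech on the way out.

```python
def chatbot_reply(message):
    """A minimal rule-based chatbot: match keywords in typed text
    and return a canned reply. Replies here are made up for illustration."""
    text = message.lower()
    if "bill" in text:
        return "Your latest bill is ready. Would you like me to read it out?"
    if "balance" in text:
        return "Let me check your balance for you."
    return "Sorry, I didn't understand that."

print(chatbot_reply("Can I see my bill?"))
```

Production systems replace the keyword matching with machine-learned intent recognition, but the conversational loop is the same shape.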
Former RNIB digital accessibility superstar Kiran Kaja delivered Google's keynote address. Kiran now works as Technical Program Manager for Accessibility at Google Search, and he spoke about Google Home and the Google Assistant, both of which leverage the power of AI. 
Google Home is essentially a smart speaker which can be purchased from just £89 here in the UK. It enables you to get answers, play songs, check the news, enjoy your entertainment and control your smart home devices like your thermostat with just your voice. 
Critically, there’s almost no learning curve and you don’t need much computing or technology knowledge to get started. Many blind and low vision people are already enjoying it and competitor products like Amazon Echo. 


RNIB chaired a panel discussion on conversational interfaces which brought together experts from AbilityNet and O2. O2 talked about their really exciting partnership with RNIB, which will see the launch of the MyO2 skill for Amazon Echo users. The new skill allows Echo owners to retrieve and interrogate their latest O2 phone bill using just their voice – removing the need for paper or digital text to read.
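To give a flavour of how an Echo skill answers with speech, here is a hypothetical handler in the shape such a skill might take. This is not O2's actual implementation; the bill figure is invented, and only the standard Alexa response envelope (a JSON body with `outputSpeech`) reflects how Echo skills really reply.

```python
def build_alexa_response(speech_text):
    """Build the standard JSON envelope an Alexa skill returns,
    so the Echo device speaks the answer aloud."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech_text},
            "shouldEndSession": True,
        },
    }

def handle_bill_intent(latest_bill_pounds):
    # In a real skill this figure would come from the operator's
    # billing system; the value passed in here is made up.
    return build_alexa_response(
        f"Your latest bill is {latest_bill_pounds:.2f} pounds.")

resp = handle_bill_intent(24.50)
print(resp["response"]["outputSpeech"]["text"])
```

Because the answer is delivered entirely as speech, a skill like this removes any need to read a printed or on-screen bill.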
TechShare Pro 2017 was sponsored by Google, IBM, Microsoft, Barclays, Orcam and Storm Interface. 
Robin Spinks is RNIB’s Innovation and Technology Relationships Manager.
