Mobility in the 21st Century – Blog Part Two

Post date: Monday, 6 March 2017

In the second of his two blogs about technology and mobility, Robin Christopherson, Head of Digital Inclusion at AbilityNet, discusses the apps and gadgets that will be available in the not-too-distant future for people with sight loss.

Read Part One of Robin's mobility blog

 
Good vibrations 
 
My feeling is that we are on the verge of a proliferation of smart wearables, devices that can be worn as clothing or implants, to help us in almost every aspect of our lives. Smart wearables aimed at helping mobility for people with sight loss will be no exception. Soon, talking, tapping or vibrating shoes, glasses or even rings won’t strike us as out of the ordinary.
 
There are currently a number of devices in development that use vibrations to help blind and partially sighted people get around. Haptic feedback, or "haptics", refers to the vibrations that devices are now being designed to give off to convey information to the user.
 
One example is Haptic Eyes, smart glasses designed to send vibrations through the frames to warn the user of nearby obstacles. Their creator, scientist Nguyen Ba Hai, plans to develop the sensors further so they can recognise different colours.
 
Then there are the rather intriguing haptic shoes seen in action here. The shoes, known as Le Chal, which means "take me there" in Hindi, tap your feet as you approach a flight of stairs or when it’s time to turn a corner. It might sound amusing, but my Apple Watch already taps me as I approach a junction (it imitates a car’s indicator sound and taps my wrist in time with it), so why should vibrating shoes be any more surprising?
 

Autonomous vehicles

 

Driverless cars are just around the corner. The insurance industry wants them, the Government wants them, and I personally can’t wait to get behind the wheel, or in the passenger seat, or in the back seat (whatever takes my fancy) and take our next family car for a spin.

There have been concerns voiced about whether people with sight loss will be able to use driverless cars, but my view is that as full autonomy is at the very heart of the future model of driverless cars, we have nothing to fear. Sighted and blind people alike will be able to hail cars and similarly send them away to park themselves. We will have the choice to own a car or ‘call’ one to use temporarily at will.
 
Elon Musk, CEO of Tesla, the American electric car manufacturer, recently announced that all of its new vehicles will be built with the hardware for full autonomy. He said users simply need to await changes in legislation to be able to take advantage of this feature. The additional sensors are inexpensive to include, and once the self-driving software has been developed for the higher-end models, it will cost nothing to include it in models across the range.
 

Wayfindr – the open solution to mobility

 
The benefits of having a smartphone that can take you to the very spot you need to be can’t be overstated. In Part One of my mobility blog, I introduced the concept of beacons, which can help people navigate their environment and pinpoint their exact location to within 10cm.
 

The challenge with beacons, however, is that streets and buildings need to be ‘beaconised’ for them to work. Luckily, there are many commercial reasons why this is already being done. Many shops have beaconised their stores to let people know about the latest hot discounts, to show where public toilets and lifts can be found, and to guide people to the specific aisle they need in order to find a jar of Marmite (or any other spread) in the supermarket.

While many shops have been beaconised, developers have used different standards to determine how these devices broadcast their information and communicate with your smartphone. This has led to fragmentation and poor take-up of the technology to date. You also need to have the relevant app installed.
 
To solve this problem, Wayfindr, which recently won awards at the AbilityNet Tech4Good Awards and the Vision Pioneer Awards, is creating an open standard that all companies and transport authorities can readily embed in their apps to assist users with particular mobility needs. The standard is essentially a set of instructions and code that can be built into an app, such as the Transport for London (TfL) app.
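
For developers who are curious, here is a minimal sketch of what the beacon side of such an app can look like on an iPhone, using Apple’s Core Location framework to listen out for nearby Bluetooth beacons. The beacon UUID and station name below are hypothetical placeholders, and this is not the Wayfindr standard itself, which also defines the wording and timing of the guidance an app should give.

import CoreLocation

// A minimal sketch of beacon ranging on iOS. The UUID and identifier
// are hypothetical placeholders for illustration only.
class BeaconListener: NSObject, CLLocationManagerDelegate {

    private let locationManager = CLLocationManager()
    private let stationRegion = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "12345678-1234-1234-1234-123456789ABC")!,
        identifier: "example-underground-station")

    func start() {
        locationManager.delegate = self
        locationManager.requestWhenInUseAuthorization()   // location permission is required
        locationManager.startRangingBeacons(in: stationRegion)
    }

    // Called roughly once a second with the beacons currently in range, nearest first.
    func locationManager(_ manager: CLLocationManager,
                         didRangeBeacons beacons: [CLBeacon],
                         in region: CLBeaconRegion) {
        guard let nearest = beacons.first else { return }
        // A real navigation app would map the major/minor values to a place
        // (for example "bottom of escalator 3") and speak an instruction to the user.
        print("Nearest beacon, major: \(nearest.major), minor: \(nearest.minor)")
    }
}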
 
With this approach, I hope to see many more stations, bus stops and buildings beaconised in the future, with many apps taking advantage of the freely available functionality – and maybe Apple and Google will build it into their phones.
 

Unlocking cities for the blind

 
Cities Unlocked is a partnership project between Microsoft and Guide Dogs which has created technology that draws on beacons to give people useful information about where they are and what’s around them. Microsoft put a lot of effort and ingenuity into making a headset with a built-in compass that allows the wearer to listen to a 360-degree audio description of where they are. This is amazing because the wearer can literally hear the direction of bus stops, building entrances and landmarks as they are described.
 
Here’s hoping that as the technology develops it might also adopt the new Wayfindr standard to give us the best of both worlds: the Cities Unlocked 3D-soundscape headset able to access every one of the numerous beacons that we hope will soon be popping up everywhere.
 

Taking object recognition to the next level

 
I’ve previously mentioned a number of object recognition apps, such as Talking Goggles and AIPoly, which have considerable capability when it comes to recognising objects and text.
 
Microsoft has been leading the way with an app called Seeing AI. Microsoft developer Saqib Shaikh, who lost his sight at the age of seven, has demonstrated the app with Pivothead smart glasses, which have a built-in camera and can provide a wide range of information about the wearer’s surroundings, spoken on demand.
 

The glasses can tell the wearer about objects in their surroundings, and about people’s ages, genders, facial expressions and even facial hair. They can also read out the text of a restaurant menu. All this happens automatically, without the smartphone even being taken out of the user’s pocket.

The app is based upon Microsoft’s new suite of deep learning tools, called Cognitive Services, which are now available to app developers, enabling them to take advantage of significant advances in the area broadly termed ‘artificial intelligence’.
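
To give a flavour of what developers can now do, here is a rough sketch of how an app might ask the Cognitive Services Computer Vision service to describe a photo, in the spirit of what Seeing AI does. The region in the URL and the subscription key are placeholders that would come from an Azure Cognitive Services account, and the exact endpoint and response format may differ from what Seeing AI itself uses.

import Foundation

// A rough sketch only: posts a photo to the Computer Vision "describe"
// endpoint and reads back the best caption. The region and key are placeholders.
func describeImage(_ imageData: Data, completion: @escaping (String?) -> Void) {
    let url = URL(string: "https://westus.api.cognitive.microsoft.com/vision/v1.0/describe")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/octet-stream", forHTTPHeaderField: "Content-Type")
    request.setValue("YOUR-SUBSCRIPTION-KEY", forHTTPHeaderField: "Ocp-Apim-Subscription-Key")
    request.httpBody = imageData

    URLSession.shared.dataTask(with: request) { data, _, _ in
        // The service replies with JSON containing candidate captions such as
        // "a group of people standing at a bus stop", each with a confidence score.
        guard let data = data,
              let object = try? JSONSerialization.jsonObject(with: data),
              let json = object as? [String: Any],
              let description = json["description"] as? [String: Any],
              let captions = description["captions"] as? [[String: Any]],
              let caption = captions.first?["text"] as? String else {
            completion(nil)
            return
        }
        completion(caption)   // e.g. hand this string to VoiceOver or a speech synthesiser
    }.resume()
}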
 
I can’t wait to get my hands on Seeing AI when it is finally released and I’m similarly excited about the other apps and services that this readily-available source of artificial cleverness will undoubtedly bring.
 