In the third video in his series on inclusion and accessibility, John Galloway considers some of the ways in which technology can support learners with physical disabilities. He briefly surveys a range of ways of controlling devices: alternatives to the keyboard and mouse (once the predominant means of operating the devices we use daily), voice control, blink and gaze control and, lastly, brain implants, which look to the future but have already been developed. A transcript of the video is available here (or below).
Transcript:
The distance between our minds and our machines is shrinking. Now, why is that important?
For people with physical disabilities, it means that the opportunities available through the use of technology are amplified. When we first started using personal computers our interaction was determined by a keyboard and a mouse. So our capacity with them was determined by our dexterity with a keyboard and a mouse.
More recently we’ve become quite used to operating touch screens, so the variety of operational commands has increased: we can now tap and swipe. We don’t need the same kind of control as we might with a keyboard.
There have always been alternatives to the keyboard and mouse, for instance switches: big buttons that you tap. Typically a light will scan across the screen and when it comes to the point on the screen you want to activate, say a letter on a keyboard, you hit the button. Stephen Hawking was a switch user. He used, for a while, a system known as ‘sip and puff’: basically, he sucked and he blew, and that made a light scan his keyboard, then he selected what he wanted. And that’s how he wrote many of his books and delivered his lectures.
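To make the scanning idea concrete, here is a minimal Python sketch of single-switch linear scanning. The targets, scan interval and timings are illustrative assumptions, not taken from the video: a highlight steps through the items at a fixed rate, and whichever item it is resting on when the switch is pressed is the one selected.

# A minimal sketch of single-switch linear scanning, with illustrative items and timing.
# The highlight steps through the targets at a fixed interval; whichever target is
# highlighted at the moment the switch is hit is the one that gets selected.

ITEMS = list("ABCDEFGHIJ")   # the on-screen targets, e.g. keys of an on-screen keyboard
SCAN_INTERVAL = 1.0          # seconds the highlight rests on each target (assumed value)

def item_under_highlight(press_time: float) -> str:
    """Return the target being highlighted when the switch is hit at press_time seconds."""
    step = int(press_time // SCAN_INTERVAL) % len(ITEMS)   # the scan wraps around and repeats
    return ITEMS[step]

# Example: the user hits the switch 7.3 seconds after scanning starts.
print(item_under_highlight(7.3))   # prints 'H', the eighth target in the cycle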
We are also, all of us now, using our mouths to operate our machines, in a slightly different way. We talk to our devices, we give them commands, we tell them to remember things for us, we ask them questions, we tell them to connect us to a friend on the phone. And so the distance between the computer and what we are thinking about is now the distance between our mouths and our brains.
Even more recently we’ve developed technologies where we control the device with our eyes. We look at the screen and when we find the area we want to activate, we blink: possibly the smallest motion the human body can make. In fact, we don’t even have to blink; we can just keep our gaze in one place, hovering on that spot, and the computer understands that that’s what we want to activate and, ‘click,’ it does it for us. So now the distance is down to the distance between the eyes and the mind.
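Dwell selection can be sketched in a similar way. The following Python sketch assumes a stream of timestamped gaze samples; the dwell time and jitter radius are illustrative values, not from the video. If the gaze stays within a small radius of one spot for long enough, that spot is treated as a ‘click’.

import math

# A minimal sketch of dwell ('hover to click') selection, assuming gaze samples
# arrive as (time, x, y) tuples. Thresholds are illustrative assumptions.

DWELL_TIME = 0.8   # seconds the gaze must hold still to trigger a 'click'
RADIUS = 40.0      # pixels of jitter tolerated around the dwell point

def detect_dwell(samples):
    """Return the (x, y) of the first dwell click in the samples, or None if there is none."""
    anchor = None                        # where the current dwell attempt started
    start = None                         # when it started
    for t, x, y in samples:
        if anchor is None or math.dist((x, y), anchor) > RADIUS:
            anchor, start = (x, y), t    # gaze has moved: restart the dwell timer here
        elif t - start >= DWELL_TIME:
            return anchor                # gaze held still long enough: treat as a click
    return None

# Example: the gaze wanders, then settles near one spot for about a second.
gaze = [(0.0, 120, 80), (0.2, 480, 310), (0.4, 505, 295), (0.7, 498, 302), (1.1, 502, 299)]
print(detect_dwell(gaze))   # prints (480, 310), the point where the dwell began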
And we are becoming very used to the notion of mind control, with brain implants. Literally, we think, and our technology does what we are thinking about. It is making our thoughts tangible.
So the way in which technology – the way in which our ingenuity with technology – is shrinking that distance means that technology is no longer, literally, at arm’s length; now we can work from inside our own heads. Many of those disabilities disappear, and the opportunities that our devices offer us become available to people with physical disabilities in a way that they haven’t been previously.