This course will cover all of the Android accessibility features up to version 4.3 on both tablet and phone devices. This course does not contain code for developers and is intended to help people learn how and why accessibility features are used. This course will cover:
1. Vision Features
2. Auditory Features
3. Physical and Fine Motor Features
4. Language Features
Hello, my name is Sami Rahman. This is Accessibility Features on the Android Platform. I am co-founder of bridgingapps.org. We are a community of parents, teachers, therapists, doctors, and people with disabilities exchanging information on how to use mobile device technology, like the Android platform, with people who have disabilities.
I am also the author of 'Androids for Special Needs,' a book that's coming out this fall. I am also the parent of a special needs child, Noah, who is five years old now. Noah has autism and cerebral palsy, and when he was two, he was below par cognitively and in language.
We got him a mobile device, a tablet, and in conjunction with therapy and a really good skills-based program, we were able to bring his cognitive and language skills well above par, and in a very short amount of time. He's now a couple of years above par.
In this presentation we're going to go through all the accessibility features that are on the Android platform, whether that's a tablet or a phone. Let's get started.
The objectives of the course are six-fold. First, we're going to go through features that apply to visual impairments. Then we're going to go through features for hearing impairment, features for fine motor and physical impairment.
Then we're also going to focus on language features. At the end of the course I'm going to show you a couple of extra developers you want to keep an eye out for, and a resource list.
When we talk about the Android platform, unlike let's say the iOS platform where you have a single manufacturer who is creating both the hardware and the software, the Android platform is completely different. You have different manufacturers creating hardware, you have different manufacturers creating software. There certainly are some pros and cons when it comes to taking that into consideration when you're building a system around people who are going to use this to help manage a disability or help increase skill.
On the pro side, it's open source. That generally means that you have open standards of data exchange, whether that's media or text or whatever. It's not proprietary. It also means that a lot of people can be involved in the development, which yields rapid growth and very innovative technology that can be adopted fairly quickly.
What you also have is a lot of different hardware choices. Particularly when it comes to hardware, this can be a great benefit in terms of selecting the right hardware for your user. Every user, whether they're typical or not typical, has usability desires and usability issues that you have to take into account.
Having multiple hardware options can be a huge advantage in really customizing the experience for the user, for this particular user.
Now, on the con side, rapid growth often means rapid change. I'll give you an example. Android 4.0 launched less than a year ago, and they're already on 4.3. Even before the manufacturers have had time to process the new version and get it onto the hardware they've manufactured, there's a new version on top.
This again can be a very big positive, because new features come up very rapidly. But it can also mean that you have to keep on top of that.
Now the other flip side of this is that because there are a lot of hardware manufacturers, when a new software version is released, each hardware manufacturer has to pick up that piece of software, understand it, and customize it for their particular hardware: the CPUs they're using, the media cards, all the other subsystems that make up that tablet.
Now, here's the scenario that often happens. You buy a piece of hardware, and two years later the manufacturer has moved on to another generation of hardware and they're not updating the previous generation of hardware.
Now, this happens much less with major manufacturers like Samsung, Asus, manufacturers like that. However, when you have off-brands, particularly cheaper off-brands, once they release the hardware and software they release it once, and they don't release updates to that hardware generally speaking.
When you're selecting your devices, take that into account. If you're not very technical, a major manufacturer is probably better for you than a minor one, because with a minor manufacturer you may end up having to do updates yourself.
Large text and font size. Now what's very interesting about large text and font size is they appear at different places in the settings menu on the Android, but they're the same feature. Let me explain.
Under accessibility, you will see an area called large text. When you click on it, you get three different options: small, medium, and large. When you go to the display area, you will get something called font size, which gives you four different options. They're the same sizes, so huge in one is large in the other; the names just aren't quite in alignment.
However, the concept is the same: when you activate large text or change the font size, your system's fonts, all of the stuff in the header, up in the title bar, all of the menus, all of the settings, will increase or decrease in size depending on the setting you have.
Large text and font size can also be used in conjunction with something called magnification gestures. This is basically a zoom feature that's built into the operating system and can be used with any application on the Android platform.
In the accessibility menu, go in, turn on magnification gestures, and this will allow you to then further zoom into the screen.
Let me explain the parameters around magnification gestures. First of all, you've got to be able to go in and turn it on. Once it's turned on, you need to activate it. Let's say you turn on the settings and now you're in a program or you're on the desktop, and you want to zoom in an area of a screen. You need to activate magnification gestures, and you do that by triple-tapping anywhere on screen.
Wherever you triple tap is where the system will zoom in. It enlarges to a certain percentage, and then if you want to further zoom in, you can use the pinch gesture. If you put your fingers together and move outwards on screen while touching the screen, it will zoom in. If you have your fingers apart and move inwards while touching the screen, it will zoom out.
If you need to move around screen because now you've enlarged your screen, you use two or three fingers and you touch the screen and you can pan around the screen by moving your fingers around. That's how you use magnification gestures.
Again, you can use this in conjunction with large text for those who have a visual impairment and need something larger than, say, 50-point type. It also helps those who just need to look at some portion of the screen closely; these are really high-res screens with a lot of information on them. I don't necessarily have a visual impairment, but I use magnification when I need to look at something closely. I leave it turned on.
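To make the mechanics concrete, here is a conceptual sketch in Python of triple-tap activation and pinch-based zoom. This is not Android's implementation, and the half-second tap window is an assumed illustrative value:

```python
# Conceptual sketch of magnification gestures, not Android source code.
# The 0.5-second tap window is an assumed, illustrative value.

TRIPLE_TAP_WINDOW = 0.5  # max seconds allowed between consecutive taps

def is_triple_tap(tap_times):
    """True if the last three taps each landed within the window."""
    if len(tap_times) < 3:
        return False
    t1, t2, t3 = tap_times[-3:]
    return (t2 - t1) <= TRIPLE_TAP_WINDOW and (t3 - t2) <= TRIPLE_TAP_WINDOW

def pinch_zoom(start_distance, current_distance, zoom):
    """Fingers spreading apart (distance grows) zooms in;
    fingers moving together (distance shrinks) zooms out."""
    return zoom * (current_distance / start_distance)
```

The idea is simply that triple-tapping turns the zoom on, and the ratio of finger distances scales it: doubling the distance between your fingers doubles the zoom level.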
Speak password. This is a very unique feature. Android, like iOS, has spent a lot of time working with those who are visually impaired, so that if you have a visual impairment you can set up the device completely without being able to see it.
Also, all aspects of security and usage really have incorporated the idea of vision impairment. This is one of those features that allows you to still maintain security on your device, but allows you to get feedback when you're adding your password.
Let's talk about setting it up first. It's under the accessibility tab. You have speak password, and you can turn it on or off. Now, there are many different locking mechanisms on the Android device. This only applies when you have a password. If you use a slider or don't use a password at all, even if this is turned on it will not work, because you don't have a password.
If you have a password set, and you have speak password turned on, then when you go to unlock your device, it will speak each letter as you type your password.
This can be good if you have a visual impairment. This can also be a security risk, and you've got to think about where you're doing this. You wouldn't want your password being spelled out in public. That's the security risk side of it. You get the auditory feedback of the password, but potentially other people can hear it.
Voice search. Voice search is a feature of the Google platform in general; it's a Google search done by voice. There are two parts to it. The first is the standard Google search: you tap on the microphone icon, which actually appears in a number of places in the operating system. It's on the default home screen every time you do a search, and it's on most of the keyboards, which we'll get to later in this presentation.
There are a number of different ways you can activate voice search. You tap on the microphone button, and the device starts listening. It takes the audio and sends it to Google's servers if you're connected to the Internet. Or you can download an offline version onto your Android device that processes on the device itself, so you don't need to be connected to WiFi or some other network to use this.
It interprets your text, and then allows you to not only search the Internet, but it also allows you to issue commands to the device itself. In this particular screenshot, I asked for it to open the Android default settings.
What you're seeing here is the settings, and then a small toolbar. It gives you some time and counts down. If you do nothing, it will actually launch the settings. It gives you that time, though, in case you want to do something else, so you can reissue another search. It doesn't automatically shoot you into settings in case you didn't want to go there, or it misinterpreted what you were saying.
There are some built-in fail-safes to reduce user frustration and to increase the accuracy of the interpretation engine. I like voice search; it's very well-developed software. One of the problems with voice search in general, when it comes to people with impairments, particularly a language impairment or a speech impediment, is that voice search isn't very accurate.
Even for me, and I don't think I have a speech impediment, when I use other technologies' voice searches, they're horrible; they just hate me. They never get what I want. I don't know if I have a slur or what's going on, but I never get the results I want.
With Google's voice search, on the other hand, it's very, very accurate. It seems to always get what I want, or if it doesn't, it gives me a bunch of alternatives that I might not have thought of.
It's not magic, there are times when it gets it wrong. But because it can go up to a big server and it's constantly getting updated, the voice technology's getting better and better and better.
Now, there's some underlying technology here. You can also give Google permission to start compiling information about your voice, so that it gets better with your voice in particular. That is good not only for people with speech impediments, for whom it gets more accurate, but also for people with strong accents or those using a second language. Potentially you get better search results out of this.
Again, not perfect, but it's getting closer to the dream of just talking to the device and having it do the magic for you.
Gesture search. If you think about it, if you have a visual impairment, particularly if you are completely blind, how do you see a keyboard? Especially an onscreen keyboard, how do you use an onscreen keyboard? There are a number of strategies for that.
One strategy is the gesture search application. You have to download it; it's not part of the base operating system. I put it on my desktop, and you can tap the icon. It gives you a blank canvas, you start handwriting what you want, and it interprets your handwriting. This is handwriting recognition, and it accepts some gestures as well. Then it starts searching. It will search your contacts; mostly it's a local device search.
Volume settings. In the settings application under sound, under the volume tab, you're going to find three different volume adjustments from zero to 100%. The first one is music. This is all your media, your music, your videos, your games. The second one is notification. This is going to be things like email updates, software updates, application notifications. Then alarms.
Alarms are things like calendar entries: an appointment you're about ready to go to, or one that you've missed. Having notifications and alarms as separate volume settings means you can set things up for a user who may have a visual impairment: set the alarms to full blast, and turn the notifications down very low, or off entirely.
For example, for me personally, while I don't have a visual impairment, I get so much email I set my notifications down to zero, but I turn my alarms way up.
Then, when it comes to videos and games, one of the biggest problems with media in general is that one piece of media may be very, very quiet while the next, say two videos side by side, is really, really loud. You need to adjust that within the actual player itself, and all the players will give you the capability to do so.
What there isn't in the system is an overall volume adjustment or a maximum limit on the volume. That matters in the opposite scenario. I'll give you an example out of my own personal life: my son likes to jack up the volume of all the videos so that it's extremely loud.
Now, there's no way to completely limit the sound. It will play as loud as it can play. What we do is we take tape and we mask over the speaker to reduce the volume.
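Android itself offers no system-wide volume ceiling, which is exactly why the tape trick is needed. Conceptually, if such a limiter existed, it would just be a clamp. The sketch below is an assumption about how one would behave, not an existing Android setting:

```python
# Conceptual sketch of a volume ceiling Android does NOT provide:
# whatever volume is requested, the output never exceeds the cap.

def limited_volume(requested, ceiling):
    """Clamp a requested volume (0.0 to 1.0) to a configured ceiling."""
    return min(max(requested, 0.0), ceiling)
```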
TalkBack. TalkBack is, if you were on the iOS platform, what VoiceOver is there. TalkBack is an auditory cursor. It allows those with visual impairments to get auditory feedback as they move the mouse or their finger around the screen.
You go into accessibility settings and turn it on, and there are a number of features within the TalkBack system. One of the other nice things about TalkBack is that when you first get the Android device, it can also be turned on during the actual set-up of the device itself. If you have a visual impairment and get shipped a device, you do not need a sighted person to set it up; you can do all the set-up with the TalkBack feature activated.
Now, there are also a number of features that require the TalkBack feature to be activated as well. They're sub-features of TalkBack, and we'll get into those later.
TalkBack is a very elaborate feature, so there's no way I'm going to be able to do the entire thing justice here. Basically, as I swipe over an icon or drag my finger across the screen, it starts giving me auditory feedback about where my finger is.
It allows me to navigate both within the main screen areas and within an application. Then, once I have found what I'm looking for, I double-tap to activate it. Normally, when you see an application you want to activate, you just tap on it; TalkBack uses a two-tap system instead, because the first tap identifies where you want to be, and the second tap activates it.
What is also nice about the TalkBack feature is that when you turn it on for the very first time, it gives you a tutorial. Oftentimes with accessibility features, particularly elaborate ones, when a user is first introduced to them without any sort of tutorial, they get lost pretty quickly. I love the fact that there's a tutorial you can always run to get started with TalkBack.
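The two-tap interaction, first tap identifies and speaks the item, second tap on the same item activates it, can be sketched as a tiny state machine. This is a conceptual Python model of the behavior the narration describes, not TalkBack's code:

```python
# Conceptual model of TalkBack's tap behavior: the first tap on an item
# speaks its name (sets focus); a second tap on that item activates it.

class TalkBackModel:
    def __init__(self):
        self.focused = None  # the item currently spoken/highlighted

    def tap(self, item):
        if self.focused == item:
            return "activate " + item   # second tap on the same item
        self.focused = item
        return "speak " + item          # first tap: announce the item
```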
BrailleBack. BrailleBack is an accessibility service that is downloaded in addition to the operating system. For example, when I got my Android tablet, I had to go out, get BrailleBack, and download it.
BrailleBack allows a Bluetooth Braille interface, and there's a list of probably a dozen or more Bluetooth interfaces that are compatible with BrailleBack. You go in, you install BrailleBack, you go into the accessibility settings area, and you turn on BrailleBack.
There are a number of customization settings for your device. Here are the key concepts. Once I have paired my Bluetooth Braille display with the tablet, and I have set BrailleBack up, I can now use the Braille interface. The Braille interface now becomes a bidirectional interface for the Android.
First, as I'm reading, for example, a web page, the text underneath my finger, the text that I'm reading through TalkBack, will actually be displayed on the Braille display.
For those who have both vision and hearing impairments, so if you're completely blind and completely deaf, or if you rely upon Braille displays, this is a perfect tool for 'reading' the tablet or the phone itself.
Now, the other great thing about BrailleBack and the Braille system within Android is that if the Braille display has a Braille keyboard, then when you're, let's say, responding to an email, instead of using the onscreen keyboard, it will default to the Braille keyboard. You can use the Braille keyboard to type in your response, hit enter, and go from there.
It's a bidirectional display, so it's an input and output device, with very well thought out features. It's a fairly new feature, but it's fairly mature. The Braille capability of the Android platform is very exciting.
Features for the hearing impaired. Text to speech; let me explain what text to speech is, and then let me explain how you access it within the Android environment.
Text to speech is, I type something on my device, and then I use the device as the voice for my text. For those with hearing impairment, text to speech can be a way of interfacing with people who have hearing.
It can also be-and we'll get to this from the language perspective-very useful for those with reading impairments. I personally have both dyslexia and dysgraphia, and I use text to speech on my tablet and my cell phone when I'm having a hard time reading information. I'll use text to speech as a reinforcer to reinforce what I'm trying to read to make sure I've got it properly.
That is what text to speech does: it takes text and converts it into audio. The way you access it within the Android system is through different text to speech applications. You can download a text to speech application; there are free ones on the system, and you can also pay to get nicer voices.
Basically you either type text in, or for example if you have web accessibility turned on, you can highlight portions of the text on a web page or use TalkBack to take the text that's onscreen and repeat it back to you.
Text to speech is the technology. It appears many, many different times within the operating system, in different forms, some in application form. The reason why I put it under hearing impairment is that if I am a deaf user and I want to speak, I can use an application like Talk to type, and then have the system speak for me.
Or I can turn on TalkBack to have the system talk back to me and read back to me the text that's on screen. That's text to speech.
Closed captioning. As of Android 4.1 or above, the default media player on the Android system can take an MPEG-4 file that has embedded captioning, closed captioning in it, and actually display that captioning on screen.
What you're seeing here is a screenshot of my Android phone playing back one of the videos I did that was captioned. Here's the caveat: not all Android devices use the default Android media player.
Just because you have an Android that's 4.1 or above doesn't mean it will play back every captioned video file. I am absolutely positive it plays MPEG-4, which is the basis of most of what's on the Internet. If it's captioned within MPEG-4, it'll play.
Now here's the good news in that story. If you have a device whose media player does not support captioning, the beauty of the Android platform is that you can go find one that does. You can go to the Google Play store and download all sorts of media players that caption, whether your device is on Android 4.1 or above, or below it. You can download one, watch your videos through that, and it'll caption.
There's closed captioning on the system at the operating system level, and third-party developers have developed captioning players as well.
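Under the hood, a caption-capable player's job is simple: given the timed cues embedded in the file, show whichever cue covers the current playback position. Here is a conceptual Python sketch of that lookup, not the Android media player's actual code:

```python
# Conceptual sketch of caption playback: each cue has a start time,
# an end time, and text; display the cue active at the current position.

def caption_at(cues, position):
    """cues: list of (start_sec, end_sec, text). Returns text or None."""
    for start, end, text in cues:
        if start <= position < end:
            return text
    return None  # no caption on screen at this moment
```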
Headphone jacks. If you have partial deafness, the headphone jack can be used in conjunction with headphones that amplify the volume. Now in reverse: for example, when my children use the headphone jack, I don't want them jacking up the volume on the device so high that when they put the headphones on, it's going to do hearing damage.
I use headphones for kids that have limiters on them, so that no matter how high they jack up the volume, it will never go above a certain point through the headphones, and it won't ever damage their ears no matter what they do.
Regardless of that, the headphone jack allows you to use either amplifying or limiting headphones, and amplifying or limiting speakers. Also, particularly on the portable cell phone devices, the headphone jack is a three-stage jack.
Again, this depends on your manufacturer; different manufacturers implement this differently. But a number of manufacturers implement a headphone jack that supports volume controls on the headphones themselves, or on some kind of control pod, either on the cable or in a wireless system, allowing you to raise or lower the volume through the headphone jack as well.
A lot of them also incorporate the ability to add a microphone. The headphone jack can raise and lower volume in most cases. It can send sound out from the device, and it can also act as a microphone input as well.
It all depends on the manufacturer, but in terms of the operating system, the operating system's capable of doing all of that.
Bluetooth audio. As we've already seen with Braille devices, Bluetooth is a wireless communication system that allows you to connect a variety of different, both input and output devices to the Android environment.
When it comes to audio though, just like the headphone jack, you can connect to the Android environment wirelessly using Bluetooth. If the speaker's compatible with Bluetooth, if the headset's compatible with Bluetooth, then you can use that to either increase the volume or control and manage the volume.
Bluetooth, because it's also bidirectional, will also allow you to use headsets that also have built-in microphones as well. As long as they're supported by the operating system.
Visual alerts. From the very beginning, the Android system has had a notification system built into the core operating system. That means developers didn't have to develop their own way of sending alerts to the screen or to the notification system. They just tapped into the current technology that was already built there.
As for the way the alert system expresses itself or notifies you, there are auditory alerts along with visual alerts, and the visual alerts generally speaking express themselves in the alert system. Onscreen you'll see a screenshot of an Android tablet. I've dropped down the alert system on the left-hand side, and you'll see all of my alerts.
The alerts are anything from emails to whatever I have set up each individual program to alert me about, and the kinds of alerts that I want in that program. It presents all this information in one central area.
Now, what's really nice about the alert system within the Android environment is that you can actually execute commands within the alert. For example, I can reply to an email without first launching the email application: I don't have to see the email, tap on it to launch the email application, read it, and then reply.
I can just see the email in the alert system, and if I want to reply to it immediately I tap on reply and it'll open up the reply window and I just start typing. Each program within the apps area has settings to be able to control how and what gets alerted and when.
Further, in most programs, within their individual settings, you will find even greater choices. You have to look at each program; there are a lot of different options. There aren't a lot of restrictions on this for the developer, so developers can do all sorts of visual alerts. It all just depends on what the developer wants to do.
Multiple orientation and auto-rotation settings. I'm going to use my son as an example. The way his cerebral palsy expresses itself, particularly early on, his hands were basically balled into fists.
Early, early on, when he was using a tablet, he didn't care whether the screen was facing him or not. Now that he understands that the screen has information, and he's starting to read and do all these other things, he wants to make sure the screen is rotated the right way. He uses his hands; I often see him flipping the screen around.
Now one of the things that you can do is, you can, within the operating system, turn the auto rotation system on or off. If you've got a user who gets particularly frustrated with the screen rotating all over, particularly if they have it in their lap, you can show them how to turn on and off auto rotation or just turn auto rotation off completely. That allows you to fix the screen in one direction. It won't keep rotating, either portrait or landscape.
The reason why the multiple orientations are so important, particularly when it comes to fine motor and physical impairment, is let's say you mount the device to a wheelchair. That's generally a fixed mount and isn't going to allow you to rotate it portrait or landscape.
Now, there are wheelchair mounts that will allow you to do that, but sometimes the users don't have that physical capability of doing that. You're going to want to fix the screen in one orientation so that it doesn't keep rotating on the user or that it's more usable for the user.
Now, when I first saw touch and hold delay, I was ecstatic. One of the problems I have seen with my son, particularly because he . . . well, now he's got great fine motor control, but as I watched him develop it, he was not always able to touch an icon long enough, or, what often happened, he touched the icon too long.
What I think is absolutely brilliant about the touch and hold delay is that there are three settings in terms of sensitivity. What this accessibility setting does is tell the operating system how long a touch must last before it is considered a touch and hold.
When you think about gestures and you think about interfacing with a touch device, if you touch something, if you tap on something, that's one kind of action. If you tap and hold on something, that's another kind of action.
If you can increase the tap and hold delay, that is very beneficial for some users who don't have the fine motor to be able to do a quick tap. By increasing that delay, they can do a tap that is more suitable for their physical capability, and still does not count as a tap and hold. The longest one is actually fairly long.
Again, every user is completely different and you're going to have to test this out, but I love the fact that this feature is built into the operating system, so that on every button you have the longer delay available and the user doesn't get false positives or false results.
This also plays into when users touch the screen inadvertently with, let's say the side of a hand. One of the problems that my son often has is he accidentally touches the screen with the side of his hand and it activates something.
While this doesn't take that away, it does prevent it from deleting applications and so forth and so on.
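The behavior described above boils down to a single threshold: how long a finger stays down before the touch stops counting as a tap. Here is a conceptual Python sketch; the three durations are illustrative assumptions, not Android's real values:

```python
# Conceptual sketch of the touch and hold delay setting. The durations
# for the three sensitivity levels are assumed, illustrative values.

DELAY_SECONDS = {"short": 0.5, "medium": 1.0, "long": 1.5}

def classify_touch(hold_duration, setting):
    """A touch shorter than the delay is a plain tap; anything longer
    counts as a touch and hold (long press)."""
    return "hold" if hold_duration >= DELAY_SECONDS[setting] else "tap"
```

Under these assumed values, a user who needs 0.7 seconds to complete a tap would trigger a long press on the short setting, but would still register a plain tap on the long setting.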
Multiple keyboards. As you can see here, there are three different kinds of keyboards that you can make the default keyboard. In fact, when I set up my tablet, it came with three or four different keyboards.
Let me give you this concept first. That is, you can install multiple keyboards on the Android environment and make them the default keyboard. Within the keyboard, you have a number of different settings depending on what the keyboard does.
Onscreen, I've got three different kinds of keyboards. In the lower right-hand corner is the default Android 4.3 keyboard. This is a Google-made keyboard, and you can also see that I've swiped; in my swiping pattern, I have typed the word 'duck.'
In the middle at the top, the 'hello world,' you can see a different type of keyboard. This is a handwriting-to-text keyboard that allows me to draw the letters that I want. The third keyboard, on the right-hand side, is a swipe and gesture keyboard.
Notice also that there's a microphone on there. There's a microphone not only on the swipe keyboard, but there's also a microphone on the Google keyboard. It allows me to then launch the speech to text function of the operating system and do a voice search as an example, or voice typing.
If the default keyboard doesn't work for your user, there are a lot of different third party keyboards that you can download and try, and you can find keyboards that work for them specifically. Some of these keyboards are even programmable to do custom gestures just for your user to make interfacing and input perfectly set up for them.
Switch capable. The Android platform does not have a standard operating system level switch interface. Mainly because there isn't a lot of open source and standardization around switches. Switches are often customized for the switch user.
On screen you've got three things: a switch, a switch interface by Komodo OpenLab, and a switch interface by an Irish company called ClickToPhone.
There are a couple of different switch interfaces out there. Here's what you need to know about switching: because it isn't handled at the operating system level, three things must be true to make switching work for your user.
First of all, the physical switch has to be compatible with the switch interface. That's step number one. Most of these are low-voltage DC switches, which is fairly standardized, and a lot of switch interfaces accept a lot of different kinds of physical switches. This is really good if you mount one on a wheelchair, or you're trying to tie your switching into an existing system.
The switch interface then has to be compatible with the operating system. A switch interface that's only designed for iOS is not compatible with Android. You have to find a switch interface that's compatible with Android. That's the second thing that has to be true.
The third thing that has to be true is that the software you're running on the Android device also has to be switch enabled. Now, this is a little more gray than in, let's say, the iOS environment, where it's absolutely true that the application has to understand the switch interface.
In the Android environment, this is a little bit more gray. Generally speaking, you want to make sure that the software is compatible with that switch interface. Absolutely. However, manufacturers, because it's an open source environment, have a greater latitude to do operating system level software.
There are a number of switch interfaces that do operating system level stuff as well as application level software. Depending on the interface, you can also control the operating system and launch different programs, or go from one program to another, or different features that are outside of a single program through the switch interface.
What I can tell you is this: make sure all three of those things are true. The application that you want to use is compatible with the interface, the interface is compatible with the switch, and, obviously, the interface is compatible with the operating system. Make sure all those things are true, and test.
Like switching in the physical world, switching electronically is very customized to your particular user, and so test, test, test.
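The three-way compatibility check described above can be sketched as a tiny piece of logic. This is a toy illustration only; none of these names correspond to a real Android API, and the switch, interface, and app described here are hypothetical.

```python
# Toy model of the three conditions that must ALL be true for switch
# access to work. Every name here is hypothetical; this illustrates the
# checklist, not a real Android API.

def switching_works(switch, interface, operating_system, app):
    """Return True only when every link in the chain is compatible."""
    checks = [
        # 1. The physical switch must be compatible with the interface.
        switch in interface["supported_switches"],
        # 2. The interface must be compatible with the operating system.
        operating_system in interface["supported_os"],
        # 3. The software you run must understand the switch interface.
        interface["name"] in app["supported_interfaces"],
    ]
    return all(checks)

# Example: a generic low-voltage DC switch, an Android-capable
# interface, and an app that understands that interface.
switch = "low_voltage_dc"
interface = {
    "name": "example_interface",
    "supported_switches": {"low_voltage_dc"},
    "supported_os": {"android"},
}
app = {"supported_interfaces": {"example_interface"}}

print(switching_works(switch, interface, "android", app))  # True
print(switching_works(switch, interface, "ios", app))      # False
```

The point of the sketch is simply that the conditions are joined by an AND: if any one link in the chain is incompatible, switching fails, which is why you test all three.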
Pointer speed. This is a feature within the settings area that allows you to control the speed at which a pointer moves across the screen. The reason this is significant is that, within the Android environment, you can plug in a mouse or a joystick or some other interface device and use it as a pointer within the system, because the Android system does have a mouse pointer.
What's significant about that is that for those with fine motor and physical impairments, using an adapted joystick or an adapted trackball may be much, much easier than using their finger across the screen.
Right there, that's a huge benefit. It also paves the way for those with no physical capability in terms of movement to use eye-gaze or eye-tracking technology with the Android device, because the pointer allows them to know where they are looking on screen.
Let's talk about pointer speed. The specific feature is this. It allows you to slow down or speed up the rate at which the pointer moves across screen. If you have a very sensitive pointer, like a joystick or a mouse, you may want to slow the speed down so that the user doesn't shoot that mouse off screen too quickly and become frustrated.
Or likewise, if you have a very insensitive input device, you may want to speed up the pointer to make it more efficient for the user to move the mouse across screen.
This is really about fine-tuning the user experience. It allows you to attach a broad range of equipment to the tablet or the phone to make the device more usable for that user.
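One way to picture what the pointer-speed setting is doing is as a multiplier applied to the raw movement coming from the input device. The sketch below is a simplification with made-up numbers; Android's actual pointer acceleration is more involved.

```python
def on_screen_movement(raw_movement, pointer_speed):
    """Scale raw device movement by a pointer-speed multiplier.

    raw_movement:  units of movement reported by the device
    pointer_speed: multiplier; below 1.0 slows a sensitive device,
                   above 1.0 speeds up an insensitive one.
    """
    return raw_movement * pointer_speed

# A very sensitive joystick: slow it down so a small nudge doesn't
# shoot the pointer off the screen.
print(on_screen_movement(100, 0.5))  # 50.0

# A very insensitive input device: speed it up so the user doesn't
# have to work as hard to cross the screen.
print(on_screen_movement(10, 3.0))   # 30.0
```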
This is one you should be pretty familiar with if you have a smartphone, and that is your ability to add your own words to a personal dictionary. This is really important if you have keyboards or applications with spellchecking in them and you're using, let's say, jargon or technical terminology or SMS shorthand. I don't understand any of that.
Whatever it is, you can add your own custom words to the dictionary so that the spellchecker knows that they're right, and/or potentially can suggest them if you have other features turned on which I'll get to later.
This is a personal dictionary. The way that it works within Android is that within the settings area, you tap and add your words to it. Applications can also add words to your dictionary.
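Conceptually, the personal dictionary is just an extra word list that the spellchecker consults before flagging anything. Here is a toy sketch of that idea; the function names and word lists are invented for illustration and are not the Android API.

```python
# A tiny stand-in for the keyboard's built-in dictionary.
BUILT_IN_WORDS = {"can", "you", "guess", "next"}

# The user's personal dictionary starts out empty.
personal_dictionary = set()

def add_word(word):
    """Add a custom word, the way you would in the settings area."""
    personal_dictionary.add(word.lower())

def is_correct(word):
    """A word passes spellcheck if either word list knows it."""
    w = word.lower()
    return w in BUILT_IN_WORDS or w in personal_dictionary

print(is_correct("bridgingapps"))  # False: flagged as misspelled
add_word("bridgingapps")           # add the jargon term once
print(is_correct("bridgingapps"))  # True: no longer flagged
```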
Keyboard settings. Under language and input, you're going to find the keyboard and input methods. These are basically the drivers for your keyboards. As we've discussed earlier, if you have multiple keyboards installed on your Android device, this allows you to customize the settings for each of those keyboards.
These can be everything from a gesture keyboard to what you typically think of as an onscreen keyboard. Within each keyboard you then have a number of settings on top of that. The message here is: look at your keyboard, test your keyboard.
I was shocked at the sheer number of settings that are in each keyboard, just the default ones that are on Android, and when you look at special needs keyboards that are designed for this environment, they have even more features. This is how you access the keyboard settings.
Now auto-correction and show corrections, these are two features that are built into the default Android keyboard. I will say this: not all keyboards have them. For example, a gesture keyboard would not have auto-correction or show corrections.
Because this is the default keyboard, and even if you're interfacing a wireless keyboard with the device, most people will use the default keyboard, I wanted to highlight some of these default features.
Auto-correction you can turn on and off, and show corrections is the same turn-on-and-off concept. You can see here I've typed in what's supposed to be 'can you,' and if I hit the space bar, it will actually change the misspelled you, 'I-U,' to the correctly spelled 'Y-O-U.'
You can also see that it's showing me, by underlining it, that it's not correct. It's making some suggestions as well: 'you,' the word I meant to type, and then 'guy' as the other suggestion.
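Under the hood, auto-correction like this generally works by finding the dictionary words closest to what you typed. Here is a minimal sketch of that idea using Python's standard-library fuzzy matcher; the word list is invented, and Android's real correction engine is far more sophisticated.

```python
from difflib import get_close_matches

# A tiny stand-in dictionary; a real keyboard ships a large one.
DICTIONARY = ["you", "guy", "your", "can", "duck"]

def autocorrect(typed):
    """Suggest the dictionary words closest to the typed string."""
    return get_close_matches(typed.lower(), DICTIONARY, n=2, cutoff=0.4)

# The misspelled 'iu' from the example pulls up close matches,
# with 'you' among the suggestions.
print(autocorrect("iu"))
```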
Within the default keyboard there's also the concept of gesture typing. Sometimes this is called swipe keyboards. This allows you, for a user who let's say doesn't have enough fine motor control for hunting and pecking, this allows you to swipe across the letters that you want, and it will use its predictive text engine to come up with the right language.
Right here you can see I've actually swiped the word 'duck.' What's interesting about this is I started at the letter D, went to Y, which is a miss, then to C, and then ended almost on K but didn't quite get there. D-Y-C-J is what I actually traced with the swipe action, but using the auto-correction and the dictionary technology, it provided the actual word I was looking for, which was 'duck.'
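A crude way to see how a sloppy trace like D-Y-C-J can still resolve to 'duck' is to allow each traced key to stand in for itself or a neighboring key on the keyboard. This is a heavily simplified toy (real gesture typing matches a continuous path, not one key per letter, and the adjacency table below covers only the keys in this example).

```python
# Minimal QWERTY neighbor table, only for the keys in this example.
ADJACENT = {
    "d": {"s", "f", "e", "r", "x", "c"},
    "y": {"t", "u", "g", "h"},
    "c": {"x", "v", "d", "f"},
    "j": {"h", "k", "u", "i", "n", "m"},
}

def matches(trace, word):
    """True if each traced key is the intended letter or its neighbor."""
    if len(trace) != len(word):
        return False
    return all(t == w or w in ADJACENT.get(t, set())
               for t, w in zip(trace, word))

def best_guess(trace, dictionary):
    """Return the first dictionary word the sloppy trace could mean."""
    return next((w for w in dictionary if matches(trace, w)), None)

# D-Y-C-J: Y stands in for its neighbor U, J for its neighbor K.
print(best_guess("dycj", ["dock", "duck", "deck"]))  # duck
```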
The next feature, again within the default Google keyboard, is next-word suggestion. You can turn this on and off. You can see here I've typed, 'can you guess what I'm going to type N-E-X,' and it suggested the word 'next.' I can then just tap on the word 'next,' or, because the word 'next' has those three lines underneath it, if I hit the spacebar it will automatically put that text in for me.
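Next-word suggestion is typically driven by a language model that has learned which words tend to follow which. The sketch below shows the simplest version of that idea, counting word pairs; the training text is invented, and real keyboards use far larger models.

```python
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count which word follows which in some sample text."""
    words = text.lower().split()
    model = defaultdict(Counter)
    for current, following in zip(words, words[1:]):
        model[current][following] += 1
    return model

def suggest_next(model, word):
    """Suggest the most common follower of `word`, if any."""
    followers = model.get(word.lower())
    if not followers:
        return None
    return followers.most_common(1)[0][0]

# Tiny invented corpus just to show the mechanics.
model = train_bigrams(
    "can you guess what i am going to type next "
    "can you guess the next word"
)
print(suggest_next(model, "you"))    # guess
print(suggest_next(model, "going"))  # to
```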
Extra features on the Android platform. The Eyes-Free project. This is a Google project that is really concerned with visual impairment issues. They have 12 different apps, everything from text to speech to a talking dialer. This is the subproject that built BrailleBack.
This is a project of Google, it's an open source project so anyone can be involved in it and download the source code and modify the source code. When you're in the Google store, the Play store, keep an eye on Eyes-Free. They are constantly developing software and new software for visual impairments, and all their software's free.
The next one is Code Factory. Now their software is not all free, and they have a lot of applications similar to the Eyes-Free project's, but they're very, very active and have very mature software. They do have a phone dialer as well, and they do an accessible music player.
What I think is also really great is they do an accessible web browser. Some of these are free, some of these you have to pay to download. Some of them you can download, test them out, and then if you like it, pay for it.
The advantage of paid software is generally speaking you know that it's going to be developed, it's going to be built upon, and it's going to keep getting better and better and better. Whereas some free software is developed for a little while, and then they stop developing it and you're sort of stuck.
There are some advantages to each. I use both free and paid software. I love them both; it's just something you need to manage. Code Factory is one you want to keep an eye out for as well. They have a number of applications.
The IDEAL Group. This is a group that really works with the device manufacturers to build accessibility features into the operating system for that manufacturer's device. I'll give you an example. Let's say Samsung wants to build in TalkBack to their particular tablets. They'll go to the IDEAL Group, and the IDEAL Group will help them implement TalkBack on their particular hardware.
They also have a number of different applications that they've developed independent of the operating system. They have one particular app, their accessibility app: you download that one app, and then it shows you all of the different accessibility features you can download from there.
A couple of them to make note of: they have a great little magnification program for those with visual impairments, they have talking tags, so you can talk to your device, and they also have a music player. Most of their software is free to end users. Obviously, if you're a manufacturer, you're paying for them to help integrate their accessibility technology into your devices, but for end users it's free. Keep an eye on them.
Last but not least, Google themselves. Google has a tremendous number of apps that they are constantly putting out. Oftentimes Google is so large that you just don't know that they put out these really great apps. In the app store, every once in a while, just go search for Google as the developer.
I came across an app, this is something called Intersection Explorer. This allows people with a visual impairment to get audio feedback on where they are at any given moment, so that they don't get lost. I thought that was a phenomenal little app.
I have not talked to users who have used this, but I think it's a phenomenal concept. It relies on Google Maps. Google Maps are incredibly accurate. It relies on TalkBack to be able to give you that feedback. What a great little tool to not get lost, even in small environments.
Look up Google; search for accessibility apps with Google as the developer. Also, as I said, Google is in general developing accessibility apps independent of the operating system, and they're installed as a service on top of the operating system. This bypasses having to integrate with each manufacturer, so that anyone who needs an app can get direct access to it, and when it's updated, it's updated immediately.
It just gives them more direct access to the end users, and generally speaking that's better for the end user in terms of getting the right accessibility technology fast. It's different from the iOS platform, where they do everything in major releases. Google does accessibility features as in-line releases: they just keep releasing new editions, and they do it as separate software.
Resources and closing. In terms of resources, I'm going to point you to two places. If you want to search for apps, especially special needs apps or apps that can be used with special needs users ranging in age from day one, all the way to adulthood, I would go to bridgingapps.org.
We have a very extensive Android and iOS database. We review all of the apps using speech-language pathologists, occupational therapists, or special education teachers, depending on the app.
We also always review each app with a special needs user. We give you a narrative, we tell you what skills you can work on, and we give you a range from zero to 18 against cognitive, language, auditory, visual, and fine motor impairments. Between all of those sliders, the 180 skills that we track, and the narrative, you're very likely to find apps that you know can be used with your user. Then you can further test with your user.
The other resource I'll give you is this: because it's open source, there are a lot of people developing accessibility features in the Android environment, and it's not quite as consolidated as, let's say, the iOS environment. So I'm going to give you the official accessibility guide for the Android environment. It's written for developers, but a lot of the information can be used by end users as well.
I want to thank you for taking the time to spend with me, and to learn the Android features. This is just the beginning. There's a long way to go from developing your knowledge to actually using it with end users and being proficient. I want to thank you for your time and your patience. Thank you very much.
If you have any questions or have any feedback on any of these topics, I absolutely want to hear your feedback, and I want to answer whatever questions you have.
If you're taking this through Udemy, use the Udemy messaging system to message me. Or you can contact me on Twitter, on my website, on any of the YouTube pages, or via Google or LinkedIn, however you want to reach me. I would love to answer any questions you have.
Or if you have feedback, I would love to hear what you have to say. I want to make this course better.
Good reviews, and lots of reviews, help this course spread and get the information to as many people as possible. If you like the course, please review it, and if you leave some comments, that would be great too. I absolutely want to make this course better, and I am very open to whatever feedback or questions you have. Thank you.
Sami Rahman is the CEO of SmartEdTech. SmartEdTech develops software that helps children with disabilities learn and grow. Mr. Rahman holds certifications in the Assistive Technology Applications Program offered by California State University and in Mobile Devices for Children with Disabilities from TCEA. Mr. Rahman is the author of Getting Started: iPads for Special Needs. The book is available in print, with the full version online for free.