IEP Process: ECC Games for Visually Impaired Students
ObjectiveEd.com is our new organization, where we are building ECC interactive simulations for blind and low-vision students, based on each student's IEP.
The student's progress in acquiring skills in these curriculum-based games will be maintained in a private, secure cloud, available to the school IEP team in a web-based console.
If you are a TVI, click for additional details on how these types of games can help maximize student outcomes as part of the IEP process.
Controlling the game
In this week’s class, I decided to dive into what happens when the game player first starts the app. We talked about Angry Birds as the model for a good game, and tried to do what they did.
Angry Birds starts with a short video, so we decided to start with a short audio tutorial. We would tackle what the tutorial would say in a future class.
One child asked if she could control the game by talking to the iPad, the same way she talks to Siri. That won’t work. Not only does Apple prohibit app developers from accessing Siri’s features, but Siri (and other speech recognition systems) often doesn’t understand everything you say. Trying to use Siri to play a game would be very frustrating.
First we would need a way for someone to pause the game by making some gesture on the iPad – tap the screen, swipe the screen – and be told how to operate the game. We made a list of all possible gestures that the iPad allows, and talked about all of the actions a game player would want to do:
Each student would come up to the whiteboard, explain the gesture and draw out exactly what the game player would see. By repeating this for every gesture, the students began to understand the limitations of gestures, which gestures were easy and hard, and what the game playing activity would feel like. We rejected a lot of gestures that were ridiculous (such as tap 5 times and swipe to restart the game).
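The whiteboard exercise above amounts to building a lookup table from gestures to actions. Here is a minimal sketch of that idea in Python (the real app would be written in Swift for the iPad); the specific gesture and action names are my own illustrative assumptions, not the class's final design:

```python
# Hypothetical gesture-to-action table, like the one built on the whiteboard.
# Gesture and action names here are illustrative placeholders.
GESTURE_ACTIONS = {
    "one_finger_tap": "pause",
    "one_finger_swipe_left": "previous_menu_option",
    "one_finger_swipe_right": "next_menu_option",
    "two_finger_tap": "repeat_instructions",
    "three_finger_tap": "cancel",
}

def handle_gesture(gesture):
    """Look up the action for a gesture; unrecognized gestures are ignored."""
    return GESTURE_ACTIONS.get(gesture, "ignored")
```

Keeping the mapping in one table makes the students' trade-off concrete: every action the player might want has to claim one of a small, fixed set of gestures, which is exactly why the class had to reject the awkward ones.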
Some gestures would cause really big things to happen – such as restarting the game – and in the visual world, the game would confirm that’s the action you really meant; letting you CANCEL if you accidentally tapped RESTART. To emulate this on a black screen, some of the gestures would ask if that’s what you really wanted to do. We studied two alternatives:
- repeating that gesture for YES, and some other action for CANCEL
- one action to always mean “confirm yes” (such as tapping the screen once with one finger) and one action to always mean “cancel” (such as tapping with 3 fingers once)
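The second alternative can be sketched as a small state machine: a destructive action is held pending until the universal "confirm" or "cancel" gesture arrives. This is a minimal sketch in Python (again, the real implementation would be in Swift), with the action names and spoken prompts invented for illustration:

```python
# Actions that need confirmation before they run (illustrative set).
DESTRUCTIVE_ACTIONS = {"restart", "quit"}

class ConfirmationFlow:
    """Holds a destructive action pending until confirmed or cancelled."""

    def __init__(self):
        self.pending = None

    def request(self, action):
        """Either perform the action, or speak a confirmation prompt."""
        if action in DESTRUCTIVE_ACTIONS:
            self.pending = action
            return (f"Do you really want to {action}? "
                    "Tap once for yes, tap with three fingers to cancel.")
        return self.perform(action)

    def confirm(self):
        """Universal 'yes' gesture: run the pending action, if any."""
        if self.pending is None:
            return None
        action, self.pending = self.pending, None
        return self.perform(action)

    def cancel(self):
        """Universal 'cancel' gesture: drop the pending action."""
        self.pending = None
        return "Cancelled."

    def perform(self, action):
        return f"Performing {action}."
```

The appeal of this alternative is consistency: the player memorizes two gestures once, instead of learning a different "yes" gesture for each destructive action.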
We created the concept of an audio menu, analogous to a screen menu, where the options would be read to you (restart game, change settings, start tutorial) along with the gestures that would invoke that menu option.
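An audio menu is essentially an ordered list where each option pairs the spoken label with the gesture that invokes it. A minimal sketch in Python (the gesture descriptions are assumptions; in the real app a text-to-speech voice would read each line):

```python
# Sketch of an audio menu: each option pairs a spoken label with the
# gesture that invokes it. Gesture choices here are illustrative.
AUDIO_MENU = [
    ("Restart game", "swipe left with two fingers"),
    ("Change settings", "swipe right with two fingers"),
    ("Start tutorial", "double-tap with one finger"),
]

def read_menu(menu):
    """Return the lines a synthesized voice would speak, in menu order."""
    return [f"{label}: {gesture}" for label, gesture in menu]
```

Because the menu is an ordered list rather than a screen layout, a blind player hears the options in a predictable sequence, just as a sighted player scans a visual menu top to bottom.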
They tried out these gestures on their iPhones, pretending the game was running, and after some “real-world” testing and revising, we had a pretty good set of gestures to control the game.
One of the children asked an obvious question: “What if the speaker is off?” We would solve that by showing the words “Use your headphones to play this game” on the screen, and saying them aloud when the game started. If the game player is blind, they would always have their speakers on. If the game player is sighted, they would read those words and plug in their headphones.