I took the Enabling Technology class and worked in a group on the aMaze project, which provided a maze game for children with visual impairments. The game simulates a 2D maze that the child can walk through using a keyboard, a joystick, or a Sensable haptic-feedback device. Footsteps, simulated cane taps, and environmental sound sources are modelled in 3D, allowing the child to navigate by sound. There is also a visual display of the maze and the current position, which can be used for debugging or disabled when a child with partial vision wishes to try navigating by audio alone.
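The core idea behind navigating by sound is turning each source's position relative to the player into per-ear loudness. As a rough illustration (this is a hypothetical sketch, not the actual aMaze audio code; the function name and the inverse-distance/constant-power choices are my own assumptions), here is how a 2D sound source might be mapped to stereo gains:

```python
import math

def spatialize(listener_xy, heading_rad, source_xy, ref_dist=1.0):
    """Return (left_gain, right_gain) for a sound source.

    Hypothetical helper, not the aMaze implementation: uses simple
    inverse-distance attenuation plus constant-power stereo panning.
    heading_rad = 0 means the listener faces the +y direction.
    """
    dx = source_xy[0] - listener_xy[0]
    dy = source_xy[1] - listener_xy[1]
    dist = math.hypot(dx, dy)
    # Quieter with distance, clamped so nearby sources don't blow up.
    gain = ref_dist / max(dist, ref_dist)
    # Bearing of the source relative to where the listener is facing:
    # 0 = straight ahead, +pi/2 = hard right, -pi/2 = hard left.
    bearing = math.atan2(dx, dy) - heading_rad
    # Map bearing to a pan value in [-1, 1] and apply a
    # constant-power pan law so overall loudness stays even.
    pan = max(-1.0, min(1.0, math.sin(bearing)))
    theta = (pan + 1.0) * math.pi / 4.0  # 0..pi/2
    return gain * math.cos(theta), gain * math.sin(theta)
```

A source straight ahead yields equal left/right gains, while one directly to the listener's right is heard almost entirely in the right ear, which is what lets a child localize the elephant around the next bend.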
The EVE group at UNC has a collection of foam cubes that they use to construct interesting environments, and we thought of asking a few children to come in and walk both a real maze and the virtual maze. Some children would do the virtual maze first and others the real maze first, so we could find out whether they felt more prepared for the real maze after doing the virtual one.
The experience ended up snowballing (with the help of many people) into a Maze Day, with 51 visually impaired students and 55 adults visiting the UNC Computer Science department to see all of the class projects and previous research, not just aMaze. Maze Day has since become an annual event and will be held again towards the end of April.
Pictures from preliminary testing
Our aMaze team had a station with three computers: one with a keyboard, one with a joystick, and one with the Sensable Phantom. We also constructed a real maze in a room, complete with wolf and elephant sound sources played back at the bends, matching one of the virtual mazes being shown. The general consensus among children who had navigated the virtual maze on the computer beforehand was that the real maze was much easier to navigate. Maze Day was a lot of fun, and we were proud to be a part of it.