If you are walking or cycling and don’t want to (or are unable to) spend most of your time focusing on a screen, using a mobile device tends to be a frustrating experience. The same is true in bright sunlight, or if your eyesight simply isn’t good enough to make out every detail on the mobile screen. The HaptiMap project (see http://www.haptimap.org) aims to make maps and location-based services more accessible by engaging several senses, such as touch, hearing and vision.
The HaptiMap HCI-Module Demo shows developers what the Human Computer Interaction (HCI) modules can be used for, i.e. it illustrates in which types of application scenario they might be beneficial. The HaptiMap HCI-Module Demo app as presented here also shows how the HCI modules can be implemented using the HaptiMap toolkit, and thus provides a practical example for developers. In addition, it conveys application developers’ requirements to HaptiMap toolkit developers with respect to the expected ease of integrating toolkit components.
The HCI modules illustrated by the HaptiMap app include a Geiger counter sound module, a tactile compass, and a hardware interface for external devices such as the ViFlex.
- Defining destinations by GPS coordinates
- Navigation by Geiger counter sound, tactile compass, or ViFlex (external hardware required)
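To give a feel for the Geiger counter navigation idea, the following sketch (not the actual HaptiMap toolkit API; all names and coordinates are illustrative) maps the angular deviation between the user's current heading and the bearing to the destination onto a click rate: the closer the user faces the destination, the faster the clicks.

```python
import math

def bearing_to(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from (lat1, lon1) to (lat2, lon2), in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def geiger_click_rate(heading, bearing, min_hz=1.0, max_hz=10.0):
    """Map the heading deviation (0..180 degrees) to a click frequency:
    facing the destination yields max_hz, facing directly away yields min_hz."""
    deviation = abs((bearing - heading + 180.0) % 360.0 - 180.0)
    return max_hz - (max_hz - min_hz) * deviation / 180.0

# Illustrative example: compute the bearing between two nearby GPS fixes
# and derive a click rate for a user currently heading due east (90 degrees).
b = bearing_to(55.7068, 13.1870, 55.7047, 13.1932)
rate = geiger_click_rate(heading=90.0, bearing=b)
```

In an actual app, the GPS fix and compass heading would come from the device sensors, and the click rate would drive an audio loop, so the user can keep their eyes off the screen and simply steer toward the fastest clicking.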