Maps and location-based services are used in a wide range of contexts. Whatever people set out to do, they act in the situation at hand, relying on their abilities and experience to handle it in the here and now. Mobile use contexts include situations where it is difficult to look at or attend to the screen, noisy environments such as busy streets or railway stations, and environments with external vibrations such as public transport. To work in these kinds of environments an application cannot rely solely on screen-based information – the visual design needs to be complemented by interaction using sounds, gestures and touch.

The HaptiMap project has performed basic research on how to design multimodal map and navigation interfaces, and the results of this research have been tested in a set of applications developed within the project. To give other developers access to these results, we have encapsulated working designs as modules in the HaptiMap toolkit – a software package intended to make it easier for developers to include multimodal interaction in their applications, or to add more multimodality to existing products.

The toolkit is adaptable and helps developers build accessibility functionality into cross-platform, multimodal mobile applications that can take advantage of a range of sensors, display and audio characteristics, or externally attached devices. It is compatible with the major mobile platforms: Android, iOS, Windows and Symbian. The toolkit offers several accessible ways of presenting geographic data, together with an infrastructure for acquiring such data from multiple sources.

Over 23 HCI (human-computer interface) modules are provided that developers can ‘plug and play’ into applications. These are categorised into three layers (illustrated by the sketch after the list):

  • Crust (HCI) modules, which comprise user interface components and advanced functionality built on the lower layers of the toolkit, such as the virtual observer or the haptic guide.
  • Mantle modules, which provide the building blocks for creating user interface components, such as the tactile compass or 3D map feature visualisation using OpenGL.
  • Core modules, which provide low-level general functionality such as sensor data acquisition and coordinate transformations for geographical data.
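
As a rough illustration of how the three layers compose, a navigation feature might be assembled as in the sketch below. The header and all hm_* names are hypothetical stand-ins invented for this example, not the toolkit's actual API.

    /* Illustrative sketch only: haptimap.h and all hm_* identifiers
     * are hypothetical stand-ins, not the toolkit's actual C API. */
    #include <haptimap.h>

    int main(void)
    {
        /* Core layer: low-level services such as sensor data acquisition. */
        hm_context *ctx = hm_init();
        hm_sensor *mag = hm_sensor_open(ctx, HM_SENSOR_MAGNETOMETER);

        /* Mantle layer: building blocks such as the tactile compass,
         * which maps a target bearing onto a vibration pattern. */
        hm_tactile_compass *tc = hm_tactile_compass_create(ctx, mag);
        hm_tactile_compass_set_bearing(tc, 42.0 /* degrees */);

        /* Crust layer: ready-made HCI components such as the haptic
         * guide, built on top of the mantle and core layers. */
        hm_haptic_guide *guide = hm_haptic_guide_create(ctx, tc);
        hm_haptic_guide_start(guide);

        /* ... application main loop runs here ... */

        hm_haptic_guide_destroy(guide);
        hm_tactile_compass_destroy(tc);
        hm_sensor_close(mag);
        hm_shutdown(ctx);
        return 0;
    }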


The HaptiMap toolkit consists of a cross-platform software library with a C API (comprising the core, the mantle and a set of geographical data plugins), together with a set of example programs and modular HCI components (the crust). These illustrate how the toolkit can be used on different mobile platforms.
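
As a sketch of what using the library through its C API might look like, the fragment below loads a geographical data plugin and queries it for map features. All identifiers are again invented for illustration; consult the toolkit documentation for the real API.

    /* Hypothetical sketch: the hm_* names are invented for illustration. */
    #include <haptimap.h>

    void fetch_features(hm_context *ctx)
    {
        /* Load a geographical data plugin, e.g. an OpenStreetMap source. */
        hm_datasource *src = hm_datasource_load(ctx, "osm");

        /* Query all features inside a bounding box:
         * min lat, min lon, max lat, max lon (central Lund, Sweden). */
        hm_bbox box = { 55.70, 13.18, 55.72, 13.21 };
        hm_feature_set *features = hm_datasource_query(src, &box);

        /* Hand the features to mantle modules for rendering, then free. */
        hm_feature_set_free(features);
        hm_datasource_unload(src);
    }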


The toolkit library is licensed under the GNU Lesser General Public License, while the crust components are licensed under more flexible terms, in many cases allowing developers to incorporate them directly into their own applications.

The toolkit provides a cross-platform API that hides the complexities of

  • dealing with haptic, audio and visual input and output on each supported platform, and
  • retrieving, storing and manipulating geographic data,

behind a simple interface, leaving user interface developers free to concentrate on maximising the usability and accessibility of their applications.
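
In practice this means an application can hand the toolkit a cue once and let it choose suitable output channels. The sketch below illustrates the idea; hm_cue, hm_present and the constants are invented names, not the toolkit's actual API.

    /* Hypothetical sketch: hm_cue and hm_present are invented names. */
    #include <haptimap.h>

    void announce_turn(hm_context *ctx)
    {
        hm_cue cue = {
            .text   = "Turn left in 50 metres", /* screen or text-to-speech */
            .haptic = HM_PATTERN_TURN_LEFT,     /* vibration pattern */
            .audio  = HM_EARCON_TURN_LEFT       /* non-speech sound */
        };

        /* The toolkit selects and combines output channels according to
         * the device's capabilities and the user's settings. */
        hm_present(ctx, &cue);
    }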


Download the Toolkit and use it to enhance your apps.

The HaptiMap toolkit is supported by design tools developed within the project. In the design tools section you will find guidelines and checklists, as well as the HaptiMap context cards – a deck of cards designed to bring the mobile context into design and development. These increasingly popular cards have been found to work well for ideation (brainstorming and discussions), but they can also be used for rapid evaluation or for communication within the team.