A short tutorial on how to use the Android HapticGuide HCI module.

A tutorial on how to use the SCPN module.

A document containing an introduction to the Android part of the toolkit.

A document with information about typical Eclipse error messages (for Android developers).

Download the HaptiMap toolkit release (version 1.0b1) here...

Download the HaptiMap toolkit release (version 1.0b2) here...

Download the HaptiMap toolkit release (version 1.0b3) here....

Toolkit release 1.0b3 contains some bug fixes to address problems with the OpenStreetMap (OSM) geodata plug-in and some minor reorganisation of the Android example Apps. The OSM data for the toolkit is now delivered from a dedicated mirror server that is currently populated with data from Sweden, G.B. and Ireland. A full European dataset will be available by 1st January 2012.

Download the HaptiMap toolkit release (version 1.0b4) here...

Toolkit release 1.0b4 (24th February 2012) includes a reorganised 'Examples' folder for the Android Crust modules that makes it easier to find the module you are looking for.  There are also several bug fixes and new libraries that use our project-specific OpenStreetMap server.

 

Download the latest HaptiMap toolkit release (version 1.0b5) here....

Release 1.0b5 includes updated Android examples, bug fixes and refreshed build files.

 

SCPN (Safe Corridor Pedestrian Navigation) module

Green line showing a person walking inside a safe corridor

Intermodal routing and pedestrian navigation are becoming increasingly sought-after functional requirements for many mass-market mobile apps. For vehicle drivers, a variety of map-matching methods and corresponding APIs exist that can be applied to increase the perceived accuracy of the user's current position on a map. For pedestrian navigation, however, where it is essential to provide accessible guidance that keeps users within a safe corridor, no such APIs exist. The recently released Safe Corridor Pedestrian Navigation (SCPN) module of the HaptiMap toolkit provides a variety of interaction and geographical methods and components, exposed through an SCPN API, to support the development of various pedestrian navigation apps.
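As a rough, self-contained illustration of the underlying geometry (this is not the SCPN API, and the function and variable names are invented), the following C sketch checks whether a pedestrian's position lies within a corridor of a given half-width around one segment of a route:

    /* Minimal sketch, assuming planar (projected) coordinates in metres;
     * not part of the SCPN module itself. */
    #include <math.h>
    #include <stdio.h>

    static double distance_to_segment(double px, double py,
                                      double ax, double ay,
                                      double bx, double by)
    {
        double dx = bx - ax, dy = by - ay;
        double len2 = dx * dx + dy * dy;
        double t = (len2 > 0.0) ? ((px - ax) * dx + (py - ay) * dy) / len2 : 0.0;
        if (t < 0.0) t = 0.0;
        if (t > 1.0) t = 1.0;
        return hypot(px - (ax + t * dx), py - (ay + t * dy));
    }

    int main(void)
    {
        double half_width = 5.0;     /* corridor half-width, metres */
        double d = distance_to_segment(3.0, 4.0, 0.0, 0.0, 10.0, 0.0);
        printf("inside safe corridor: %s\n", d <= half_width ? "yes" : "no");
        return 0;
    }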

 

 

The Joined API. The Joined API (a lightweight part of the HaptiMap toolkit, published separately) is an API for locating people and places, based on the Joined application that was developed within the HaptiMap project. Joined helps you to locate your friends outdoors at open-air festivals, crowded mass events and other unfamiliar environments. Joined not only shows the location of your friends on a map, it also provides the direction and distance to your friends by sound and vibration.

The publicly available Joined API enables users to build their own innovative applications based on the Joined infrastructure and the following functionality:

  • register and log in as a new user
  • search and contact friends
  • show the location of friends
  • chat with friends
  • take a bearing of a friend (see the sketch below)
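For the last item in the list above, the bearing itself is simple geometry. The sketch below (plain C, using only the standard great-circle formula; it does not use the Joined API) computes the initial bearing from the user's position to a friend's position:

    /* Self-contained sketch: initial great-circle bearing between two
     * latitude/longitude points, in degrees clockwise from north. */
    #include <math.h>
    #include <stdio.h>

    #ifndef M_PI
    #define M_PI 3.14159265358979323846
    #endif

    static double bearing_deg(double lat1, double lon1, double lat2, double lon2)
    {
        double rad = M_PI / 180.0;
        lat1 *= rad; lon1 *= rad; lat2 *= rad; lon2 *= rad;
        double dlon = lon2 - lon1;
        double y = sin(dlon) * cos(lat2);
        double x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dlon);
        return fmod(atan2(y, x) / rad + 360.0, 360.0);   /* normalise to 0..360 */
    }

    int main(void)
    {
        /* e.g. bearing from one festival-goer to another (coordinates invented) */
        printf("bearing to friend: %.1f degrees\n",
               bearing_deg(55.70, 13.19, 55.71, 13.21));
        return 0;
    }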

The Joined API homepage can be found here:

http://joined.geomobile.de/developer/ 

 

Questions/problems? We have a help support system for Toolkit queries at https://haptimap.ee.qub.ac.uk/dev/newticket

 

 

Toolkit Advantages

The toolkit offers five unique features and advantages that are not available in any other source code library (either on mobile or desktop platforms).  These are fully outlined in the following sections.

  

Access to Geographic Data in Vector Format

The ability to interact with map data haptically or audibly requires that the data is available in a form that does not rely on pre-rendered image tiles. This is the primary advantage of the toolkit in contrast to applications that are purely mono-modal (i.e. visual). For example, the data representing hiking trails in a national park must be available in a vector format and be differentiable from the other features. This ability to differentiate between map data features (not just by their visual appearance) is essential for multi-modal applications. By acquiring map data in this way and providing HCI functions in the toolkit to represent it through non-visual (as well as visual) interfaces, the toolkit offers a unique multimodal architecture. Over slower wireless networks the transfer of vector data is also considerably faster, and local caches can hold more information, making the user experience much more responsive. Figure 1 illustrates how vector data can be rendered on mobile devices with greater flexibility than raster data. Vector data can also be combined with other data, for example height data from different sources.

 iPhone map with 3D landscape and roads

 

Figure 1: Vector data
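To make the idea concrete, the sketch below shows how vector features carrying a type attribute can be selected independently of their visual appearance. The types and structures are invented for illustration; they are not the toolkit's actual storage format:

    /* Self-contained sketch: pick out "hiking trail" linestrings by attribute
     * rather than by how they happen to look on a rendered tile. */
    #include <stdio.h>

    typedef struct { int x, y; } Point;
    typedef enum { FEATURE_ROAD, FEATURE_HIKING_TRAIL, FEATURE_FENCE } FeatureType;

    typedef struct {
        FeatureType type;
        const char *name;
        Point       points[4];
        int         num_points;
    } LineFeature;

    int main(void)
    {
        LineFeature features[] = {
            { FEATURE_ROAD,         "Main road",  { {0,0}, {100,0} },           2 },
            { FEATURE_HIKING_TRAIL, "Lake trail", { {10,5}, {40,30}, {80,60} }, 3 },
        };
        for (int i = 0; i < 2; i++)          /* render or sonify trails only */
            if (features[i].type == FEATURE_HIKING_TRAIL)
                printf("trail: %s (%d vertices)\n",
                       features[i].name, features[i].num_points);
        return 0;
    }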

 

Infrastructure for Acquiring Geographic Data from Multiple Sources


A typical HaptiMap application requires the use of geographical data from multiple sources at any one time.  For example:

  • Details of topographic features such as roads, paths, fences and natural features may come from a server provided by a national mapping agency.
  • Details of recommended hiking trails may be acquired as a linestring representing a suggested route obtained from an Internet route planning service.
  • Information on local points of interest could be retrieved from an additional data source provided by a local tourism agency or be self-generated. 

The toolkit provides an architecture consisting of multiple geographic data plug-ins that can connect to different data sources and retrieve data from them.  It can then merge everything together into a common co-ordinate system and storage format for ease of access by toolkit users.

All the complexities of working with geographical data (e.g. different co-ordinate systems, logical structures, file/transfer formats, attribute models etc.) are dealt with by the plug-ins and are hidden from the user. As such, the user of the toolkit only deals with a simple integer co-ordinate system, which they can treat as a conventional three-dimensional Cartesian system, with simple and intuitive geometric operations for calculating distances, direction etc.
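A minimal sketch of what working with such a co-ordinate system looks like is given below. The type and function names are illustrative only, not the toolkit's own API:

    /* Self-contained sketch: distance and planar heading between two points
     * expressed in simple integer Cartesian co-ordinates. */
    #include <math.h>
    #include <stdio.h>

    typedef struct { long x, y, z; } MapCoord;

    static double map_distance(MapCoord a, MapCoord b)
    {
        double dx = (double)(b.x - a.x);
        double dy = (double)(b.y - a.y);
        double dz = (double)(b.z - a.z);
        return sqrt(dx * dx + dy * dy + dz * dz);
    }

    static double map_heading_deg(MapCoord from, MapCoord to)
    {
        /* heading in the x/y plane, 0 degrees along +y, increasing clockwise */
        double deg = atan2((double)(to.x - from.x), (double)(to.y - from.y))
                     * 180.0 / 3.14159265358979323846;
        return (deg < 0.0) ? deg + 360.0 : deg;
    }

    int main(void)
    {
        MapCoord here = { 1000, 2000, 0 }, there = { 4000, 6000, 0 };
        printf("distance %.0f units, heading %.0f degrees\n",
               map_distance(here, there), map_heading_deg(here, there));
        return 0;
    }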

 

Dynamic Contextualised Map Rendering


A visual map is sometimes necessary, so it is useful to be able to augment a non-visual display with one for sighted users. A visual map can also be useful to partially sighted people, but the default appearance of pre-rendered tiles may not be suitable for these users.  For example, they may need higher contrast or larger font sizes for labels (e.g. street names) shown on the map.

Different maps

Figure 2: A map drawn using tiles and a custom map rendered using contextualised data derived from raw data.  Maps rendered from raw data offer an easy way to provide multimodal responses such as reading street names aloud and rendering haptic feedback.

 

Since the HaptiMap toolkit stores geographic information as vector data rather than pre-rendered tiles, the fast graphics processors in modern smartphones can be used to:

  • dynamically render the visual appearance of the map to accentuate features,
  • display only features of interest, or
  • provide contextualised rendering based on the needs of the individual user.

The toolkit fully supports this and provides sample rendering styles that can be customised based on the needs of particular groups of users.

This is illustrated in Figure 2 where a toolkit rendering of a map using tens of bytes of data is compared with a raster ‘tile’ rendered map from OpenStreetMap.  The OSM rendering requires a much higher bandwidth for data transmission and its image cannot be easily contextualised (e.g. pointing at a road and having its name read out). 
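As a small illustration of what a user-specific style might carry, the sketch below defines a hypothetical style record; the fields are invented for illustration and are not the toolkit's actual style format:

    /* Self-contained sketch: a per-user rendering style that a contextualised
     * renderer could consult when drawing the map from vector data. */
    #include <stdio.h>

    typedef struct {
        float line_width_px;        /* thicker lines for low-vision users */
        float label_point_size;     /* larger street-name labels          */
        int   high_contrast;        /* 1 = high-contrast colour scheme    */
        int   show_minor_roads;     /* 0 = declutter the display          */
    } RenderStyle;

    int main(void)
    {
        RenderStyle low_vision = { 4.0f, 24.0f, 1, 0 };
        printf("label size %.0fpt, high contrast %d, minor roads %d\n",
               low_vision.label_point_size, low_vision.high_contrast,
               low_vision.show_minor_roads);
        return 0;
    }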

 

Cross-platform API for mobile and desktop platforms


The market for mobile devices is quite fragmented and the toolkit’s utility would be limited were it to be targeted towards only one of these platforms. Thus the toolkit has been developed, as far as possible, to support a wide range of platforms.

Supported mobile platforms include: 

Android
iPhone OS 3.x & 4.x
Windows Mobile 6.x
Symbian 3rd and 5th editions
Linux phones (e.g. OpenMoko and Maemo 4 & 5)
 

Supported desktop platforms comprise: 

Windows (XP, Vista and 7)
Linux
Mac OS X (Leopard 10.5 and Snow Leopard 10.6)


The toolkit core and mantle compile and run on all these platforms. An important aspect of the cross-platform support that the toolkit provides is support for the specialist haptic hardware often used for multi-modal output. Hardware developed by HaptiMap partners uses the Bluetooth standard for wireless communications, and the toolkit supports access to this on all platforms using an identical API. Access to other platform-specific functionality such as graphics output is also unified and simplified by the toolkit.  For example, 3D graphical rendering can be done in a way analogous to GLUT on desktop platforms: the toolkit provides the functionality for setting up the drawing surface and presenting it to the user, and the user then only has to issue standard OpenGL drawing commands in a callback function.
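The callback pattern is sketched below. The hm_* functions are hypothetical placeholders standing in for the toolkit's surface handling (the real names are in the toolkit headers); only the OpenGL calls inside the callback are standard:

    /* Sketch of the GLUT-like pattern: the toolkit owns the surface and the
     * main loop, the application supplies a drawing callback. */
    #include <GL/gl.h>

    /* Hypothetical, illustrative declarations -- not the real toolkit API. */
    extern int  hm_create_surface(int width, int height);
    extern void hm_set_draw_callback(void (*draw)(void));
    extern void hm_main_loop(void);

    static void draw_frame(void)
    {
        glClearColor(1.0f, 1.0f, 1.0f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);
        /* ...standard OpenGL drawing commands for the map go here... */
    }

    int main(void)
    {
        hm_create_surface(480, 320);
        hm_set_draw_callback(draw_frame);
        hm_main_loop();
        return 0;
    }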


Pluggable HCI modules for specific platforms


The toolkit provides a large amount of useful functionality across all supported platforms, enabling really useful applications to be built based on it. However, it is important to realise that the target audience for the toolkit is software developers.  As such, development knowledge and experience is certainly required in order to maximise the potential of the toolkit.


In order to ease the learning curve and allow even inexperienced developers to produce useful toolkit-based applications quickly, the toolkit also includes a large number of pluggable HCI modules for the different platforms. As the name suggests, these can be easily “plugged” into existing applications to add advanced multi-modal human-computer interface functionality quickly and easily.

 

Toolkit Architecture


As illustrated in Figure 3, the toolkit is divided into three layers, with each layer using features in the layer below it. The Core and Mantle layers, together with the geographic data plug-ins, form the toolkit library. Code modules, example programs and pluggable task components form the highest layer of the toolkit.  This layer is known as the Crust, and its contents are intended to be used by developers as templates on which to build applications.

HaptiMap toolkit structure

Figure 3: The toolkit layered structure illustrating the components of the three layers.

 

More details on the components of the core, mantle and crust are given in Deliverable 4.3, which also provides information about the geographical data plugins.

 

Core
The toolkit core provides low-level functionality to the rest of the toolkit. This primarily includes:

  • Platform-specific abstractions, so that functionality that operates differently on different operating systems can be accessed using a single API across all supported toolkit platforms.
  • Hardware drivers, for certain specific hardware devices that the toolkit includes native support for, e.g. hardware devices developed by HaptiMap partners.
  • Geographical data storage—an efficient and optimised implementation of the vector data storage required by toolkit applications.

The core is licensed under the LGPL and is written primarily in ANSI C, but includes small amounts of platform-specific code where necessary to implement platform-specific abstractions (e.g. Java for Android or Objective C for iPhone).

 

Mantle
The toolkit mantle includes functionality that builds on the abstracted interface provided by the core. Mantle modules provide “building blocks” that can be assembled to create user interface components. In general, they either interact with a user via the core’s HCI functions to deliver map information (e.g. a set of directions through vibration patterns), or use the sensor data from the core’s utilities to provide some useful information (e.g. a bearing or a context). The modules within the mantle are all platform independent, written only in ANSI C and licensed under the LGPL.
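A minimal sketch of this building-block style is shown below. The hm_* calls are hypothetical placeholders for core and mantle functions (they are not the real toolkit API); the point is only how a bearing derived from sensor data is mapped onto a vibration pattern:

    /* Sketch of composing building blocks: compare heading with the bearing to
     * a target and choose a vibration pattern accordingly. */

    /* Hypothetical, illustrative declarations -- not the real toolkit API. */
    extern double hm_current_heading_deg(void);        /* from the compass sensor */
    extern double hm_bearing_to_target_deg(void);      /* from stored geometry    */
    extern void   hm_vibrate_pattern(int on_ms, int off_ms, int repeats);

    static void guide_towards_target(void)
    {
        double error = hm_bearing_to_target_deg() - hm_current_heading_deg();
        while (error > 180.0)  error -= 360.0;          /* wrap into -180..180 */
        while (error < -180.0) error += 360.0;

        if (error > 15.0)       hm_vibrate_pattern(100, 100, 2);   /* turn right */
        else if (error < -15.0) hm_vibrate_pattern(300, 100, 2);   /* turn left  */
        else                    hm_vibrate_pattern(50,  500, 1);   /* on course  */
    }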

 

Geographical Data Plugins
The toolkit geographical data plugins are software modules, logically separate from the toolkit core, that hide the complexities of working with geographical data (i.e. different co-ordinate systems, logical structures, file/transfer formats, attribute models, etc.) from the core.  This is necessary because the core has been developed to work with integer Cartesian co-ordinates. A plugin must be written for each type of data source that is to be supported by the toolkit. Plugins are written in C as far as possible, but may include some platform-specific code if necessary.  At present they are linked into the main toolkit library object and so must also be licensed under the LGPL. The logical separation from the toolkit core means that they could be implemented as separate shared libraries, should licensing considerations make this necessary.
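Conceptually, a plugin is a small set of callbacks behind which the data-source details are hidden. The sketch below is illustrative only; the real plugin interface is defined by the toolkit core and differs in detail:

    /* Illustrative sketch of a geodata plugin as a table of callbacks. */
    typedef struct {
        const char *name;                            /* e.g. an OSM mirror       */
        int  (*connect)(const char *url);            /* open the data source     */
        int  (*fetch_area)(long min_x, long min_y,   /* fetch features within an */
                           long max_x, long max_y);  /* integer bounding box     */
        void (*disconnect)(void);                    /* close the data source    */
    } GeoDataPlugin;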

 

Crust / Applications Layer
The toolkit crust consists of user interface components and advanced functionality built on the lower layers of the toolkit. It contains sample programs and code for platform-specific applications, e.g. haptic surface map height rendering in desktop applications. Much of the crust is platform-specific and may be written in any appropriate language. Code in the crust is not compiled into the toolkit library object, and so may be placed under any license.

 

Implementation and Code Design
For detailed information on the architecture, implementation and code design of the toolkit, please see Chapters 2 and 3 of Deliverable 4.2.

This page provides a short summary of the steps a potential user of the toolkit should be familiar with before commencing full-scale development. We explain how to build the toolkit from source.  For those wishing to avoid building the toolkit from source, pre-built libraries are available for the most popular development tools on Windows, Android and iOS 3 and 4.

Once the toolkit has been obtained from the HaptiMap website, it can be integrated into projects for all major mobile platforms (iOS, Android, Windows Mobile) and common desktop operating systems (Linux, Windows, Mac OS X). For each individual platform the HaptiMap website provides information about how the toolkit can be built and integrated into an existing or new application. The sections below provide some basic examples of toolkit usage on the most popular platforms.  Potential users of the toolkit can use these examples for experimentation in order to familiarise themselves with the toolkit.

 

Building the Toolkit

The following sections outline the processes by which to obtain and build the toolkit on the most popular platforms for both mobile and desktop applications. 

  

Android

This section outlines the steps necessary to build the toolkit for the Android mobile platform using the widely used Eclipse Integrated Development Environment (IDE). There are two options: you can choose to use only the Crust (that is, Android-specific HCI modules that do not require the NDK build environment), or also use the core and mantle parts (cross-platform HCI modules and sensor interfaces).

 

Using only Android-specific modules
The HaptiMap toolkit is designed as a library project and can be integrated into an application from the IDE. To accomplish this, first import the HaptiMap toolkit project into Eclipse (File->Import, Existing Projects into Workspace). The project is in the HaptiMapToolkit folder (in “Your path”\HaptimapToolkit\crust\android\), which sits alongside several folders of example projects. Once it has been imported, all non-native Android-specific HCI modules will be compiled automatically and can be used according to their individual documentation.


To integrate the toolkit into your own project, use the principle of Android external libraries: right-click on your project, choose Properties, then Android in the left-hand tree list, click Add in the Library section of the window and choose HaptiMapToolkit. See also Figure 4.


Using cross-platform modules
If a developer also wants to use the HaptiMap toolkit core or the native cross-platform HCI modules, the native parts of the toolkit need to be compiled as well. The Android NDK provides the "ndk-build" command, which can be used to compile the native parts of the toolkit instead of setting up toolkit-specific build scripts, as illustrated in Figure 5. Once the command has been run, the shared objects containing the core and mantle functionality will be built, integrated into the "HaptiMapToolkit" library project and shipped together with the non-native code. JNI (Java Native Interface) wrappers are provided for developers' convenience and come with the corresponding Javadoc documentation, which will be familiar to any Java developer.

 

Android development environment

Figure 4: The HaptiMap Toolkit can be integrated into Android applications via the Android External Libraries principle.

 

ndk build screen

Figure 5: A single call of "ndk-build" builds the native parts of the toolkit.

 

Windows

On the Windows platform, the HaptiMap toolkit consists of a static library containing the core functions and the tightly integrated mantle modules.  It is also possible to build a number of other libraries for less commonly used mantle modules. Executable programs that demonstrate crust components are built by linking with the core library.

Most problems people encounter when building applications for Windows occur because of incompatibilities among the libraries they need to use.  The HaptiMap toolkit on Windows makes use of a large number of external libraries, and it has therefore been necessary to provide a number of different build files.  Most of the libraries used within the toolkit are freely available open source suites and can be obtained from the appropriate web sites. The only exception is the Navteq MapTP library, for which the toolkit has a map data plug-in; for users without access to MapTP we provide build files that do not require the MapTP libraries.

The primary tool used to build the toolkit on Windows is Microsoft’s Visual Studio 2008.  Included within the toolkit is a solution file that contains projects to build the toolkit for 32/64 bit platforms, with or without MapTP support, with or without Bluetooth support and with or without desktop haptics.  Figure 6 illustrates this project layout and the special ‘config.h’ file that is used to define the parts of the main toolkit library that are included in the core library.
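The sketch below shows the kind of switches such a config.h controls. The macro names are invented for illustration; the real names are those defined in the toolkit's own config.h:

    /* Illustrative configuration switches (hypothetical macro names). */
    #define HM_BUILD_64BIT           1   /* 32- or 64-bit target             */
    #define HM_WITH_MAPTP            0   /* Navteq MapTP geodata plug-in     */
    #define HM_WITH_BLUETOOTH        1   /* external haptic hardware support */
    #define HM_WITH_DESKTOP_HAPTICS  0   /* desktop force-feedback devices   */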

The Visual Studio project file is automatically upgraded for use in Visual Studio 2010.  For mantle modules not included in the main library, separate SLN and VCPROJ files are included in the appropriate folders.

 

Windows build environment

Figure 6: Windows build environment.

 

iPhone

For the iPhone platform all development is done using Xcode.  In the iPhone build folder, project files for Xcode 3/4 allow static toolkit libraries to be built for both the emulator and the device itself. These libraries have the same name but are built into different folders. When building an App with these libraries, both the emulator and device projects must link against the appropriate library.

The libraries can be built by simply double-clicking on the ‘.xcodeproj’ bundle file and pressing the build button on the toolbar.  Very little additional configuration should be required. When developing Apps it will be necessary to have a developer profile installed on the target device.  Figure 7 illustrates an example of the toolkit Xcode development build.

 

XCODE build environment

Figure 7: XCODE development build of Toolkit.

 

Symbian

The Symbian version of the toolkit is built using the Carbide IDE on the Windows platform.  A Carbide library project and MMP project files are provided in the Symbian build folder.  Examples are built in the same way, from Eclipse projects that link with the toolkit library.  Figure 8 illustrates the Carbide development environment showing the MMP configuration file for the toolkit library project. This MMP file can be used to build libraries for both the Symbian emulator and Symbian devices.

 

Symbian build environment

Figure 8: Symbian build environment.

 

Toolkit on-line Resources


The toolkit itself is accompanied by a typical open source project infrastructure. To encourage use of the toolkit and further development, an SVN centralised version control system, the Trac online project management software with wiki functionality, and a forum for discussing toolkit-related topics have all been set up.

The source code of the toolkit and other essential file resources are provided via SVN (Subversion), an established server-based version management system. This allows internal and external developers of the library to make use of the full range of versioning features.  For example, it is possible to instantly branch the current version of the toolkit (i.e. the trunk) in order to integrate and test new functionality.

Trac was chosen because, as a state-of-the-art project management web framework, it integrates seamlessly with the SVN repository. Using Trac it is possible to follow recent code changes, find and resolve bugs, assign responsibility, and hold discussions in a wiki-like manner.  Figure 9 illustrates Trac in action.  With this tight integration, the documentation can refer to specific classes, functions or fields in the code.  It is believed that this integration will help external developers to quickly understand and adopt the essentials of the HaptiMap toolkit.

 

 trac screen shot trac screen shot

Figure 9: Two screenshots of the Trac project management software.

 

The online discussion forum (see Figure 10) was set up primarily to support less experienced developers, who perhaps require only a quick answer to a specific question and do not want to read all the documentation. In comparison to a wiki, a forum invites users to discuss ideas even when they are vague or incomplete. Experience from other projects also shows that a forum is often used to raise ideas for future development. This helps to maintain and continue the development of the toolkit in line with what toolkit users expect.

 

Haptimap forum

Figure 10: A screenshot of the HaptiMap forum.

 

A starter application for Android


To allow external developers to quickly understand the HaptiMap design patterns, the toolkit comes complete with a sample application tutorial.  For the sample application, we envision that a developer wants to highlight an arbitrary location visually as well as non-visually. This application extends the Google "Hello Map" tutorial, in which a developer learns to set up a visual map and place visual markers for a destination. Our HaptiMap extension tutorial shows how this destination can be presented in two non-visual ways: first, through the Tactile Compass vibration patterns, and second, through audio feedback in a Geiger-counter style. The demo application thus showcases two of the toolkit’s most tested and highly relevant HCI modules. Since other HCI modules work in similar ways, the aim is that developers will quickly become accustomed to the HaptiMap design patterns and can extend the starter application further.

The starter example is found at: https://haptimap.ee.qub.ac.uk/dev/wiki/UsingToolkitonAndroid

 

A starter application for Windows

A simple example illustrating the use of the toolkit on the Windows platform is one in which the toolkit library is used to prime the internal cache with some mapping data from the OpenStreetMap map server.
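The overall shape of such a program is sketched below. The hm_* calls are hypothetical placeholders (the real entry points are declared in the toolkit headers used by DATAIN.C); only the general flow of initialise, fetch, shut down is intended to match:

    /* Sketch of a cache-priming program; hm_* names are invented. */
    #include <stdio.h>

    extern int  hm_init(void);
    extern int  hm_prime_cache_from_osm(double min_lat, double min_lon,
                                        double max_lat, double max_lon);
    extern void hm_shutdown(void);

    int main(void)
    {
        if (hm_init() != 0) {
            fprintf(stderr, "toolkit initialisation failed\n");
            return 1;
        }
        /* fetch a small bounding box into the local cache (coordinates invented) */
        int features = hm_prime_cache_from_osm(55.69, 13.18, 55.72, 13.22);
        printf("cached %d features\n", features);
        hm_shutdown();
        return 0;
    }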

To construct the application, begin by copying the template “starter.vcproj” to a different folder and renaming it for the new project.  Double-click on it: Visual Studio 2008 will open and offer to create a SLN container for the project; accept.  You are then ready to build the project by adding the source files.

Using this project file as a template allows the user to set up all the include paths and library references required to build a HaptiMap toolkit application.

The project setup will resemble the screenshot shown in Figure 11.

 

Windows starter application

Figure 11: An empty VCPROJ file ready to be used to build a HaptiMap toolkit project.

The next stage is to include the source file.  To do this, add the file DATAIN.C to the source files and inspect the include paths in the settings dialog, as shown in Figure 12.  Note the include paths carefully, as it might be necessary to adjust some of them to match the preferred build environment.  These include and library path settings are the most common cause of build problems, so it is imperative that users check that the paths match their own system.

The second most common cause of problems on Windows is that some of the third-party libraries were built using different calling conventions (i.e. DLL versus static libraries) or using different library versions.  This can be overcome by rebuilding the third-party libraries, which is one of the main reasons why it is desirable to use only open source libraries in the toolkit.

 

Visual studio project setup

Figure 12: The Visual Studio 2008 project after inclusion of the source file.

 

Once these steps are complete, the project is ready to be built and executed.  Once this starter application is running, any other toolkit application can be readily built using the same starting template as has just been described.

 

A starter application for the iPhone

In this example a simple application is created that adds a linestring into the toolkit's internal storage. A button is set up on the iPhone screen that writes an arbitrary polygon linestring into the storage and updates a label on the display saying how many linestrings are in storage. Note that on the iPhone/iPod/iPad each application has its own sandbox storage, unique to that application, so any data downloaded by this application persists between invocations but cannot be accessed by any other application.

To get the application up and running it is necessary to copy the files in the starter application folder to a new folder. This is done so that all the required libraries and frameworks used by the HaptiMap toolkit are available.  The Project > Edit Active Target command can be used to change the Project and Product names to something meaningful.

This starter application contains all the code needed to test the toolkit, and the program can be built and run on the emulator. If the program is run with the Xcode console open, toolkit messages will appear in the log when the “Write Linestring” button is clicked. (See Figure 13.)

XCODE project setup

Figure 13: The Xcode Console and iPhone emulator for the Starter project.

 

All the coding details are essentially the same as those a toolkit developer would be familiar with from most iPhone projects. The Write Linestring button has a handler function in the ViewController-derived class. The functions that use the toolkit are all contained in a separate C file that is added to the Xcode project. The file Test.h contains the include directives for the toolkit, and the file test.c contains the code for the functions themselves.

The code for the button handler is self-explanatory; it also includes some calls to the message logs and a call to the external “testMemoryMappedFiles()” function, which handles the toolkit connection.
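The plain-C side of the example might look roughly like the sketch below. The hm_* names are hypothetical placeholders, not the real toolkit entry points; the Objective-C button handler would call write_test_linestring() and show the returned count in the label:

    /* Sketch of a test.c-style helper; hm_* names are invented. */

    extern void hm_store_linestring(const long *coords, int num_points);
    extern int  hm_count_linestrings(void);

    int write_test_linestring(void)
    {
        /* an arbitrary closed polygon expressed as a linestring,
           in the toolkit's integer co-ordinates (values invented) */
        long coords[] = { 0, 0,  1000, 0,  1000, 1000,  0, 1000,  0, 0 };

        hm_store_linestring(coords, 5);
        return hm_count_linestrings();
    }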

 

Other starter applications


Each supported platform has a similar starter application that can be used as the starting point for a new development. All of these are contained in the Crust folder under the appropriate platform heading.

 

Deliverables


Much more detail can be obtained by reading the project deliverables, which explain the design of the toolkit and expand on this page:

Deliverable 4.3

Deliverable 4.2

 

Full details and up-to-date information is available via these links:


The latter two are currently password-protected (restricted to the HaptiMap consortium).

Welcome to the HaptiMap toolkit primary download page. Here you can find the latest Toolkit release:

 

Download the toolkit (requires registration). The toolkit is licensed under the GNU Lesser General Public License (LGPL). Questions/problems? We have a help support system for Toolkit queries at https://haptimap.ee.qub.ac.uk/dev/newticket

 

Full details and up-to-date information is available via these links:


The latter two are currently password-protected (restricted to the HaptiMap consortium).

 

Introduction to the HaptiMap toolkit

 

Getting started with the toolkit

 

Videos showing some examples of what you can do using the toolkit HCI modules:

 

More detailed information about the toolkit can be found in the following two documents:

HaptiMap D4.3 - a document which contains both a technical overview and useful getting-started information

HaptiMap D4.2 - a document explaining the basic structure of the toolkit