Better wayfinding for visually impaired people: integrating haptic feedback via a smartwatch

J.H.F. van der Bie, Christina Jaschinski, Timon van Hasselt, Jan Koopman, Somaya Ben Allouch, B.J.A. Kröse

Research output: Contribution to conference › Abstract › Academic

Abstract

Visually impaired people (VIP) can experience difficulties navigating urban environments. They mostly depend on the environment’s infrastructure or on technical solutions such as smartphone apps for navigation. However, apps typically use visual and audio feedback, which can be ineffective, distracting, and even dangerous. Haptic feedback in the form of vibrations can complement visual and audio channels where they fall short, reducing cognitive load.
Existing research on wayfinding with haptic feedback for the visually impaired often relies on custom tactile actuators and multiple vibration motors. Although these solutions can be effective, they are often impractical in everyday life or stigmatizing due to their unusual appearance.
To address this issue we propose a more modular system that can be easily integrated into commercially available smartwatches. Building on existing research, we present a tactile communication method that uses a smartwatch’s vibrotactile actuator to provide VIP with wayfinding information complementing visual and audio feedback. Current smartwatches contain only a single tactile actuator, but this is sufficient when distinct vibration patterns are used. These patterns are based on research in personal orientation and mobility training with VIP. For example, a vibration pattern represents a concept such as ‘attention’, ‘left’ or ‘stairs’, directing the navigator’s attention to audio or visual information or to the environment.
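To illustrate the idea of mapping wayfinding concepts to single-actuator vibration patterns, the sketch below shows one possible encoding. The concept names come from the abstract; the specific timings and the pattern shapes are illustrative assumptions, not the authors’ actual encoding.

```python
# Hypothetical sketch: each wayfinding concept maps to a distinct vibration
# pattern playable on a single actuator. Timings (milliseconds) are invented
# for illustration only.

# Each pattern is a list of (vibrate_ms, pause_ms) pulses.
PATTERNS = {
    "attention": [(100, 50), (100, 50), (100, 0)],    # three short pulses
    "left":      [(400, 100), (100, 0)],              # long, then short
    "right":     [(100, 100), (400, 0)],              # short, then long
    "stairs":    [(200, 100), (200, 100), (600, 0)],  # two medium, one long
}

def pattern_duration_ms(concept):
    """Total playback time of a concept's vibration pattern."""
    return sum(vibrate + pause for vibrate, pause in PATTERNS[concept])

def to_vibration_timeline(concept):
    """Flatten a pattern into an alternating [vibrate, pause, vibrate, ...]
    list, the shape expected by typical mobile vibration APIs."""
    timeline = []
    for vibrate, pause in PATTERNS[concept]:
        timeline.extend([vibrate, pause])
    if timeline and timeline[-1] == 0:
        timeline.pop()  # drop the trailing zero-length pause
    return timeline
```

In a real deployment the timeline would be handed to the watch platform’s vibration API; keeping patterns short and rhythmically distinct is what lets a single actuator carry several concepts.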
In the next phase of this research we will conduct several focus groups and co-creation sessions with VIP and orientation and mobility experts to further specify the requirements and test our proposed tactile method. In the future, this method could be integrated into existing navigation apps on commercially available devices, complementing visual and audio information with additional wayfinding information via haptic feedback.
This research is part of the EyeBeacons project, co-financed by the ZonMW Inzicht program.
Original language: English
Publication status: Published - 25 Jun 2017
Event: Vision 2017: Low vision rehabilitation: a global right - World Forum, The Hague, Netherlands
Duration: 25 Jun 2017 - 29 Jun 2017
Conference number: 12
http://www.vision2017.org/



  • Grand EyeBeacons project

    van der Bie, Joey (Recipient), Christina Jaschinski (Recipient), A. Nait Aicha (Recipient), Somaya Ben Allouch (Recipient), Kröse, Ben (Recipient), Timon van Hasselt (Recipient), Jan Koopman (Recipient), Paul de Nooij (Recipient), David Kat (Recipient) & Thea van der Geest (Recipient), 19 Jan 2017

    Prize: Academic
