|Danilo Gallo, Shreepriya Shreepriya, Jutta Willamowski|
|Computer Human Interaction conference (CHI), Honolulu, Hawaii, USA, 25 April-1 May, 2020|
Navigation systems for runners commonly provide turn-by-turn directions via voice and/or map-based visualizations. While voice directions require permanent attention, map-based guidance requires regular consultation. Both disrupt the running activity. To address this, we designed RunAhead, a navigation system that uses head scanning to query for navigation feedback, and we explored its suitability for runners in an outdoor experiment. In our design, we provide the runner with simple and intuitive navigation feedback on the path s/he is looking at through three different feedback modes: haptic, music, and audio cues. In our experiment, we compare the resulting three versions of RunAhead with a baseline voice-based navigation system. We find that demand and error are equivalent across all four conditions. However, the head-scanning-based haptic and music conditions are preferred over the baseline, and these preferences are influenced by runners' habits. With this study we contribute insights for designing navigation support for runners.