Natural Eye-based Gaze Interaction

Shared Physiological Cues

G-SIAR

KITE-Mobile AR

Gestures Library

User-defined Gestures for AR

Physically-based Interaction in AR
Surface and Motion User-Defined Gestures for Mobile AR

We investigate two gesture modalities, surface and motion, for operating mobile AR applications. To identify suitable gestures for a range of interactions, we conducted an elicitation study with 21 participants across 12 tasks, which yielded a total of 504 gestures. We classified and illustrated the two sets of gestures and compared them in terms of goodness, ease of use, and engagement. From the interaction patterns of the motion gestures, we propose a new interaction class called TMR (Touch-Move-Release) for mobile AR.
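The paper does not prescribe an implementation of TMR, but the name suggests a three-phase interaction: the user touches the screen to select, moves (finger or device) to manipulate, and releases to commit. The sketch below models those phases as a small state machine in Kotlin; all class and function names (TouchPoint, TmrGesture, onTouchDown, etc.) are illustrative assumptions, not taken from the paper.

```kotlin
// Minimal, illustrative sketch of a Touch-Move-Release (TMR) gesture modeled
// as a three-phase state machine. Names and structure are assumptions made
// for illustration only; the paper does not specify an implementation.

// 2D screen-space point reported by the touch input system.
data class TouchPoint(val x: Float, val y: Float)

// The three phases that give the TMR class its name.
enum class TmrPhase { IDLE, TOUCHED, MOVING }

class TmrGesture(
    // Callback fired when the finger lifts, delivering the recorded path.
    private val onRelease: (path: List<TouchPoint>) -> Unit
) {
    private var phase = TmrPhase.IDLE
    private val path = mutableListOf<TouchPoint>()

    // Touch: the user touches the screen, e.g. to select a virtual object.
    fun onTouchDown(p: TouchPoint) {
        phase = TmrPhase.TOUCHED
        path.clear()
        path.add(p)
    }

    // Move: while the finger stays down, motion manipulates the selection.
    fun onTouchMove(p: TouchPoint) {
        if (phase == TmrPhase.IDLE) return
        phase = TmrPhase.MOVING
        path.add(p)
    }

    // Release: lifting the finger commits the manipulation.
    fun onTouchUp(p: TouchPoint) {
        if (phase == TmrPhase.IDLE) return
        path.add(p)
        onRelease(path.toList())
        phase = TmrPhase.IDLE
    }
}

fun main() {
    val gesture = TmrGesture { path ->
        println("Released after ${path.size} samples; end point = ${path.last()}")
    }
    gesture.onTouchDown(TouchPoint(100f, 200f))
    gesture.onTouchMove(TouchPoint(120f, 210f))
    gesture.onTouchMove(TouchPoint(150f, 230f))
    gesture.onTouchUp(TouchPoint(160f, 235f))
}
```

In a real mobile AR application these handlers would be driven by the platform's touch events and the move phase would update the pose of the selected virtual object; the state machine here only records the path to keep the example self-contained.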

Dong Z, Piumsomboon T, Zhang J, Clark A, Bai H, Lindeman R. A Comparison of Surface and Motion User-Defined Gestures for Mobile Augmented Reality. In: Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems; 2020 Apr 25. pp. 1-8.

Research Article 5