Comparing remote gesture technologies for supporting collaborative physical tasks


Reference:

Kirk, D. and Stanton Fraser, D., 2006. Comparing remote gesture technologies for supporting collaborative physical tasks. In: SIGCHI Conference on Human Factors in Computing Systems (CHI 2006), 22-27 April 2006, Montréal, Québec.

Official URL:

http://doi.acm.org/10.1145/1124772.1124951

Abstract

The design of remote gesturing technologies is an area of growing interest. Current technologies take differing approaches to representing remote gesture, and it is not clear which approach most benefits task performance. This study therefore compared performance on a collaborative physical (assembly) task using remote gesture systems built from combinations of three gesture formats (unmediated hands only, hands and sketch, and digital sketch only) and two gesture output locations (direct projection into the worker's task space or display on an external monitor). Results indicated that gesturing with an unmediated representation of the hands led to faster performance with no loss of accuracy. Comparison of gesture output locations found no significant difference between projecting gestures and presenting them on external monitors. These results are discussed in relation to theories of conversational grounding and the design of technologies from a 'mixed ecologies' perspective.

Details

Item Type: Conference or Workshop Items (Paper)
Creators: Kirk, D. and Stanton Fraser, D.
Departments: Faculty of Humanities & Social Sciences > Psychology
Refereed: Yes
Status: Published
ID Code: 9400
Additional Information: SESSION: Gestures and visualizations. ISBN: 1-59593-372-7
