Comparing remote gesture technologies for supporting collaborative physical tasks
Kirk, D. and Stanton Fraser, D., 2006. Comparing remote gesture technologies for supporting collaborative physical tasks. In: SIGCHI Conference on Human Factors in Computing Systems (CHI 2006), 2006-04-22 - 2006-04-27.
The design of remote gesturing technologies is an area of growing interest. Current technologies have taken differing approaches to the representation of remote gesture, and it is not clear which approach most benefits task performance. This study therefore compared performance in a collaborative physical (assembly) task using remote gesture systems constructed from combinations of three gesture formats (unmediated hands only, hands and sketch, and digital sketch only) and two gesture output locations (direct projection into a worker's task space, or display on an external monitor). Results indicated that gesturing with an unmediated representation of the hands leads to faster performance with no loss of accuracy. Comparison of gesture output locations found no significant difference between projecting gestures and presenting them on external monitors. These results are discussed in relation to theories of conversational grounding and the design of technologies from a 'mixed ecologies' perspective.
Item Type: Conference or Workshop Items (Paper)
Creators: Kirk, D. and Stanton Fraser, D.
Departments: Faculty of Humanities & Social Sciences > Psychology
Additional Information: SESSION: Gestures and visualizations. ISBN: 1-59593-372-7