
Gaze movement inference for user adapted image annotation and retrieval


Reference:

Mirza, S.N.H., Izquierdo, E. and Proulx, M., 2011. Gaze movement inference for user adapted image annotation and retrieval. In: MM'11 - Proceedings of the 2011 ACM Multimedia Conference and Co-Located Workshops - ACM Workshop on Social Behavioural Networked Media Access 2011 (SBNMA'11), pp. 27-32.

Official URL:

http://dx.doi.org/10.1145/2072627.2072636

Abstract

In media personalisation, the media provider needs feedback from its users in order to adapt the media content used for interaction. At present this feedback is limited to mouse clicks and keyboard entries. This paper explores possible ways to include a user's gaze movements as a form of feedback for media personalisation and adaptation. Features are extracted from users' gaze trajectories while they search an image database for a Target Concept (TC). These features are used to measure the user's visual attention to every image appearing on the screen, termed the user interest level (UIL). Because different people react differently to the same content, a new adapted processing interface is built automatically for every new user. On average, our interface detected 10% of the images belonging to the TC class with no error and identified 40% of them with only 20% error. We show in this paper that gaze movement is a reliable form of feedback for measuring a user's interest in images, which helps to personalise image annotation and retrieval.
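
The abstract describes the pipeline only at a high level: per-image features are extracted from the gaze trajectory and mapped to a user interest level (UIL). As a rough illustration of how such a score could be computed, here is a minimal Python sketch; the feature set (fixation count, total dwell time, revisits) and the linear weights are assumptions made for demonstration, not the features or model reported in the paper.

from dataclasses import dataclass

@dataclass
class Fixation:
    image_id: str        # image on screen the fixation landed on
    duration_ms: float   # dwell time of this fixation

def gaze_features(fixations):
    """Aggregate per-image features from one gaze trajectory.

    NOTE: hypothetical feature set; the paper's actual features are
    not given in the abstract.
    """
    feats = {}
    prev = None
    for f in fixations:
        d = feats.setdefault(f.image_id,
                             {"count": 0, "total_ms": 0.0, "revisits": 0})
        if f.image_id != prev and d["count"] > 0:
            d["revisits"] += 1  # gaze returned after looking elsewhere
        d["count"] += 1
        d["total_ms"] += f.duration_ms
        prev = f.image_id
    return feats

def user_interest_level(feats, w_count=0.3, w_dur=0.5, w_rev=0.2):
    """Map per-image features to a normalised interest score in [0, 1].

    The weights here are arbitrary; a per-user adapted system would fit
    them (or a classifier) to each new user, as the abstract suggests.
    """
    if not feats:
        return {}
    max_count = max(d["count"] for d in feats.values())
    max_ms = max(d["total_ms"] for d in feats.values()) or 1.0
    max_rev = max(d["revisits"] for d in feats.values()) or 1
    return {
        img: w_count * d["count"] / max_count
             + w_dur * d["total_ms"] / max_ms
             + w_rev * d["revisits"] / max_rev
        for img, d in feats.items()
    }

# Example: the first image is fixated twice (once as a revisit) and for
# longer overall, so it receives the highest interest score.
trace = [Fixation("img_01", 300), Fixation("img_02", 120),
         Fixation("img_01", 450), Fixation("img_03", 80)]
print(user_interest_level(gaze_features(trace)))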

Details

Item Type: Book Sections
Creators: Mirza, S.N.H., Izquierdo, E. and Proulx, M.
DOI: 10.1145/2072627.2072636
Departments: Faculty of Humanities & Social Sciences > Psychology
Refereed: No
Status: Published
ID Code: 31570
