Abstract
In media personalisation, the media provider needs to receive feedback from its users in order to adapt the media content used for interaction. At present, this feedback is limited to mouse clicks and keyboard entries. This report explores possible ways to include a user's gaze movements as a form of feedback for media personalisation and adaptation. Features are extracted from the gaze trajectories of users while they search an image database for a Target Concept (TC). These features are used to measure a user's visual attention to every image that appears on the screen, called the user interest level (UIL). Because different people react differently to the same content, a new adapted processing interface is developed automatically for every new user. On average, our interface detected 10% of the images belonging to the TC class with no error, and identified 40% of them with only 20% error. We show in this paper that gaze movement is a reliable form of feedback for measuring a user's interest in images, which helps to personalise image annotation and retrieval.
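For illustration only, a per-image user interest level of the kind the abstract describes could be approximated from gaze dwell time. This is a minimal sketch, not the paper's actual feature set: it assumes UIL is proportional to the total time the gaze rests on each image, and all function and variable names here are hypothetical.

```python
# Hypothetical sketch: approximate a per-image user interest level (UIL)
# as the normalised total gaze dwell time on each image. This is an
# assumption for illustration, not the feature extraction used in the paper.
from collections import defaultdict

def user_interest_levels(gaze_samples, sample_period_ms=20):
    """gaze_samples: sequence of image ids (or None), one per gaze sample,
    giving the image under the gaze point; None means the gaze fell
    outside every image. Returns {image_id: UIL}, with UILs summing to 1."""
    dwell_ms = defaultdict(float)
    for image_id in gaze_samples:
        if image_id is not None:
            dwell_ms[image_id] += sample_period_ms
    total = sum(dwell_ms.values())
    if total == 0:
        return {}
    # Normalise so interest levels are comparable across viewing sessions.
    return {img: t / total for img, t in dwell_ms.items()}

# Example: image "a" and "c" each held the gaze for 3 samples, "b" for 1.
samples = ["a", "a", "b", None, "a", "c", "c", "c"]
print(user_interest_levels(samples))
```

In practice the paper extracts richer trajectory features than raw dwell time, and thresholding such scores per user would be how an adapted interface could separate TC images from the rest.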
| Original language | English |
| --- | --- |
| Title of host publication | MM'11 - Proceedings of the 2011 ACM Multimedia Conference and Co-Located Workshops - ACM Workshop on Social Behavioural Networked Media Access 2011, SBNMA'11 |
| Pages | 27-32 |
| Number of pages | 6 |
| Publication status | Published - 2011 |
| Event | MM '11 ACM Multimedia Conference - Scottsdale, AZ, United States. Duration: 28 Nov 2011 → 1 Dec 2011 |
Conference
| Conference | MM '11 ACM Multimedia Conference |
| --- | --- |
| Country/Territory | United States |
| City | Scottsdale, AZ |
| Period | 28/11/11 → 1/12/11 |