I want to see jackets that are stylish, but not too fancy. Say, 70% stylish. This project aims to develop new techniques that can significantly improve the data browsing experience in online shopping, dating, media recommendations, and many other applications. Two very common ways to explore large collections of imagery items, for instance in online shopping, are to browse a hierarchy of items and to search with textual keywords. The returned results are browsed in lists, typically ordered by popularity. However, popularity is defined across all users as one homogeneous group, and users cannot sort by their own subjective criteria, e.g., by their own personal `style' for clothes; what is `stylish' to one person will be passé to another. Furthermore, there is no way to place items on a continuous scale on which the degree to which each item satisfies the criterion is known, e.g., how stylish a particular piece of clothing is to a user. Our goal is to develop new techniques that enable users to organize and explore imagery data based on their own subjective criteria at a high semantic level. This is a challenging problem: many criteria are hard to quantify, and a user may not even be able to articulate them. We address this challenge by observing that even though users may not be able to specify their criteria quantitatively, or even fully describe them, they can still communicate their own notions by providing examples: this shoe is cooler than that one. Our goal is to build an algorithm that arranges a large corpus of visual data according to these examples. Once built, the arranged data can be browsed with an interface that exploits the learned criteria to navigate the continuous scale.
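Pairwise examples of the kind described above ("this shoe is cooler than that one") can be converted into a continuous scale with standard learning-to-rank machinery. The sketch below is a minimal illustration under assumed conditions, not the project's actual algorithm: it assumes items come with fixed feature vectors and fits a linear scoring function to hypothetical pairwise preferences using a logistic pairwise loss.

```python
import math

def train_pairwise_ranker(pairs, features, dim, lr=0.1, epochs=200):
    """Learn a linear scoring function w from pairwise preferences.

    pairs: list of (a, b) item ids meaning "the user prefers a over b".
    features: dict mapping item id -> feature vector (list of floats).
    Minimizes the logistic pairwise loss -log(sigmoid(w.x_a - w.x_b))
    by plain gradient descent over the preference pairs.
    """
    w = [0.0] * dim
    for _ in range(epochs):
        for a, b in pairs:
            xa, xb = features[a], features[b]
            diff = [xa[i] - xb[i] for i in range(dim)]
            margin = sum(w[i] * diff[i] for i in range(dim))
            # Gradient of -log(sigmoid(margin)) w.r.t. w is
            # -(1 - sigmoid(margin)) * diff, so step in the +diff direction.
            g = 1.0 - 1.0 / (1.0 + math.exp(-margin))
            for i in range(dim):
                w[i] += lr * g * diff[i]
    return w

def score(w, x):
    """Scalar position of an item on the learned continuous scale."""
    return sum(wi * xi for wi, xi in zip(w, x))
```

Given preferences consistent with a hidden criterion (say, stylishness), the learned scores place every item on a continuous scale that an interface can then sort or slide along.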
The key contributions of the proposed research will include 1) an exploration of different modes of user interaction and of how the knowledge each elicits can be exploited, and 2) a new algorithm that overcomes the limitations of existing approaches to learn effectively and efficiently from user-provided examples, thereby making personalized data exploration practical.
Effective start/end date: 1/09/16 → 31/05/17