
Adaptive User Interfaces Based on Visualization, Analysis and Prediction of User's Interactions and Behaviors - Abstract

Traditionally, Visual Analytics (VA) encompasses interactive data visualisations and analysis algorithms to uncover actionable insights in large, complex datasets. VA builds on multiple disciplines, combining analytical reasoning, human cognition, decision making, visual representations, and machine learning, among others. Nevertheless, essential aspects are often left out when designing VA systems, such as providing adequate continuous support to users by inferring their current interests and behaviours. VA can benefit from inferring users' interests through the capture of behavioural data with sensors, e.g., an eye-tracking system. This benefit is especially pronounced in safety-critical applications, e.g., air-traffic control, or in visual search tasks, e.g., the comparison of data patterns. Such applications and tasks demand high attention that changes over time due to effort, tiredness, boredom, or task difficulty. Analytics methods that rely solely on mouse movements or user selections are likely insufficient to infer users' real intents. Currently, eye-tracking in VA mainly focuses on the evaluation of user interfaces (UIs) or the creation of offline models from pre-processed eye-tracking data. There is a gap concerning the combination of sensed information, data features, and interactions into a comprehensive online inference model of users' interests.

Our main contributions revolve around essential questions: "What factors enable adaptation and understanding of users' interests and behaviours?"; "What metrics can we use to infer users' interests?"; "How can we model and predict users' interests through sensing and attention modelling, to provide continuous support in VA systems?". Firstly, we develop a novel multi-sensing approach based on complex event processing; it encompasses a prototype, techniques for decision investigation, sensor fusion, and the online analysis of users' behavioural data at their workplace. Secondly, we propose the deployment of continuous eye-tracking-based support methods and algorithms in several demonstrative VA prototypes, i.e., UI control, provenance, guidance, recommendation, and adaptation, to assist users when exploring large datasets and performing attention-demanding visual tasks. Thirdly, we propose an extension to a well-known VA model and a comprehensive overview of the foundations, applications, challenges, and opportunities posed by this new idea of eye-tracking-based support in VA systems. We conduct user experiments to qualitatively and quantitatively evaluate our prototypes. These experiments contribute to the development of a new typology for eye-gaze-based support in VA systems.

This research advances the state of the art in eye-tracking-based support UIs. It has an immediate impact on the design of novel VA systems, including the re-structuring of UI components and features, the reverse-engineering of analytical processes when exploring large datasets, and the continuous modelling of interests. These approaches can be extended and applied to additional scenarios and domains.
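As a rough illustration of the kind of online interest inference the multi-sensing approach targets, the sketch below fuses eye-tracking fixation events and mouse events into a decaying per-item interest score that an adaptive UI could act on. This is a hypothetical Python example under stated assumptions (the event types, class names, decay rate, and weights are illustrative, not the thesis's actual implementation or its complex-event-processing engine).

```python
# Hypothetical sketch: fusing gaze and mouse events into an online,
# per-item interest score. All names and weights are assumptions.
from dataclasses import dataclass
from collections import defaultdict
import math, time


@dataclass
class GazeEvent:
    item_id: str        # data item / UI component under the fixation
    duration_ms: float  # fixation duration reported by the eye tracker
    timestamp: float


@dataclass
class MouseEvent:
    item_id: str        # data item / UI component under the cursor
    kind: str           # e.g. "hover", "click", "select"
    timestamp: float


class InterestModel:
    """Keeps an exponentially decaying interest score per data item."""

    def __init__(self, half_life_s: float = 30.0):
        self.scores = defaultdict(float)
        self.last_update = defaultdict(float)
        self.decay = math.log(2) / half_life_s

    def _decay_to(self, item_id: str, now: float) -> None:
        # Let old evidence fade before adding new evidence.
        dt = now - self.last_update[item_id]
        if dt > 0:
            self.scores[item_id] *= math.exp(-self.decay * dt)
        self.last_update[item_id] = now

    def on_gaze(self, ev: GazeEvent) -> None:
        # Longer fixations contribute more evidence of interest.
        self._decay_to(ev.item_id, ev.timestamp)
        self.scores[ev.item_id] += ev.duration_ms / 1000.0

    def on_mouse(self, ev: MouseEvent) -> None:
        # Explicit interactions are weighted more strongly than hovers.
        weight = {"hover": 0.5, "click": 2.0, "select": 3.0}.get(ev.kind, 1.0)
        self._decay_to(ev.item_id, ev.timestamp)
        self.scores[ev.item_id] += weight

    def top_items(self, k: int = 3):
        return sorted(self.scores.items(), key=lambda kv: kv[1], reverse=True)[:k]


if __name__ == "__main__":
    model = InterestModel()
    now = time.time()
    model.on_gaze(GazeEvent("scatterplot_cluster_A", 800, now))
    model.on_mouse(MouseEvent("scatterplot_cluster_A", "hover", now + 1))
    model.on_gaze(GazeEvent("table_row_42", 250, now + 2))
    # Items the adaptive UI might highlight, recommend, or re-arrange around.
    print(model.top_items())
```

In a full system, such scores would feed the support mechanisms named above (guidance, recommendation, UI adaptation) rather than being printed directly.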