Active Visual Analytics: Assisted Data Discovery in Interactive Visualizations via Active Search

Published as an arXiv preprint, 2020

Recommended citation: Shayan Monadjemi, Quan Nguyen, Henry Chai, Roman Garnett, and Alvitta Ottley. Active Visual Analytics: Assisted Data Discovery in Interactive Visualizations via Active Search. arXiv preprint arXiv:2010.08155, 2020.

Data foraging is a process commonly arising in interactive data analysis in which a user sifts through a large amount of potentially irrelevant information in search of data relevant to their task. In machine learning, the related task of active search considers the automated discovery of rare, valuable items in large databases and has delivered massive speedups in areas including drug and materials discovery. However, active search has yet to be integrated with an interactive interface that can leverage an analyst’s domain knowledge during the search. We introduce and evaluate a technique we call Active Visual Analytics (ActiveVA), an augmentation of interactive visualization with active search to accelerate data foraging. In this approach, underlying machine learning models automatically learn a user’s latent interest by observing their interactions; these models then inform an active search algorithm that guides the user toward the points judged most promising for exploration. Using the epidemic dataset from the VAST Challenge 2011, we design and conduct a crowdsourced user study to evaluate several aspects of this technique. We present evidence that a human–computer partnership based on ActiveVA yields higher throughput and more meaningful interactions during interactive visual exploration and discovery without harming the user experience.
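To make the core idea concrete, the following is a minimal, hypothetical sketch of one-step greedy active search on a toy 1-D dataset. It is not the paper's implementation: the distance-weighted k-NN relevance model, the seed labels standing in for initial user interactions, and the dataset itself are all illustrative assumptions.

```python
import numpy as np

# Toy 1-D dataset: the rare "relevant" items the analyst is foraging for
# cluster around x = 0.8 (a hypothetical stand-in for real data).
X = np.linspace(0.0, 1.0, 200)
y = (np.abs(X - 0.8) < 0.05).astype(int)          # 20 relevant points

def relevance_prob(x, labeled_x, labeled_y, k=3):
    """Distance-weighted k-NN estimate of P(relevant) for candidate x,
    based on the points already inspected (labeled) by the user."""
    d = np.abs(labeled_x - x)
    nearest = np.argsort(d)[:k]
    w = 1.0 / (d[nearest] + 1e-6)                 # closer labels count more
    return float(np.sum(w * labeled_y[nearest]) / np.sum(w))

def greedy_active_search(X, y, seed, budget):
    """One-step greedy active search: repeatedly query the unlabeled point
    with the highest estimated probability of being relevant."""
    labeled = list(seed)
    for _ in range(budget):
        candidates = [i for i in range(len(X)) if i not in labeled]
        probs = [relevance_prob(X[i], X[labeled], y[labeled], k=3)
                 for i in candidates]
        labeled.append(candidates[int(np.argmax(probs))])
    return labeled

# Seed labels simulate the user's first few interactions, including one
# relevant discovery near x = 0.8 (index 155).
seed = [0, 50, 100, 155]
queried = greedy_active_search(X, y, seed, budget=30)
found = int(y[queried].sum())                     # relevant items surfaced
```

Because the relevance estimate is highest next to already-discovered relevant points, the greedy policy exploits the cluster around x = 0.8 and surfaces far more relevant items within its budget than uniform random exploration would; in ActiveVA, the seed labels would instead come from the user's live interactions with the visualization.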

Download paper here