-
Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems • 2025 • Conference Paper
Datamancer: Bimanual Gesture Interaction in Multi-Display Ubiquitous Analytics Environments
We introduce Datamancer, a wearable device enabling bimanual gesture interaction across multi-display ubiquitous analytics environments. Datamancer addresses the gap in gesture-based interaction within data visualization settings, where current methods are often constrained by limited interaction spaces or the need to install bulky tracking setups. Datamancer integrates a finger-mounted pinhole camera and a chest-mounted gesture sensor, allowing seamless selection and manipulation of visualizations on distributed displays. By pointing at a display, users acquire it and can then engage in various interactions, such as panning, zooming, and selection, using both hands. Our contributions include (1) an investigation of the design space of gestural interaction for physical ubiquitous analytics environments; (2) a prototype implementation of the Datamancer system that realizes this design space; and (3) an evaluation of the prototype through demonstration of application scenarios, an expert review, and a user study.
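To make the point-to-acquire interaction model concrete, the following minimal Python sketch (ours, not from the paper) shows one way the two sensor streams could be combined: the finger-mounted camera acquires a display by detecting its marker, after which bimanual gestures from the chest-mounted sensor are dispatched to that display. The `Datamancer` class, gesture vocabulary, and fiducial IDs are all hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Display:
    """One display in the ubiquitous analytics environment."""
    name: str
    fiducial_id: int  # hypothetical: marker the finger camera detects

class Datamancer:
    """Hypothetical sketch of the point-to-acquire interaction loop."""

    def __init__(self, displays: list[Display]):
        self._by_fiducial = {d.fiducial_id: d for d in displays}
        self.acquired: Optional[Display] = None

    def on_pointing(self, fiducial_id: int) -> None:
        """Finger-mounted camera saw a display's marker: acquire it."""
        self.acquired = self._by_fiducial.get(fiducial_id)

    def on_gesture(self, left: str, right: str) -> None:
        """Chest-mounted sensor reports a bimanual gesture pair."""
        if self.acquired is None:
            return  # gestures only apply once a display is acquired
        if left == "pinch" and right == "pinch":
            print(f"zoom on {self.acquired.name}")
        elif left == "open" and right == "drag":
            print(f"pan on {self.acquired.name}")
        elif right == "point":
            print(f"select on {self.acquired.name}")

# Usage: point at the wall display, then zoom with both hands.
dm = Datamancer([Display("wall", 7), Display("tabletop", 3)])
dm.on_pointing(7)
dm.on_gesture("pinch", "pinch")  # -> zoom on wall
```

The design choice illustrated here is that gestures are ignored until a display has been acquired, which is what allows one gesture vocabulary to be reused across every display in the environment.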
-
Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems • 2024 • Conference Paper
VisTorch: Interacting with Situated Visualizations using Handheld Projectors
Spatial data is best analyzed in situ, but existing mixed reality technologies can be bulky, expensive, or unsuitable for collaboration. We present VisTorch: a handheld device for projected situated analytics consisting of a pico-projector, a multi-spectrum camera, and a touch surface. VisTorch enables viewing charts situated in physical space by simply pointing the device at a surface to reveal visualizations in that location. We evaluated the approach using both a user study and an expert review. In the former, we asked 20 participants to first organize charts in space and then refer to these charts to answer questions. We observed three spatial and one temporal pattern in participant analyses. In the latter, four experts---a museum designer, a statistical software developer, a theater stage designer, and an environmental educator---utilized VisTorch to derive practical usage scenarios. Results from our study showcase the utility of situated visualizations for memory and recall.
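As an illustration of the situated-analytics loop described above, here is a minimal sketch (ours, assuming a hypothetical `SituatedCharts` class and marker-based surface recognition) of how charts might be pinned to physical surfaces and then revealed whenever the device points back at them.

```python
class SituatedCharts:
    """Hypothetical sketch: charts pinned to physical surfaces,
    revealed when the handheld projector points at that surface."""

    def __init__(self):
        self._anchors: dict[int, str] = {}  # surface marker id -> chart

    def pin(self, surface_id: int, chart: str) -> None:
        """Touch interaction: place a chart on the current surface."""
        self._anchors[surface_id] = chart

    def on_point(self, surface_id: int) -> None:
        """Camera recognized a surface: project whatever lives there."""
        chart = self._anchors.get(surface_id)
        if chart:
            print(f"projecting '{chart}' onto surface {surface_id}")
        else:
            print(f"surface {surface_id} is empty")

# Usage: organize a chart in space, then return to it later.
torch = SituatedCharts()
torch.pin(12, "monthly rainfall bar chart")
torch.on_point(12)  # -> projecting 'monthly rainfall bar chart' ...
torch.on_point(5)   # -> surface 5 is empty
```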
-
Sensemaking Sans Power: Interactive Data Visualization Using Color-Changing Ink
We present an approach for interactively visualizing data using color-changing inks without the need for electronic displays or computers. Color-changing inks are a family of physical inks that change their color characteristics in response to an external stimulus such as heat, UV light, water, or pressure. Visualizations created using color-changing inks can embed interactivity in printed material without external computational media. In this paper, we survey current color-changing ink technology and then use these findings to derive a framework for how it can be used to construct interactive data representations. We also enumerate the interaction techniques possible using this technology. We then show some examples of how to use color-changing ink to create interactive visualizations on paper. While obviously limited in scope to situations where no power or computing is present, or as a complement to digital displays, our findings can be employed for paper-based media, data physicalization, and embedded visualizations.
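To suggest how such a framework might be modeled computationally, the sketch below (ours, and deliberately simplified) treats each printed ink region as a two-state cell that switches color while its triggering stimulus is present; real inks have richer, often continuous and time-dependent behavior. All names are hypothetical.

```python
from enum import Enum, auto

class Stimulus(Enum):
    HEAT = auto()      # e.g. body heat on thermochromic ink
    UV = auto()        # sunlight on photochromic ink
    WATER = auto()     # hydrochromic ink
    PRESSURE = auto()  # mechanochromic ink

class InkCell:
    """One printed region of color-changing ink, modeled (simplistically)
    as two states: base color, or activated color while its triggering
    stimulus is present."""

    def __init__(self, base: str, activated: str, trigger: Stimulus):
        self.base, self.activated, self.trigger = base, activated, trigger
        self.active = False

    def apply(self, stimulus: Stimulus, present: bool) -> str:
        """Update the cell for a stimulus event and return its color."""
        if stimulus is self.trigger:
            self.active = present  # most such inks revert when removed
        return self.activated if self.active else self.base

# A printed bar that reveals an annotation underneath while touched.
bar = InkCell(base="black", activated="transparent", trigger=Stimulus.HEAT)
print(bar.apply(Stimulus.HEAT, present=True))   # -> transparent
print(bar.apply(Stimulus.HEAT, present=False))  # -> black
```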
-
Towards Understanding Sensory Substitution for Accessible Visualization: An Interview Study
Biswaksen Patnaik
For all its potential in supporting data analysis, particularly in exploratory situations, visualization also creates accessibility barriers for blind and visually impaired individuals. Regardless of how effective a visualization is, providing equal access for blind users requires a paradigm shift in the visualization research community. To enact such a shift, it is not sufficient to treat visualization accessibility as merely another technical problem to overcome. Instead, supporting the millions of blind and visually impaired users around the world, whose needs for data analysis are just as valid as those of sighted individuals, requires a respectful, equitable, and holistic approach that includes all users from the onset. In this paper, we draw on accessibility research methodologies to make inroads towards such an approach. We first identify the people who have specific insight into how blind people perceive the world: orientation and mobility (O&M) experts, instructors who teach blind individuals how to navigate the physical world using non-visual senses. We interview 10 O&M experts---all of them blind---to understand how sensory substitution beyond the visual sense can best convey spatial layouts, and we analyze our qualitative findings using thematic analysis. While blind people in general tend to use both sound and touch to understand their surroundings, we focus on auditory affordances and how they can be used to make data visualizations accessible through sonification and auralization. However, our experts recommended supporting a combination of senses---sound and touch---to make charts accessible, as blind individuals may be more familiar with exploring tactile charts. We report results on both sound and touch affordances, and conclude by discussing implications for accessible visualization for blind individuals.
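Since the abstract singles out sonification as one auditory affordance, here is a tiny illustrative sketch (ours, not from the study) of the most common sonification mapping: data values scaled linearly onto pitch, so that a rising series is heard as a rising melody.

```python
def sonify(values, f_min=220.0, f_max=880.0):
    """Map data values linearly onto pitch, one tone per value
    (higher value -> higher frequency); range is an assumption."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero on flat data
    return [f_min + (v - lo) / span * (f_max - f_min) for v in values]

# A rising series becomes a rising sequence of frequencies in Hz.
print([round(f) for f in sonify([2, 4, 8, 16])])  # [220, 314, 503, 880]
```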
-
Scents and Sensibility: Evaluating Information Olfactation
Olfaction---the sense of smell---is one of the least explored of the human senses for conveying abstract information. In this paper, we conduct a comprehensive perceptual experiment on information olfactation: the use of olfactory and cross-modal sensory marks and channels to convey data. More specifically, following the example of graphical perception studies, we design an experiment that studies the perceptual accuracy of four cross-modal sensory channels---scent type, scent intensity, airflow, and temperature---for conveying three different types of data---nominal, ordinal, and quantitative. We also present details of a 24-scent multi-sensory display.
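For intuition, the sketch below (ours; the channel names follow the abstract, but the scent palette and value ranges are assumed) shows how a single data value could be encoded on each of the four olfactory channels, in the same spirit as mapping values onto visual channels such as position or color.

```python
# Hypothetical mapping of a data value onto one olfactory channel,
# mirroring how graphical perception studies map values to visual channels.
SCENTS = ["lemon", "peppermint", "lavender", "rose"]  # assumed palette

def encode(channel: str, value) -> dict:
    if channel == "scent_type":       # nominal: categorical scent identity
        return {"scent": SCENTS[value % len(SCENTS)]}
    if channel == "scent_intensity":  # quantitative: concentration 0..1
        return {"intensity": max(0.0, min(1.0, float(value)))}
    if channel == "airflow":          # quantitative: fan speed 0..1
        return {"fan_speed": max(0.0, min(1.0, float(value)))}
    if channel == "temperature":      # ordinal: discrete warmth level
        return {"temp_level": int(value)}
    raise ValueError(f"unknown channel: {channel}")

print(encode("scent_type", 2))         # -> {'scent': 'lavender'}
print(encode("scent_intensity", 0.7))  # -> {'intensity': 0.7}
```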
-
Information Olfactation: Harnessing Scent to Convey Data
Olfactory feedback for analytical tasks is a virtually unexplored area in spite of the advantages it offers for information recall, feature identification, and location detection. Here we introduce the concept of information olfactation as the fragrant sibling of information visualization, and discuss how scent can be used to convey data. Building on a review of the human olfactory system and mirroring common visualization practice, we propose olfactory marks, the substrate in which they exist, and the olfactory channels available to designers. To exemplify this idea, we present viScent: a six-scent stereo olfactory display capable of conveying olfactory glyphs of varying temperature and direction, as well as a corresponding software system that integrates the display with a traditional visualization display. Finally, we present three applications that make use of the viScent system: a 2D graph visualization, a 2D line and point chart, and an immersive analytics graph visualization in 3D virtual reality. We close the paper with a review of possible extensions of viScent and applications of information olfactation in general.
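To ground the marks-and-channels analogy, here is a small illustrative sketch (ours, with an entirely assumed mapping) of an olfactory glyph for a graph node, in which the scent acts as the mark while intensity, temperature, and stereo direction act as channels, echoing the capabilities the abstract describes for the viScent display.

```python
from dataclasses import dataclass

@dataclass
class OlfactoryGlyph:
    """Hypothetical olfactory 'mark': a scent plus the channels that
    modulate it, mirroring marks and channels in visualization."""
    scent: str          # mark identity (which odor)
    intensity: float    # channel: concentration, 0..1
    temperature: float  # channel: warm vs. cool airflow, degrees C
    direction: str      # channel: left or right emitter (stereo)

def glyph_for_node(degree: int, cluster: str, x: float) -> OlfactoryGlyph:
    """Map graph-node attributes onto olfactory channels (assumed mapping)."""
    strength = min(1.0, degree / 10.0)
    return OlfactoryGlyph(
        scent={"A": "citrus", "B": "pine"}.get(cluster, "vanilla"),
        intensity=strength,                 # hub nodes smell stronger
        temperature=25.0 + 5.0 * strength,  # and warmer
        direction="left" if x < 0.5 else "right",
    )

print(glyph_for_node(degree=8, cluster="A", x=0.2))
```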