Alex Kale

PhD candidate in Information Science at the University of Washington, advised by Jessica Hullman.
Currently a visiting scholar at Northwestern University.
Uncertainty visualization, data cognition, HCI.


Midwest Uncertainty Collective
Interactive Data Lab
Vision-Cognition Lab


Master of Science in Information Science, UW
Bachelor of Science in Psychology, UW


My research combines graphical perception experiments and systems design to investigate uncertainty communication. I am interested in how data visualizations are used to communicate probability and uncertainty information to non-expert audiences, especially when people rely on that information to make incentivized decisions. I am also interested in how such visualizations are incorporated into interactive systems for data analysis and decision support. As a case study on uncertainty visualization in analysis systems, I am designing software with Stottler Henke and Associates to help scientists conduct meta-analysis and communicate evidence-based recommendations to decision-making officials and other stakeholders.

Here are some representative publications (see my CV for a full list).

Visual Reasoning Strategies for Effect Size Judgments and Decisions
VIS 2020, InfoVis Best Paper Award 🏆
Alex Kale, Matthew Kay, and Jessica Hullman

We present a mixed design experiment on Mechanical Turk which tests eight uncertainty visualization designs: intervals, hypothetical outcome plots, densities, and quantile dotplots, each with and without means added. Participants estimate the effect size of an intervention and make an incentivized decision whether or not to pay for that intervention. Our results suggest that many users rely on the sub-optimal strategy of judging the distance between distributions while ignoring uncertainty. Visualization designs that support the least biased estimation do not support the best decisions, suggesting that a chart user’s sense of the signal in a chart may vary for different tasks.

Boba: Authoring and Visualizing Multiverse Analyses
VIS 2020
Yang Liu, Alex Kale, Tim Althoff, and Jeffrey Heer

Boba is an integrated domain-specific language (DSL) and visual analysis system for authoring and visually exploring multiverse analyses. The Boba DSL enables analysts to specify decision spaces in data analysis, supporting multiple scripting languages. The Boba Visualizer provides linked views of the decision space and model results to enable rapid, systematic assessment of robustness of results to decision alternatives, sampling uncertainty, and model fit.
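To make the idea of a "decision space" concrete, here is a toy illustration in plain Python (not the actual Boba DSL; the decision names are hypothetical): a multiverse analysis enumerates every combination of defensible analysis choices, with each combination defining one "universe" to run and compare.

```python
# Toy multiverse: enumerate all combinations of analysis decisions.
# The decision names and options below are made up for illustration.
from itertools import product

decisions = {
    "outlier_rule": ["none", "iqr", "zscore"],  # how to handle outliers
    "transform": ["raw", "log"],                # variable transformation
    "model": ["ols", "robust"],                 # regression model choice
}

# Each combination of options is one universe in the decision space.
universes = [
    dict(zip(decisions, combo))
    for combo in product(*decisions.values())
]

print(len(universes))  # 3 * 2 * 2 = 12 universes
for u in universes[:3]:
    print(u)
```

A system like Boba then runs the analysis once per universe and visualizes how the results vary across the decision space.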

Adaptation and Learning Priors in Visual Inference
Position Paper
VIS 2019
Alex Kale and Jessica Hullman

We review the vision science literature on adaption, a set of processes by which the visual system adjusts incoming sensory signals based on previous visual experience. We present an explanation of visual adaptation that is tailored to the visualization community and consider the implications of adaptation when designing for inferences from visualized data.

Decision-Making Under Uncertainty in Research Synthesis: Designing for the Garden of Forking Paths
CHI 2019
Alex Kale, Matthew Kay, and Jessica Hullman

We study decision-making strategies used by researchers conducting systematic review and meta-analysis. Integrating prior work from judgment & decision-making, reproducible statistics, and uncertainty visualization, we point to challenges and opportunities for the design of interactive systems to support research synthesis.

Capture & Analysis of Active Reading Behaviors for Interactive Articles on the Web
EuroVis 2019
Matt Conlen, Alex Kale, and Jeffrey Heer

Interactive articles are increasingly popular online, yet their effectiveness at engaging readers has not been widely studied with real-world audiences. We developed tools for instrumentation and analysis to make it easier for researchers and publishers to understand how readers are reacting to this new media.

Hypothetical Outcome Plots Help Untrained Observers Judge Trends in Ambiguous Data
InfoVis 2018
Alex Kale, Francis Nguyen, Matthew Kay, and Jessica Hullman

We present two experiments evaluating four different uncertainty visualizations: bar graphs with error bars, bar hypothetical outcome plots (HOPs), static line ensembles, and line HOPs. Untrained users are able to distinguish trends at lower signal-to-noise ratios when shown HOPs that express sampling error through animated samples.
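The core idea behind HOPs can be sketched in a few lines of Python. Rather than summarizing a distribution with a mean and error bars, a HOP draws repeated random samples from the estimated sampling distribution and shows them one at a time as animation frames. (The numbers below are illustrative, not taken from the paper.)

```python
# Minimal sketch of hypothetical outcome plots (HOPs): each animation
# frame shows one plausible draw from the sampling distribution,
# rather than a static summary like a mean with error bars.
# Parameters are hypothetical, for illustration only.
import random

random.seed(42)

mean, std_err = 10.0, 2.0   # hypothetical estimate and standard error
n_frames = 20               # number of animation frames to generate

# One sampled outcome per frame; animating these conveys uncertainty
# through the frame-to-frame variability itself.
frames = [random.gauss(mean, std_err) for _ in range(n_frames)]

for i, outcome in enumerate(frames[:5], 1):
    print(f"frame {i}: outcome = {outcome:.2f}")
```

In a real HOP, each sampled value would be rendered as a full chart (e.g., a bar or line) and the frames played back as an animation.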

In Pursuit of Error: A Survey of Uncertainty Visualization Evaluation
InfoVis 2018
Jessica Hullman, Xiaoli Qiao, Michael Correll, Alex Kale, and Matthew Kay

We present a review and analysis of evaluation methods for uncertainty visualizations, resulting in a collection of 372 evaluation paths observed across a sample of 86 publications.

Invited Talks

These are my recent talks, in addition to those accompanying conference papers.

Expect Users to Satisfice: Designing Interfaces for Reasoning with Uncertainty

SDSS 2020 - User Testing Statistical Graphics
Despite the proliferation of data-driven products and media in public life, conventional statistical graphics tend to emphasize point estimates and omit uncertainty information. However, given that research in visualization, psychology, and behavioral economics shows that people often satisfice (i.e., use heuristics that deviate from the optimal strategy) when reasoning with uncertainty, users of data visualizations may not recognize or correctly interpret uncertainty. I argue that we need to understand users' potential reasoning strategies in order to design graphical interfaces that steer users toward more systematic ways of reasoning with uncertainty. In this talk, I present empirical evidence on how users satisfice, both when reading individual charts and when conducting analysis, and I discuss ways we are designing statistical graphics and interfaces for data analysis that anticipate users' tendency to satisfice.


As a PhD student at the University of Washington Information School, I have been a teaching assistant for two classes, Introduction to Data Science and Interactive Data Visualization. I also spent one year leading the Computer Science and Engineering Department's graduate seminar on human-computer interaction.

I helped teach Introduction to Data Science (INFO 180) with Jevin West. The course was an experimental flipped classroom in which students progressed through the course in a gamified web application created by Bob Boiko. A student's objective was to pass a set of 7 challenges by the end of the quarter. Taking these challenges required students to interact with their two TAs. I held ten office hours per week, in addition to class time, providing one-on-one and group instruction on programming in Python and introductory topics in data science and statistics. In course evaluations and in person, students told me how the one-on-one instruction in office hours helped them to grasp concepts they were struggling with, learn from their mistakes, and apply the knowledge and skills we built together in future endeavors. Teaching this course was a transformative experience and left me with questions about how to scale and adapt this unorthodox but rewarding approach to teaching.

I also helped to teach Interactive Data Visualization (INFO 474) with Yea-Seul Kim. The course consisted of a series of lectures, in-class activities, and projects aimed at teaching students the theoretical and empirical foundations of visualization design and prompting students to put this knowledge into practice. My role as a TA was primarily to develop and execute grading schemes that balanced learning objectives with the needs of our students. I also held office hours where I helped students mock up their projects, reason through implementation using pseudocode, and refine their understanding of difficult concepts from lectures and readings.

I've given invited lectures on perception, color vision, and uncertainty visualization in both the UW Computer Science and Engineering department's Data Visualization course (CSE 442) and the Information School's Interactive Data Visualization course (INFO 474).

In academic year 2019-2020, I co-led the UW Computer Science and Engineering department's graduate-level Interactive Systems Seminar (CSE 590H) with Jasper Tran O'Leary under the supervision of James Fogarty. Each week we read and discussed a paper relating to human-computer interaction. Our discussions focused on design principles and requirements of different systems, empirical evaluation, and critique and synthesis of the literature. My role was to organize and facilitate the seminar, as well as to serve as a regular discussion leader.

Prior to graduate school I was also an undergraduate TA for a series of statistics courses in the UW Psychology Department taught by Laura Little.

Get in touch

or follow me online