Interactive Gesture-Based Data Manipulation and Visualization
for Exploratory Learning and Research
Visual exploration and analysis of data is increasingly important for advancement in virtually every area of human endeavor. Whether recorded directly by people or indirectly using machines, data captures our observations and interpretations of the world.
When people interact with data, it is almost always in a visual form like graphics or text. The goal of this project is to vastly expand the usefulness of interactive visualizations by providing a general way to create and edit data inside the visualizations themselves.
The key new idea of the project is that visualization users can perform sequences of gestures with common input devices to express their observations and interpretations directly in visual form. The visualizations not only show data, but also serve as meaningful graphical spaces in which to edit that data.
By extending the data processing workflows and display techniques that are currently used in popular visualization tools and software libraries, we can flexibly and expressively translate the details of interactions into precise data changes with simultaneous visual feedback.
The innovative contributions of the project will include a general method to support interactive data editing in visualizations, a diverse collection of data editing gestures, a set of patterns to guide the process of designing visualization tools with data editing features, a declarative programming language for quickly building those tools, and a variety of built tools that show off real applications of data editing in visualizations.
The project focuses on developing, evaluating, and distributing tools for scholarly research in the digital humanities. It tightly integrates education to bring together students and researchers from computer science, information science, and the humanities, and provide them with concrete opportunities to engage in authentic interdisciplinary collaboration.
Scholarly research and education in the humanities involve open-ended exploration, analysis, and interpretation of complex data sets in diverse areas of study. This makes it an exemplary first target to demonstrate how gesture-based visual editing can be broadly applied to data analysis in virtually every segment of society. The broader impacts of the project will spring from the availability of a new, foundational, general-purpose methodology to support data entry, organization, annotation, and correction.
Ampliation Desktop
Project products will include publications, tutorials, videos, the visualization gesture system as open source software, a compendium of data editing gestures, and a gallery of demonstration visualization tools for public download. Ongoing information about the project and links to the resulting resources will appear on this page as they develop.
InfoVis Interaction Survey
We conducted a comprehensive survey of visualization interactions in the 570 papers published at the IEEE Conference on Information Visualization from 1995 to 2014. Our approach involves accumulating full bibliographic metadata on the papers, scanning each paper for candidate interactions, performing close reading to record and interpret authors’ explicit and implicit descriptions of interactions in their technique/tool/system, and then recording details about each interaction in a database. This process has so far resulted in 486 identified interactions in the visualizations presented in 175 different papers.
Based on this survey, we created InfoVis paper trading cards as a way to ask InfoVis 2015 attendees to help us classify selection interactions on our poster, Examining the Many Faces of Selection:
Surprisingly, we found that very few visualizations in the InfoVis literature include interactions that can readily be thought of as gesture-based data editing interactions. Most interactions are common navigations or variations of basic brushing selection; the latter can itself be thought of as editing a user-specific boolean-valued attribute for each data item. Many of the remaining interactions are form/widget-based data value editors. This result has led us to rethink the role of data editing in information visualization. On the one hand, there are few existing examples from which to determine a taxonomy of editing interactions and from there to define a language for declaring gesture-based interactions in visualization designs. On the other hand, there is enormous opportunity for us to design these interactions with few prior constraints or preconceptions, allowing great flexibility and creativity. Our new approach is to map combinations of geometric characteristics of device operations into constrained data value domains, allowing designers to design the data value outputs of gestures as a function of geometric descriptors.
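To make the idea of mapping geometric descriptors into constrained value domains concrete, here is a minimal sketch. Everything in it is an illustrative assumption rather than project code: the descriptor names, the four-category ordinal domain, and the pixels-per-unit scale are all hypothetical, chosen only to show how a drag's angle and length could drive a categorical and a clamped numeric output.

```python
import math

# Hypothetical example of an ordinal value domain, using screen
# coordinates (y grows downward), so angle 90 points "south".
CATEGORIES = ["east", "south", "west", "north"]

def drag_descriptors(x0, y0, x1, y1):
    """Extract geometric descriptors (angle in degrees, length) from a drag."""
    dx, dy = x1 - x0, y1 - y0
    return {"angle": math.degrees(math.atan2(dy, dx)) % 360,
            "length": math.hypot(dx, dy)}

def to_category(angle, categories=CATEGORIES):
    """Quantize an angle into one of N equally sized ordinal categories."""
    step = 360 / len(categories)
    return categories[int(((angle + step / 2) % 360) // step)]

def to_value(length, lo=0.0, hi=100.0, pixels_per_unit=5.0):
    """Map drag length into a clamped numeric domain [lo, hi]."""
    return min(hi, max(lo, length / pixels_per_unit))
```

A designer would declare only the target domain and the descriptor-to-value functions; the same drag could thus set a compass-like category via its angle or a bounded magnitude via its length.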
Designing Gestures for Data Editing
The project is helping to significantly expand the theory of visualization design. The human-computer interaction loop of data visualization can be usefully reinterpreted as drawing/painting/sculpting that is indirectly backed by data rather than directly backed by configurations of physical media. Because interactions in visualization can follow abstract rules of computation that are less constrained than the concrete rules of physics, visualization interaction need not be limited to direct manipulation of space (navigation) or pointing at objects in it (selection); interaction design can be extended beyond both in a straightforward way.
Gestures are movements in a spatial context. They possess geometry and other characteristics. They can affect the geometry and other characteristics of the spatial context, the objects in it, or both. From this perspective, navigation is a special case of interaction that transforms context geometry. Selection is a special case of interaction that indicates object subsets. Of course, numerous other interactions exist in visualization tools, but all (or the vast majority) happen in a halo of user interface beyond the visualization model. Many of these interactions can be captured under the gesture-based data editing model.
Interactive visual data editing can be modeled on top of existing user input mechanisms based on time sampling of event streams from physical input devices. The process of mapping events into data edits can be decomposed into a small sequence of relatively simple declarations. We have specified a semi-formal model that successfully breaks down event processing for data editing into eight steps: match conditions, accumulate events, aggregate events, match event patterns, extract geometries, aggregate geometries, map features into feedback, and map features into data/parameters.
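The eight-step model can be sketched as a declarative pipeline in which a designer supplies one small function per step. This is a minimal illustration under stated assumptions, not the project's specification language: the event dictionary format, the stage keys, and the example drag-to-edit mapping are all hypothetical.

```python
# Hypothetical sketch of the eight-step event-processing model.
# Stage names follow the text; each stage is a simple declared function.
STAGES = ["match_conditions", "accumulate_events", "aggregate_events",
          "match_patterns", "extract_geometries", "aggregate_geometries",
          "map_feedback", "map_data"]

def run_pipeline(events, spec):
    """Thread an event stream through each declared stage, in order."""
    state = events
    for name in STAGES:
        state = spec[name](state)
    return state

# Example declaration: turn a press-drag-release sequence into a data edit.
drag_edit = {
    "match_conditions": lambda evs: [e for e in evs if e["device"] == "mouse"],
    "accumulate_events": lambda evs: evs,             # keep the full history
    "aggregate_events": lambda evs: evs,              # no thinning needed here
    "match_patterns": lambda evs: evs if (evs and evs[0]["type"] == "down"
                                          and evs[-1]["type"] == "up") else [],
    "extract_geometries": lambda evs: [(e["x"], e["y"]) for e in evs],
    "aggregate_geometries": lambda pts: ({"start": pts[0], "end": pts[-1]}
                                         if pts else {}),
    "map_feedback": lambda g: g,                      # visual feedback omitted
    "map_data": lambda g: ({"dx": g["end"][0] - g["start"][0]} if g else {}),
}

events = [{"device": "mouse", "type": "down", "x": 0,  "y": 0},
          {"device": "mouse", "type": "move", "x": 5,  "y": 2},
          {"device": "mouse", "type": "up",   "x": 12, "y": 3}]
```

Here `run_pipeline(events, drag_edit)` yields `{"dx": 12}`, the horizontal displacement of the drag, which a tool could then apply as an edit to a bound data value. The point of the decomposition is that each stage is individually simple and reusable across gesture designs.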
We have applied the model to specify several example visualization gestures including both well-known cases (e.g., point selection) and previously unknown ones (e.g., half-plane selection):
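As an illustration of one of the gestures named above, half-plane selection can be expressed with very little geometry. This is a hedged sketch, not the project's implementation: it assumes a drag from `(x0, y0)` to `(x1, y1)` defines a directed line, and uses the sign of a 2D cross product to keep the items on one side.

```python
# Hypothetical sketch of half-plane selection: a drag defines a directed
# line; points with a positive cross product lie on one side of it.
def half_plane_select(points, x0, y0, x1, y1):
    """Return the subset of points on one side of the directed drag line."""
    dx, dy = x1 - x0, y1 - y0
    return [(px, py) for (px, py) in points
            if dx * (py - y0) - dy * (px - x0) > 0]
```

Under the model, the drag endpoints come out of the geometry-extraction steps, and this predicate is the mapping from geometric features into a selection, i.e., a boolean-valued attribute per data item.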
Application of the model appears to be highly understandable, expressive, economical, and reusable. We anticipate that this approach to data editing will carry over readily and effectively into visualization design practice. Beyond our own application to the digital humanities in this project, we are increasingly hopeful that the model can open up the interaction design space to study and broad application throughout the visualization community.
Student Opportunities
This five-year project involves collaboration with digital humanities scholars and educators from OU History of Science, OU Library & Information Science, and the Stanford Humanities Center. There is currently one graduate research assistant working with me on this project. I will be looking for one additional PhD-level GRA for this project to start in August 2017. In the future there may be additional PhD-level GRA opportunities to work on data collection, software implementation, visualization development, and technical support for project-related research activities of our collaborators, or to provide assistance in visualization design, usability evaluation, educational outreach, and technical support for project-related education activities of our collaborators. Potential future PhD students are encouraged to inquire about prospects for these positions. Inquiries from PhD students outside of CS are welcome.
I often advise graduate and undergraduate students in the areas of information visualization, visual analytics, databases, human-computer interaction, and the digital humanities. There are many possibilities for Master's theses, independent study, and undergraduate honors research within the scope of this project.
OU students in ANY field interested in participating in this project in a less formal way, such as beta testing of the software as it develops, are welcome to contact me. Although the project itself targets humanities learning and research, the software and other products of the project will be general-purpose.
2014.08.27 - The Oklahoma Daily (OU student newspaper; article):
“Data manipulation may soon be easy”
2014.08.12 - The Norman Transcript (press release):
“OU professor awarded National Science Foundation CAREER grant”
Supported Publications and Presentations