Clench Interaction: Biting As Input

Figure: A diagram of a human face showing where the clench sensor is placed between the teeth, the setup for correctly sensing clench force, and the hardware platform used.
Xuhai Xu, Chun Yu, Anind K. Dey, Jennifer Mankoff
Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19)
People eat every day, and biting is one of the most fundamental and natural actions they perform. Existing work has explored tooth click location and jaw movement as input techniques; however, clenching has the potential to add further control to this input channel. We propose clench interaction, which leverages clenching as an actively controlled physiological signal that can facilitate interactions. We conducted a user study to investigate users’ ability to control their clench force. We found that users can easily discriminate three force levels and that they can quickly confirm actions by unclenching (quick release). We developed a design space for clench interaction based on these results and investigated the usability of the clench interface. Participants preferred clench interaction over the baselines and indicated a willingness to use clench-based interactions. This novel technique can provide an additional input method when users’ eyes or hands are busy, augment immersive experiences such as virtual/augmented reality, and assist individuals with disabilities.
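To make the interaction concrete, here is a minimal sketch of how a clench-force stream could be discretized into three levels and how a quick release could be detected as a confirmation gesture. The thresholds, sampling rate, and function names are illustrative assumptions, not the parameters used in the study.

```python
# Illustrative sketch only: thresholds, timing, and names are assumptions,
# not the parameters from the CHI '19 study.

LEVEL_THRESHOLDS = [0.2, 0.5, 0.8]   # normalized force cut-offs for levels 1-3
QUICK_RELEASE_WINDOW_S = 0.3         # how quickly force must return to rest

def force_level(force: float) -> int:
    """Map a normalized clench-force reading (0..1) to a discrete level (0 = rest)."""
    return sum(force >= t for t in LEVEL_THRESHOLDS)

def detect_quick_release(samples, sample_rate_hz: float = 50.0) -> bool:
    """Return True if the user drops back to rest within the window after
    holding a non-rest level -- the 'confirm by unclenching' gesture."""
    window = max(1, int(QUICK_RELEASE_WINDOW_S * sample_rate_hz))
    levels = [force_level(f) for f in samples]
    for i, level in enumerate(levels):
        if level > 0 and 0 in levels[i + 1 : i + 1 + window]:
            return True
    return False

# Example: hold roughly level 2, then release quickly -> confirmed.
trace = [0.0, 0.55, 0.60, 0.58, 0.05, 0.0]
print(force_level(0.60))             # 2
print(detect_quick_release(trace))   # True
```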

Automatically Tracking and Executing Green Actions

We believe that self-reporting is a limiting factor in the original vision of StepGreen.org, and this component of our research has begun to explore alternatives. For example, we showed that financial data can be used to extract footprint information [1], and in collaboration with researchers at Intel and the University of Washington, we used a mobile device to track and visualize green transportation behavior in the UbiGreen project (published at CHI 2009 [2]). We have also worked on algorithms that predict the indoor location and home arrival time of residential building occupants so as to automatically minimize thermostat use [3, 4]. Finally, we moved beyond individual behavioral remedies to structural remedies by exploring tools that could help tenants pick greener apartments [5].
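As a rough illustration of the occupancy-driven thermostat idea behind [3, 4], the sketch below keeps the home set back until just enough lead time remains to reach a comfortable temperature by a predicted arrival time. The constants, and the assumption that an arrival-time prediction is already available, are illustrative; they are not values or code from those papers.

```python
# Minimal sketch of occupancy-driven thermostat setback, assuming a predicted
# home-arrival time is already available (e.g. from an occupancy model).
# All constants are illustrative, not values from the cited papers.

SETBACK_TEMP_C = 16.0            # hold this while the home is predicted to be empty
COMFORT_TEMP_C = 21.0            # target once occupants are expected home
HEATING_RATE_C_PER_HOUR = 2.0    # assumed warm-up rate of the home

def setpoint(current_temp_c: float, hours_until_predicted_arrival: float) -> float:
    """Stay set back until just enough lead time remains to reach comfort
    by the predicted arrival, then start heating."""
    lead_time_needed = (COMFORT_TEMP_C - current_temp_c) / HEATING_RATE_C_PER_HOUR
    if hours_until_predicted_arrival <= max(lead_time_needed, 0.0):
        return COMFORT_TEMP_C
    return SETBACK_TEMP_C

print(setpoint(17.0, 5.0))   # arrival far away -> 16.0 (stay set back)
print(setpoint(17.0, 1.5))   # arrival soon     -> 21.0 (start heating now)
```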

[1] J. Schwartz, J. Mankoff, and H. Scott Matthews. Reflections of Everyday Activity in Spending Data. In Proceedings of CHI 2009 (Note).

[2] J. Froehlich, T. Dillahunt, P. Klasnja, J. Mankoff, S. Consolvo, B. Harrison, and J. A. Landay. UbiGreen: Investigating a Mobile Tool for Tracking and Supporting Green Transportation Habits. In Proceedings of CHI 2009 (Full paper).

[3] C. Koehler, N. Banovic, I. Oakley, J. Mankoff, and A. K. Dey. Indoor-ALPS: An Adaptive Indoor Location Prediction System. In Proceedings of UbiComp 2014.

[4] C. Koehler, B. D. Ziebart, J. Mankoff, and A. K. Dey. TherML: Occupancy Prediction for Thermostat Control. In Proceedings of UbiComp 2013.

[5] J. Mankoff, D. Onafuwa, K. Early, N. Vyas, and V. Kamath Cannanure. Understanding the Needs of Prospective Tenants. In Proceedings of COMPASS 2018.

Orson (Xuhai) Xu (PhD, co-advised with Anind Dey)

Orson is a Ph.D. student working with Jennifer Mankoff and Anind K. Dey in the Information School at the University of Washington, Seattle. Prior to joining UW, he obtained his Bachelor’s degrees in Industrial Engineering (major) and Computer Science (minor) from Tsinghua University in 2018. While at Tsinghua, he received a Best Paper Honorable Mention Award (CHI 2018), the Person of the Year Award, and Outstanding Undergraduate Awards. His research focuses on two areas at the intersection of human-computer interaction, ubiquitous computing, and machine learning: 1) modeling human behavior, such as routine behavior, and 2) novel interaction techniques.

Visit Orson’s homepage at orsonxu.com.

Some recent projects

Interactiles

The absence of tactile cues such as keys and buttons makes touchscreens difficult to navigate for people with visual impairments. Increasing tactile feedback and tangible interaction on touchscreens can improve their accessibility. However, prior solutions have either required hardware customization or provided limited functionality with static overlays. In addition, the investigation of tactile solutions for large touchscreens may not address the challenges on mobile devices. We therefore present Interactiles, a low-cost, portable, and unpowered system that enhances tactile interaction on Android touchscreen phones. Interactiles consists of 3D-printed hardware interfaces and software that maps interaction with that hardware to manipulation of a mobile app. The system is compatible with the built-in screen reader without requiring modification of existing mobile apps. We describe the design and implementation of Interactiles, and we evaluate its improvement in task performance and the user experience it enables with people who are blind or have low vision.
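The core software idea, mapping touches made through a fixed physical overlay to app-level actions, can be sketched as follows. The region coordinates, control names, and dispatch function are assumptions for illustration; the actual Interactiles system is built from Android floating windows that work with the built-in screen reader rather than this simplified mapping.

```python
# Illustrative sketch of mapping touches under a 3D-printed overlay to app
# actions. Coordinates, names, and actions are assumptions, not the actual
# Interactiles implementation (which uses Android floating windows).

from dataclasses import dataclass
from typing import Optional

@dataclass
class Region:
    name: str      # which printed control sits here, e.g. "numpad_1"
    x: int
    y: int
    width: int
    height: int
    action: str    # semantic action to forward to the running app

    def contains(self, tx: int, ty: int) -> bool:
        return (self.x <= tx < self.x + self.width and
                self.y <= ty < self.y + self.height)

# Assumed layout: part of a number pad and a scrollbar overlaid on the screen.
REGIONS = [
    Region("numpad_1", 0, 1200, 120, 120, "type_digit_1"),
    Region("numpad_2", 120, 1200, 120, 120, "type_digit_2"),
    Region("scrollbar", 1000, 300, 80, 900, "scroll"),
]

def handle_touch(tx: int, ty: int) -> Optional[str]:
    """Translate a raw touch made through the physical overlay into an
    app-level action; return None to let the touch pass through unchanged."""
    for region in REGIONS:
        if region.contains(tx, ty):
            return region.action
    return None

print(handle_touch(60, 1250))    # -> "type_digit_1"
print(handle_touch(500, 500))    # -> None (no printed control there)
```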

Xiaoyi Zhang, Tracy Tran, Yuqian Sun, Ian Culhane, Shobhit Jain, James Fogarty, Jennifer Mankoff: Interactiles: 3D Printed Tactile Interfaces to Enhance Mobile Touchscreen Accessibility. ASSETS 2018: To appear.

Figure 2. Floating windows created for number pad (left), scrollbar (right) and control button (right bottom). The windows can be transparent; we use colors for demonstration.
Figure 4. Average task completion times of all tasks in the study.

Exiting the cleanroom: On ecological validity and ubiquitous computing

Carter, Scott, Jennifer Mankoff, Scott R. Klemmer, and Tara Matthews. “Exiting the cleanroom: On ecological validity and ubiquitous computing.” Human–Computer Interaction 23, no. 1 (2008): 47-99.

Over the past decade and a half, corporations and academia have invested considerable time and money in the realization of ubiquitous computing. Yet design approaches that yield ecologically valid understandings of ubiquitous computing systems, which can help designers make design decisions based on how systems perform in the context of actual experience, remain rare. The central question underlying this article is: what barriers stand in the way of real-world, ecologically valid design for ubicomp?

Using a literature survey and interviews with 28 developers, we illustrate how issues of sensing and scale cause ubicomp systems to resist iteration, prototype creation, and ecologically valid evaluation. In particular, we found that developers have difficulty creating prototypes that are both robust enough for realistic use and able to handle ambiguity and error, and that they struggle to gather useful data from evaluations because critical events occur infrequently, because the level of use necessary to evaluate the system is difficult to maintain, or because the evaluation itself interferes with use of the system. We outline pitfalls for developers to avoid as well as practical solutions, and we draw on our results to outline research challenges for the future. Crucially, we do not argue for particular processes, sets of metrics, or intended outcomes; rather, we focus on prototyping tools and evaluation methods that support realistic use in realistic settings and that can be selected according to the needs and goals of a particular developer or researcher.