Understanding gender equity in author order assignment

Academic success and promotion are heavily influenced by publication record. In many fields, including computer science, multi-author papers are the norm. Evidence from other fields shows that norms for ordering author names can influence the assignment of credit. We interviewed 38 students and faculty in human-computer interaction (HCI) and machine learning (ML) at two institutions to identify factors related to the assignment of author order in collaborative publication in computer science. We found that women were concerned with author order earlier in the process:

Our female interviewees reported raising author order in discussion earlier in the process than men did.

Interview outcomes informed metrics for our bibliometric analysis of gender and collaboration in papers published between 1996 and 2016 in three top HCI and ML conferences. We found expected results overall — being the most junior author increased the likelihood of first authorship, while being the most senior author increased the likelihood of last authorship. However, these effects disappeared or even reversed for women authors:

Comparison of regression weights for author rank (blue) with author rank crossed with gender (orange). The regression predicted author position (first, middle, or last).
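As an illustration of the kind of model described above (not the paper's actual analysis pipeline, and on synthetic rather than real bibliometric data), a multinomial logistic regression predicting author position from author rank and a rank × gender interaction might be sketched as follows; all variable names and codings here are hypothetical:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 300
rank = rng.uniform(0, 1, n)               # 0 = most junior author, 1 = most senior (hypothetical scale)
gender = rng.integers(0, 2, n).astype(float)  # 0 = man, 1 = woman (hypothetical coding)
X = np.column_stack([rank, gender, rank * gender])  # main effects plus interaction

# Synthetic outcome: juniors tend toward first authorship, seniors toward last.
logits = np.column_stack([-3 * rank, np.zeros(n), 3 * rank - 3])
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
y = np.array([rng.choice(3, p=p) for p in probs])  # 0 = first, 1 = middle, 2 = last

# Multinomial logistic regression; one coefficient row per author position.
model = LogisticRegression(max_iter=1000).fit(X, y)

# The third column of model.coef_ holds the rank x gender interaction
# weights, the kind of quantity compared against the plain rank weights.
print(model.coef_)
```

Comparing the first column (rank) against the third (rank × gender) per position row is the shape of the comparison the figure above makes.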

Based on our findings, we make recommendations for assignment of credit in multi-author papers and interpretation of author order, particularly with respect to how these factors affect women.

3D Printed Wireless Analytics

Wireless Analytics for 3D Printed Objects: Vikram Iyer, Justin Chan, Ian Culhane, Jennifer Mankoff, Shyam Gollakota. UIST, Oct. 2018 [PDF]

We created a wireless physical analytics system that works with commonly available conductive plastic filaments. Our design enables a variety of data capture and wireless physical analytics capabilities for 3D printed objects, without the need for electronics.

We make three key contributions:

(1) We demonstrate room scale backscatter communication and sensing using conductive plastic filaments.

(2) We introduce the first backscatter designs that detect a variety of bi-directional motions and support linear and rotational movements. An example is shown below.

(3) As shown in the image below, we enable data capture and storage for later retrieval when outside the range of the wireless coverage, using a ratchet and gear system.

We validate our approach by wirelessly detecting the opening and closing of a pill bottle, capturing the joint angles of a 3D printed e-NABLE prosthetic hand, and building an insulin pen that stores information to track its use outside the range of a wireless receiver.

Selected Media

6 of the most amazing things that were 3D-printed in 2018 (Erin Winick, MIT Technology Review, 12/24/2018)

Researchers develop 3D printed objects that can track and store how they are used (Sarah McQuate, UW press release, 10/9/2018)

Assistive Objects Can Track Their Own Use (Elizabeth Montalbano, Design News, 11/14/2018)

People

Students

Vikram Iyer
Justin Chan
Ian Culhane

Faculty

Jennifer Mankoff
Shyam Gollakota

Contact: printedanalytics@cs.washington.edu

Minxuan Gao

Hi, I’m Minxuan Gao and I’m a senior at Tsinghua University majoring in Software Engineering. I am passionate about creating new and innovative ways for people to interact with everyday objects by seeing, touching, and listening, using data-driven methods. My research focus lies in Human-Computer Interaction, and I am currently working on the SPRITEs project.

Yasaman Sefigar


I am a PhD student at the University of Washington’s Paul G. Allen School of Computer Science and Engineering. My current research is focused on human behavior modeling. More specifically, I model and study routine behaviors and the impact of external events on them in the context of wellbeing and mobility. I am also interested in end-user tools and interfaces to improve data collection, exploration, and analysis processes.

My past research spans from designing interfaces for end-user robot programming, to modeling human-object interactions in realistic videos, to studying affective haptic human-robot interaction for psychological enrichment.

My Google Scholar page is https://goo.gl/D1QbSJ


Venkatesh Potluri

Venkatesh Potluri is a Ph.D. student at the Paul G. Allen School of Computer Science & Engineering at the University of Washington. He is advised by Prof. Jennifer Mankoff and Prof. Jon Froehlich. Venkatesh believes that technology, when designed right, empowers everybody to fulfill their goals and aspirations. His broad research goals are to upgrade accessibility to the ever-changing ways we interact with technology and to improve the independence and quality of life of people with disabilities. These goals stem from his personal experience as a researcher with a visual impairment. His research focus is to enable developers with visual impairments to perform a variety of programming tasks efficiently. Previously, he was a Research Fellow at Microsoft Research India, where his team was responsible for building CodeTalk, an accessibility framework and a plugin for better IDE accessibility. Venkatesh earned a master’s degree in Computer Science at the International Institute of Information Technology Hyderabad, where his research was on audio rendering of mathematical content.

You can find more information about him at https://venkateshpotluri.me

Xin Liu

Xin is a first-year Ph.D. student with Jennifer Mankoff and Shwetak Patel in the Paul G. Allen School of Computer Science & Engineering at the University of Washington – Seattle. Prior to joining UW, he obtained a Bachelor’s degree in computer science from the University of Massachusetts Amherst in 2018. While at UMass Amherst, he received a 21st Century Leaders Award, a Rising Researcher Award, and an Outstanding Undergraduate Achievements Award. He is interested in using wearable sensing, human-computer interaction, and machine learning to advance healthcare.

Website: https://homes.cs.washington.edu/~xliu0/

Orson (Xuhai) Xu (PhD, co-advised with Anind Dey)

Orson is a Ph.D. student working with Jennifer Mankoff and Anind K. Dey in the Information School at the University of Washington – Seattle. Prior to joining UW, he obtained his Bachelor’s degrees in Industrial Engineering (major) and Computer Science (minor) from Tsinghua University in 2018. While at Tsinghua, he received a Best Paper Honorable Mention Award (CHI 2018), a Person of the Year Award, and Outstanding Undergraduate Awards. His research focuses on two aspects of the intersection of human-computer interaction, ubiquitous computing, and machine learning: 1) the modeling of human behavior, such as routine behavior, and 2) novel interaction techniques.

Visit Orson’s homepage at orsonxu.com

Some recent projects

Interactiles

The absence of tactile cues such as keys and buttons makes touchscreens difficult to navigate for people with visual impairments. Increasing tactile feedback and tangible interaction on touchscreens can improve their accessibility. However, prior solutions have either required hardware customization or provided limited functionality with static overlays. In addition, the investigation of tactile solutions for large touchscreens may not address the challenges on mobile devices. We therefore present Interactiles, a low-cost, portable, and unpowered system that enhances tactile interaction on Android touchscreen phones. Interactiles consists of 3D-printed hardware interfaces and software that maps interaction with that hardware to manipulation of a mobile app. The system is compatible with the built-in screen reader without requiring modification of existing mobile apps. We describe the design and implementation of Interactiles, and we evaluate its improvement in task performance and the user experience it enables with people who are blind or have low vision.

Xiaoyi Zhang, Tracy Tran, Yuqian Sun, Ian Culhane, Shobhit Jain, James Fogarty, Jennifer Mankoff: Interactiles: 3D Printed Tactile Interfaces to Enhance Mobile Touchscreen Accessibility. ASSETS 2018: To Appear [PDF]

Figure 2. Floating windows created for the number pad (left), scrollbar (right), and control button (bottom right). The windows can be transparent; we use colors for demonstration.

Figure 4. Average task completion times of all tasks in the study.

EDigs

Jennifer Mankoff, Dimeji Onafuwa, Kirstin Early, Nidhi Vyas, Vikram Kamath:
Understanding the Needs of Prospective Tenants. COMPASS 2018: 36:1-36:10

EDigs is a research project group at Carnegie Mellon University working on sustainability. Our research is focused on helping people find the perfect rental through machine learning and user research.

We sometimes study how our members use EDigs in order to learn how to build software support for successful social communities.

EDigs website: screenshot of edigs.org showing a mobile app, Facebook and Twitter feeds, and information about the project.