KnitPick: Programming and Modifying Complex Knitted Textures for Machine and Hand Knitting

Knitting creates complex, soft objects with unique and controllable texture properties that can be used to create interactive objects. However, little work addresses the challenges of using knitted textures. We present KnitPick: a pipeline for interpreting pre-existing hand-knitting texture patterns into a directed-graph representation of knittable structures (KnitGraphs), which can be output to machine- and hand-knitting instructions. Using KnitPick, we contribute a measured and photographed data set of 300 knitted textures. Based on findings from this data set, we contribute two algorithms for manipulating KnitGraphs. KnitCarving shapes a graph while respecting a texture, and KnitPatching combines graphs with disparate textures while maintaining a consistent shape. Using these algorithms and the textures in our data set, we are able to create three knitting-based interactions: roll, tug, and slide. KnitPick is the first system to bridge the gap between hand- and machine-knitting when creating complex knitted textures.

KnitPick: Programming and Modifying Complex Knitted Textures for Machine and Hand Knitting. Megan Hofmann, Lea Albaugh, Ticha Sethapakdi, Jessica Hodgins, Scott E. Hudson, James McCann, Jennifer Mankoff. UIST 2019. To Appear.

A picture of a KnitSpeak file which is compiled into a KnitGraph (which can be modified using carving and patching) and then compiled to knitout, which can be printed on a knitting machine. Below the graph is a picture of different sorts of lace textures supported by KnitPick.
KnitPick converts KnitSpeak into KnitGraphs which can be carved, patched and output to knitted results
A photograph of the table with our data measurement setup, along with piles of patches that are about to be measured and have recently been measured. One patch is attached to the rods and clips used for stretching.
Data set measurement setup, including camera, scale, and stretching rig
A series of five images, each progressively skinnier than the previous. Each image is a knitted texture with 4 stars on it. They are labeled (a) original swatch (b) 6 columns removed (c) 9 columns removed (d) 12 columns removed (e) 15 columns removed
The above images show a progression from the original Star texture to the same texture with 15 columns removed by texture carving. These photographs were shown to crowd-workers who rated their similarity. Even with a whole repetition width removed from the Stars, the pattern remains a recognizable star pattern.
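The carving idea above can be sketched in code. The following is a minimal, hypothetical illustration — not the paper's actual algorithm — in which a texture is a grid of stitch symbols ("k" for knit, "p" for purl) and we repeatedly delete the column whose removal disturbs the fewest neighboring-stitch relationships, in the spirit of seam carving. All function names and the cost heuristic are our own invention for illustration.

```python
def column_cost(grid, c):
    """Count stitch transitions that deleting column c would break."""
    cost = 0
    for row in grid:
        left = row[c - 1] if c > 0 else None
        right = row[c + 1] if c + 1 < len(row) else None
        if left is not None and right is not None and left != right:
            cost += 1  # deletion would join two unlike stitches
    return cost

def carve_column(grid):
    """Remove the cheapest column from a stitch grid."""
    width = len(grid[0])
    best = min(range(width), key=lambda c: column_cost(grid, c))
    return [row[:best] + row[best + 1:] for row in grid]

# A toy 3x5 texture with a purl column down the middle.
texture = [list("kkpkk"),
           list("kppkk"),
           list("kkpkk")]
carved = carve_column(texture)  # now 3x4, purl feature preserved
```

Repeating `carve_column` narrows the swatch one column at a time, which mirrors the progression shown in the photographs above.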

Point-of-Care Manufacturing: Maker Perspectives on Digital Fabrication in Medical Practice

A venn diagram showing the domains of expertise of those we interviewed including people from hospitals, universities, non-profits, va networks, private practices, and government. We interviewed clinicians and facilitators in each of these domains and there was a great deal of overlap with participants falling into multiple categories. For example, one participant was in a VA network and in private practice, while another was at a university and also a non-profit.

Maker culture in health care is on the rise with the rapid adoption of consumer-grade fabrication technologies. However, little is known about the activity and resources involved in prototyping medical devices to improve patient care. In this paper, we characterize medical making based on a qualitative study of medical stakeholder engagement in physical prototyping (making) experiences. We examine perspectives from diverse stakeholders including clinicians, engineers, administrators, and medical researchers. Through 18 semi-structured interviews with medical-makers in the US and Canada, we analyze making activity in medical settings. We find that medical-makers share strategies to address risks, define labor roles, and acquire resources by adapting traditional structures or creating new infrastructures. Our findings outline how medical-makers mitigate risks for patient safety, collaborate with local and global stakeholder networks, and overcome constraints of co-location and material practices. We recommend a clinician-aided software system, partially-open repositories, and a collaborative skill-share social network to extend their strategies in support of medical making.

“Point-of-Care Manufacturing”: Maker Perspectives on Digital Fabrication in Medical Practice. Udaya Lakshmi, Megan Hofmann, Stephanie Valencia, Lauren Wilcox, Jennifer Mankoff and Rosa Arriaga. CSCW 2019. To Appear.


Designing in the Public Square

A Makapo paddler in a one-person outrigger canoe (OC1) with the final version of CoOP attached.

Design in the Public Square: Supporting Cooperative Assistive Technology Design Through Public Mixed-Ability Collaboration (CSCW 2019)

Mark S. Baldwin, Sen H. Hirano, Jennifer Mankoff, Gillian Hayes

From the white cane to the smartphone, technology has been an effective tool for broadening blind and low vision participation in a sighted world. In the face of this increased participation, individuals with visual impairments remain on the periphery of most sight-first activities. In this paper, we describe a multi-month public-facing co-design engagement with an organization that supports blind and low vision outrigger paddling. Using a mixed-ability design team, we developed an inexpensive cooperative outrigger paddling system, called CoOP, that shares control between sighted and visually impaired paddlers. The results suggest that public design, a DIY (do-it-yourself) stance, and attentiveness to shared physical experiences represent key strategies for creating assistive technologies that support shared experiences.

A close-up of version three of the CoOP system mounted to the rudder assembly, and the transmitter used to control the rudder (right corner).
Shows 5 iterations of the CoOP system, each progressively less bulky and more integrated (the first is strapped on, for example, while the last is fully integrated).
The design evolution of the CoOP system in order of iteration from left to right.

“Occupational Therapy is Making”: Clinical Rapid Prototyping and Digital Fabrication

Splint that has been 3D printed in a material of an appropriate skin color and fit to a client's hand.

3D Printed Wireless Analytics

Picture of a 3D printed arm with backscatter sensing technology attached to it.

Wireless Analytics for 3D Printed Objects. Vikram Iyer, Justin Chan, Ian Culhane, Jennifer Mankoff, Shyam Gollakota. UIST, Oct. 2018. [PDF]

We created a wireless physical analytics system that works with commonly available conductive plastic filaments. Our design enables various data capture and wireless physical analytics capabilities for 3D printed objects, without the need for electronics.

We make three key contributions:

(1) We demonstrate room scale backscatter communication and sensing using conductive plastic filaments.

(2) We introduce the first backscatter designs that detect a variety of bi-directional motions and support linear and rotational movements. An example is shown below.

(3) As shown in the image below, we enable data capture and storage for later retrieval when outside the range of the wireless coverage, using a ratchet and gear system.

We validate our approach by wirelessly detecting the opening and closing of a pill bottle, capturing the joint angles of a 3D printed e-NABLE prosthetic hand, and building an insulin pen that stores information to track its use outside the range of a wireless receiver.

Selected Media

6 of the most amazing things that were 3D-printed in 2018 (Erin Winick, MIT Technology Review, 12/24/2018)

Researchers develop 3D printed objects that can track and store how they are used (Sarah McQuate, UW press release, 10/9/2018)

Assistive Objects Can Track Their Own Use (Elizabeth Montalbano, Design News, 11/14/2018)

People

Students

Vikram Iyer
Justin Chan
Ian Culhane

Faculty

Jennifer Mankoff
Shyam Gollakota

Contact: printedanalytics@cs.washington.edu

Interactiles

The absence of tactile cues such as keys and buttons makes touchscreens difficult to navigate for people with visual impairments. Increasing tactile feedback and tangible interaction on touchscreens can improve their accessibility. However, prior solutions have either required hardware customization or provided limited functionality with static overlays. In addition, the investigation of tactile solutions for large touchscreens may not address the challenges on mobile devices. We therefore present Interactiles, a low-cost, portable, and unpowered system that enhances tactile interaction on Android touchscreen phones. Interactiles consists of 3D-printed hardware interfaces and software that maps interaction with that hardware to manipulation of a mobile app. The system is compatible with the built-in screen reader without requiring modification of existing mobile apps. We describe the design and implementation of Interactiles, and we evaluate its improvement in task performance and the user experience it enables with people who are blind or have low vision.

Xiaoyi Zhang, Tracy Tran, Yuqian Sun, Ian Culhane, Shobhit Jain, James Fogarty, Jennifer Mankoff: Interactiles: 3D Printed Tactile Interfaces to Enhance Mobile Touchscreen Accessibility. ASSETS 2018: To Appear. [PDF]

Figure 2. Floating windows created for the number pad (left), scrollbar (right), and control button (bottom right). The windows can be transparent; we use colors for demonstration.

Figure 4. Average task completion times of all tasks in the study.

Volunteer AT Fabricators

Perry-Hill, J., Shi, P., Mankoff, J., & Ashbrook, D. Understanding Volunteer AT Fabricators: Opportunities and Challenges in DIY-AT for Others in e-NABLE. Accepted to CHI 2017.

We present the results of a study of e-NABLE, a distributed, collaborative volunteer effort to design and fabricate upper-limb assistive technology devices for limb-different users. Informed by interviews with 14 stakeholders in e-NABLE, including volunteers and clinicians, we discuss differences and synergies among each group with respect to motivations, skills, and perceptions of risks inherent in the project. We found that both groups are motivated to be involved in e-NABLE by the ability to use their skills to help others, and that their skill sets are complementary, but that their different perceptions of risk may result in uneven outcomes or missed expectations for end users. We offer four opportunities for design and technology to enhance the stakeholders’ abilities to work together.

A variety of 3D-printed upper-limb assistive technology devices designed and produced by volunteers in the e-NABLE community. Photos were taken by the fourth author in the e-NABLE lab on RIT’s campus.

Tactile Interfaces to Appliances

Anhong Guo, Jeeeun Kim, Xiang ‘Anthony’ Chen, Tom Yeh, Scott E. Hudson, Jennifer Mankoff, & Jeffrey P. Bigham, Facade: Auto-generating Tactile Interfaces to Appliances, In Proceedings of the 35th Annual ACM Conference on Human Factors in Computing Systems (CHI’17), Denver, CO (To appear)

Common appliances have shifted toward flat interface panels, making them inaccessible to blind people. Although blind people can label appliances with Braille stickers, doing so generally requires sighted assistance to identify the original functions and apply the labels. We introduce Facade – a crowdsourced fabrication pipeline to help blind people independently make physical interfaces accessible by adding a 3D printed augmentation of tactile buttons overlaying the original panel. Facade users capture a photo of the appliance with a readily available fiducial marker (a dollar bill) for recovering size information. This image is sent to multiple crowd workers, who work in parallel to quickly label and describe elements of the interface. Facade then generates a 3D model for a layer of tactile and pressable buttons that fits over the original controls. Finally, a home 3D printer or commercial service fabricates the layer, which is then aligned and attached to the interface by the blind person. We demonstrate the viability of Facade in a study with 11 blind participants.
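The size-recovery step can be illustrated with a small sketch. A US one-dollar bill has known physical dimensions (about 156.1 mm along its long edge), so measuring its extent in pixels yields a millimeters-per-pixel scale for the whole photo. This is our own illustrative code, not Facade's implementation; the function names are invented.

```python
BILL_WIDTH_MM = 156.1  # long edge of a US one-dollar bill

def mm_per_pixel(bill_width_px):
    """Scale factor recovered from the bill's measured pixel width."""
    return BILL_WIDTH_MM / bill_width_px

def button_size_mm(width_px, height_px, scale):
    """Convert a crowd-labeled button's pixel bounds to millimeters."""
    return (width_px * scale, height_px * scale)

# If the bill spans 780.5 px, the scale is ~0.2 mm/px, so a 60x60 px
# labeled button corresponds to a ~12 mm square tactile overlay.
scale = mm_per_pixel(780.5)
button = button_size_mm(60, 60, scale)
```

With the scale known, every element the crowd workers label can be sized correctly in the generated 3D model.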


3D Printing with Embedded Textiles



Stretching the Bounds of 3D Printing with Embedded Textiles

Textiles are an old and well-developed technology with many desirable characteristics. They can be easily folded, twisted, deformed, or cut; some can be stretched; many are soft. Textiles can maintain their shape when placed under tension and can even be engineered with variable stretching ability.

When combined, textiles and 3D printing open up new opportunities for rapidly creating rigid objects with embedded flexibility as well as soft materials imbued with additional functionality. We introduce a suite of techniques for integrating the two and demonstrate how the malleability, stretchability and aesthetic qualities of textiles can enhance rigid printed objects, and how textiles can be augmented with functional properties enabled by 3D printing.

Click images below to see more detail:


Citation

Rivera, M.L., Moukperian, M., Ashbrook, D., Mankoff, J., Hudson, S.E. 2017. Stretching the Bounds of 3D Printing with Embedded Textiles. To appear in the annual ACM conference on Human Factors in Computing Systems. CHI ‘17. [Paper]

Printable Adaptations

Shows someone placing a pen in a cap with two different types of adaptations.

Reprise: A Design Tool for Specifying, Generating, and Customizing 3D Printable Adaptations on Everyday Objects

Reprise is a tool for creating custom adaptive 3D printable designs that make it easier to manipulate everything from tools to zipper pulls. Reprise’s library is based on a survey of about 3,000 assistive technology adaptations and life hacks drawn from textbooks on the topic as well as Thingiverse. Using Reprise, it is possible to specify a type of action (such as grasp or pull), indicate the direction of action on a 3D model of the object being adapted, parameterize the action in a simple GUI, specify an attachment method, and produce a 3D model that is ready to print.
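To make the specification steps concrete, here is a hypothetical sketch of how such a parameterized adaptation might be expressed as code. Every class and field name below is invented for illustration; Reprise itself exposes these choices through a GUI rather than an API.

```python
from dataclasses import dataclass

@dataclass
class Adaptation:
    action: str        # e.g. "grasp" or "pull"
    direction: tuple   # direction of action on the target object's model
    attachment: str    # e.g. "clip", "strap", "press-fit"
    scale: float = 1.0 # GUI-style parameter for sizing the add-on

    def describe(self):
        return (f"{self.action} adapter, attached by {self.attachment}, "
                f"oriented along {self.direction}, scale {self.scale}")

# A zipper-pull adapter: pull along +z, clipped on, slightly enlarged.
zipper_pull = Adaptation("pull", (0.0, 0.0, 1.0), "clip", scale=1.2)
print(zipper_pull.describe())
```

The point of the sketch is the workflow: action, direction, attachment, and parameters together determine the printable geometry.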

Xiang ‘Anthony’ Chen, Jeeeun Kim, Jennifer Mankoff, Tovi Grossman, Stelian Coros, Scott Hudson (2016). Reprise: A Design Tool for Specifying, Generating, and Customizing 3D Printable Adaptations on Everyday Objects. Proceedings of the 29th Annual ACM Symposium on User Interface Software and Technology (UIST 2016) (pdf)
