In recent years, surveys have been shifting online, opening the possibility of adaptive questions, where later questions depend on responses to earlier ones. We present a general framework for dynamically ordering questions based on previous responses, with the goal of engaging respondents and thereby improving survey completion and the imputation of unanswered items. Our work considers two scenarios for data collection from survey-takers. In the first, we want to maximize survey completion (and the quality of any necessary imputations), so we order questions to keep the respondent engaged and, ideally, collect all the information we seek, or at least the information that best characterizes the respondent so that imputed values will be accurate. In the second scenario, our goal is to give the respondent a personalized prediction based on the information they provide. Because a reasonable prediction is possible with only a subset of questions, we are not concerned with motivating the user to answer every question. Instead, we order questions so that each response most reduces the uncertainty of our prediction, while not being too burdensome to answer.
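As a rough illustration of the second scenario (a sketch, not the method from the paper below), the snippet picks the next question greedily: under an assumed joint-Gaussian model over the answers and the predicted quantity, it asks the unanswered question whose response most shrinks the predictive variance, normalized by a hypothetical per-question burden cost. The covariance matrix, burden vector, and function names are all assumptions for illustration.

    # Sketch only: greedy question ordering under an assumed joint-Gaussian model,
    # choosing the question whose answer most reduces the predictive variance of
    # the target, per unit of (hypothetical) respondent burden.
    import numpy as np

    def conditional_target_variance(cov, target_idx, observed):
        """Variance of the target given the observed features, for a joint Gaussian."""
        if not observed:
            return cov[target_idx, target_idx]
        S = list(observed)
        cov_ss = cov[np.ix_(S, S)]
        cov_ts = cov[target_idx, S]
        return cov[target_idx, target_idx] - cov_ts @ np.linalg.solve(cov_ss, cov_ts)

    def next_question(cov, target_idx, observed, burden):
        """Pick the unanswered question with the best variance reduction per unit burden."""
        candidates = [i for i in range(cov.shape[0]) if i != target_idx and i not in observed]
        current = conditional_target_variance(cov, target_idx, observed)
        def score(i):
            reduction = current - conditional_target_variance(cov, target_idx, observed | {i})
            return reduction / burden[i]
        return max(candidates, key=score)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        A = rng.normal(size=(6, 6))
        cov = A @ A.T + np.eye(6)          # random positive-definite covariance (toy data)
        target, observed = 5, set()        # last variable is the quantity to predict
        burden = np.ones(6)                # hypothetical per-question answering cost
        order = []
        while len(observed) < 3:           # ask only a short subset of questions
            q = next_question(cov, target, observed, burden)
            order.append(q)
            observed.add(q)
        print("question order:", order,
              "remaining variance:", conditional_target_variance(cov, target, observed))

The same greedy loop could weight each candidate by a learned burden estimate instead of the uniform costs assumed here; the key idea is simply trading off information gained against effort asked of the respondent.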
Publications
Kirstin Early, Stephen E. Fienberg, and Jennifer Mankoff (2016). Test time feature ordering with FOCUS: Interactive predictions with minimal user burden. In Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp '16). Honorable Mention: Top 5% of submissions.