Overly positive feedback leads to bad assistive technology
Sign language gloves, intended to help deaf and hard-of-hearing people communicate, turn finger spelling into text on a smartphone or computer. But finger spelling is only a small part of the American Sign Language lexicon, and the gloves cannot capture vital facial expressions and broader hand gestures. As a result, the deaf community has largely rejected them.
To create products that people actually want to use, researchers must overcome response bias: the tendency of participants to offer overly positive comments about new technology. Developers who design technology for underserved communities need to hear not only good things about their efforts, but negative comments as well.
“Sometimes when people are working with marginalized populations, their voices aren’t necessarily heard, or the technology isn’t really right for them,” said Joy Ming, a graduate student in information science and first author of “Accept or Address? Researchers’ Perspectives on Response Bias in Accessibility Research,” presented at the Association for Computing Machinery Conference on Computers and Accessibility in October, where it was nominated for a best paper award.
“We wanted to understand how research can be done in a way that helps both researchers and participants to be more critical about their responses to the technology,” Ming said.
Researchers have found that building relationships within the disability community, ensuring that the research environment and methods are fully accessible, and framing interaction as a collaborative or exploratory experience are all ways to reduce response bias.
Although response bias is a common problem in all studies involving human participants, little research has examined it in work with people with disabilities.
“There are huge differences in power due to different factors between us as researchers and our participants,” said Aditya Vashistha, assistant professor of information science at the Cornell Ann S. Bowers College of Computing and Information Science and senior author of the new study. As a result, researchers often receive polite praise out of participants’ natural tendency to be helpful or to encourage future research on assistive technology, he said.
“We have to think critically about how we design studies, so when the participants are in the room, we actually hear what they think about the technology,” said Vashistha.
To find out more, Ming and Sharon Heung, another graduate student in information science, asked 27 disability researchers about response bias and analyzed their responses.
Interviewees shared several reasons why their disabled study participants were reluctant to give critical feedback. Some participants, especially children, may feel pressured to take part, while others may not feel qualified to critique the work of tech-savvy designers. Researchers also reported that some participants displayed a relentlessly positive attitude, including one who repeatedly attempted to use a broken prototype, blaming themselves rather than the technology for the failure.
The design of a study can also lead to response bias. For example, if a researcher collects only sound recordings and a child with autism spectrum disorder responds non-verbally, those responses will be missed. Poorly worded questions, failure to recruit diverse participants, and researchers’ attitudes towards people with disabilities can all exacerbate the problem.
In future experiments, the researchers plan to test different approaches to see whether they mitigate response bias. They also want to hear from participants in these studies to understand why they may have withheld negative comments. Finally, the researchers hope this work will be a starting point for developing best practices that help other scientists design assistive technologies more effectively.
“We hope to reduce the barrier to even doing accessibility research for other researchers,” Heung said, “and to ensure that the research process itself is designed for people with disabilities.”
Shiri Azenkot, associate professor at the Jacobs Technion-Cornell Institute at Cornell Tech, is a co-author of the article.
Patricia Waldron is a freelance writer for the Cornell Ann S. Bowers College of Computing and Information Science.