By Patricia Waldron
To create products people actually want to use, researchers need to overcome response bias, the tendency of study participants to give overly positive feedback about new tech. Developers designing technology for underserved communities need to hear not only nice things about their efforts, but negative feedback, too.
“Sometimes when people work with marginalized populations, their voices aren’t necessarily heard, or the technology isn’t really well fit for them,” said Joy Ming, a graduate student in the field of information science and first author of “Accept or Address? Researchers’ Perspectives on Response Bias in Accessibility Research,” presented at the Association for Computing Machinery’s Conference on Computers and Accessibility in October, where it was nominated for a Best Paper award.
“We wanted to understand how research can be done in a way that helps both the researchers and the participants be more critical about their responses to technology,” Ming said.
The researchers found that building relationships within the disability community, ensuring the environment and research methods are fully accessible and framing the interaction as a collaborative or exploratory experience are all ways to reduce response bias.
Though response bias, in which participants give less-than-accurate feedback, is a common problem in any study involving people, little research has examined it in work with people with disabilities.
“There are huge power differentials because of different factors between us as researchers and our participants,” said Aditya Vashistha, assistant professor of information science at the Cornell Ann S. Bowers College of Computing and Information Science and senior author of the new study. As a result, researchers often receive polite praise out of the natural tendency to be helpful or to encourage future research in assistive technologies, he said.
“We need to think critically of how we design studies, so when participants are in the room, we’re actually hearing what they feel about the technology,” Vashistha said.
To learn more, Ming and Sharon Heung, a fellow graduate student in the field of information science, interviewed 27 disability researchers about response bias and analyzed their responses.
The interviewees shared multiple reasons that their study participants with disabilities shied away from giving critical feedback. Some of those participants, especially children, may have felt coerced to participate, while others may not have felt qualified to critique the work of tech-savvy designers. The scientists also reported that some subjects displayed a “can-do attitude,” including one participant who repeatedly attempted to use a broken prototype, believing they were failing to use the technology rather than the technology failing them.
A study’s design can also lead to response bias. For example, if a researcher collects only sound recordings and a child with autism spectrum disorder responds nonverbally, those answers will be missed. Poorly worded questions, failing to recruit diverse participants, and the researchers’ attitudes towards people with disabilities can all exacerbate the problem.
In future experiments, the researchers are considering testing different approaches to see whether they mitigate response bias. They also want to gather perspectives from participants in these studies to understand why they may have held back negative comments. Finally, the researchers hope this work will be a starting point for developing best practices that help other scientists build assistive technologies more effectively.
“We’re hoping to reduce the barrier to even get into doing accessibility research for other researchers,” Heung said, “and to make sure that the research process itself is made for people with disabilities.”
Shiri Azenkot, an associate professor at the Jacobs Technion-Cornell Institute at Cornell Tech, is a co-author on the paper.
Patricia Waldron is a freelance writer for the Cornell Ann S. Bowers College of Computing and Information Science.