EPIC Members Richard Beckwith and Susan Faulkner (Intel) have assembled a panel of luminaries in accessible tech research, design, and engineering for our January 26 event, Seeing Ability: Research and Development for Making Tech More Accessible. In anticipation, we asked them a few questions about their approach to accessibility and key first steps all of us can take to do more inclusive work.
How do you define ability and accessibility? How does an ethnographic lens influence your definitions?
Ability has to do with what an individual is capable of perceiving or physically doing with their body; accessibility has to do with what perceptions or physical actions are required in order to partake of some service. Disability is a complex phenomenon reflecting the interaction between features of a person’s body and features of the society in which they live. At Intel, disability includes chronic illnesses, mental and sensory conditions, neurodiversity, and physical impairments that may impact life activities.
An ethnographic lens influences us to define ability and disability in a way that is maximally inclusive, keeping an awareness that many different abilities are present in our world, and each deserves to be taken as its own “reality” and respected as such.
Reflecting on your careers as an ethnographer and research psychologist at Intel, do you think tech has been leading or lagging in developing products for people with disabilities?
One of the questions we want to address in the panel is, why is this taking so long? The Americans with Disabilities Act (ADA) has had a huge impact on the experiences of people living with disabilities, but it doesn’t mention computers, websites, apps or their required level of accessibility. That needs to change.
Technology, like the rest of our culture, has been lagging. We lag behind regulations, and we lag behind the disability community itself. It's too easy to forget not only that the idea of "normal" rules out many potential users, but also that it is possible to design products for everyone and reach an even larger market.
What are the key social and technical barriers to making software and hardware more accessible?
Our foundational technologies were not designed accessibly. Computers were designed with basic form factors—keyboard, mouse, and screen—and that basic setup has not changed in 40 years, even though there are millions of people who can't see a screen, operate a mouse, or use a keyboard. We have created an accessibility industry to try to correct for that. Intel embraces the philosophy of "nothing about us, without us," and is committed to centering people with disabilities in our product design and user experience research. If we include people with disabilities in the design of new experiences, we will be better able to design products that are inherently accessible.
What would you tell a researcher, designer, strategist, or technologist thinking, “I wish I (or my team) knew more about accessibility. What’s the first thing I should do?”
Embracing the ethos of “nothing about us, without us” is a great first step. To that end, we all should work alongside people with disabilities and include people with disabilities in our research. You can also address accessibility by practicing Inclusive Design. This means that project teams should ask, “who might we be excluding?” throughout the product lifecycle.
Perhaps the best thing is to hire someone from the disabled community, someone who knows that world and can help their colleagues navigate it. However, here is where someone with an ethnographic practice could also be useful. For more directed assistance, you could have a team member (ideally a researcher and perhaps one with a disability) work with multiple people who require a range of accommodations to serve as their guides.
Can you tell us about the terrific panelists you’ve gathered for our January 26 EPIC Talk, “Seeing Ability: Research and Development for Making Tech More Accessible”?
Darryl Adams is the Director of the Intel Accessibility Office. Darryl leads a team that works at the intersection of technology and human experience, helping discover new ways for people with disabilities to work, interact, and thrive. He is a blind engineer who learned at a young age that, due to retinitis pigmentosa, his sight would slowly fade. His eyesight did gradually diminish, but his vision did not. He saw a future for himself in the technology of accessibility. Darryl is working to help Intel and the rest of the tech community become better allies to the disabled community. You can read about Darryl's work with EPIC member Jamie Sherman on the intersectionality of videogames and disabilities, and hear Darryl discuss Intel's work with GoodMaps.
Tim Graham is a Gaming Segment Lead at Dell Technologies. He is passionate about making gaming more accessible for everyone.
Stacy Branham is an Associate Professor of Informatics at the University of California, Irvine, and Co-PI of AccessComputing, a national initiative to broaden participation in computing to include people with disabilities. Her research yields actionable design guidance and proof of concept prototypes. In a recent partnership with Toyota, she worked with blind and low vision people to co-design a wearable voice assistant for navigating airports. In another partnership with Google, she co-designed accessible profile images of people with disabilities, which now ship on all Chromebooks, marking the first time that people who are blind can choose a representative system profile image for themselves. In 2021, Branham received the NSF CAREER Award and was named one of the “Brilliant 10” rising STEM researchers by Popular Science. In 2022, she was recognized with a campus-wide teaching award as a Digital Accessibility Innovator. http://www.stacybranham.com
Adam Munder is the General Manager of OmniBridge, an Intel Venture. He is enthusiastic about creating a barrier-free world in which Deaf/HoH people can communicate with hearing people anywhere, at any time, inside or outside of the corporate world. OmniBridge enables bi-directional, real-time conversations between people who use American Sign Language (ASL) and those who speak English by harnessing the power of AI-driven machine translation technology. A Deaf engineer, Adam collaborated with exclusively hearing colleagues for years, using ASL interpreters to support his communications. He realized that if more people had the support he has, Deaf people could participate more easily in the economy. He also realized that there is a shortage of interpreters, and that employing full-time interpreters increases costs to a company, so he sought a solution. Using emerging capabilities in machine translation and machine vision, he and his team at OmniBridge have built an application that can translate a person's signed sentences into English. This is really the first time that real ASL—signs in sentences—can be translated, rather than discrete signs or fingerspelled English words.
Related Resources
Hearing Through Their Ears: Developing Inclusive Research Methods to Co-Create with Blind Participants (free article)
Tutorial: Research for Accessible & Inclusive Design (EPIC Member video 45 minutes)