Bridging the Disability Divide with AI and Brain-Computer Interfaces

AI's transformative role in inclusive design.

Apr 2, 2024 · 5 min read

An illustration of 5 people with different disabilities
“Disability results from the interaction between people with impairments and the attitudinal and environmental barriers that hinder full and effective participation in society on an equal basis with others.” (United Nations Convention on the Rights of Persons with Disabilities)

The above statement is by far the best definition of disability I have encountered. It captures the idea that disability is not inherent to a person but arises from the quality of the interaction between that person and their environment. Looking at this through a designer’s lens, it isn’t just a concise statement but fuel to see the potential an individual can have when an environment is designed to accommodate them. It sets a challenge: be the person who provides innovative solutions for inclusivity.

This is exactly how Ed Summers, Head of Accessibility at GitHub, is bringing down barriers for users. AI is the buzzy term none of us can escape, and in the world of accessible design and technology it is making waves. Summers admits AI has drawbacks for the disabled community: the rush to implement the novel technology into every piece of software, phone and toaster out there is skipping integral accessibility design steps and widening the disability divide in the race to be AI oriented. But for the most part, Summers is optimistic about the remarkable impact AI is having on technology, and about the implications for the 1.3 billion people living with disabilities around the world.

The app Be My Eyes was designed to help people with blindness and low vision get a better grasp of their surroundings. It connects visually impaired individuals to volunteers who are granted access to their phone camera and can then describe the scene in front of them: useful if someone has dropped their keys, or is struggling to work out which side of the street a coffee shop is on. Or, in the case of the fictional thriller See For Me, an equivalent app is used to get the main character out of a terrifying home invasion (definitely worth a watch if you like that kind of spine-tingling suspense). The drawback to this approach is its reliance on someone answering the call for help and being able to describe a scene effectively; some users felt they didn’t want to bother anyone with their problems and stopped using the service. Generative AI, however, enables chatbots to interpret and describe scenes very accurately. Be My AI, powered by OpenAI’s GPT-4, lets people take a photo and have a detailed description reported back to them. Darryl Adams, an accessibility lead at Intel, said the intricate detail described by the AI-powered app gave him a feeling of ‘connectivity he didn’t know he was missing.’

AI’s incredible ability to predict intent is also enabling individuals with motor impairments to complete tasks more efficiently. Anton, a software developer living with cerebral palsy, demonstrated how his disability affects his ability to code: he can only press one key at a time, and if he wants to press more he has to use his nose, making coding difficult and slow. With GitHub Copilot, the predictive element allows him to write code with far more ease; he described the technology’s ability to anticipate the next steps in his code as ‘reading his mind’. Copilot’s predictive feature is speeding up developers all over the world, proving that inclusive design can enable and empower people more broadly.

Joe Devon, founder of the accessibility agency Diamond, eloquently identifies how the very nature of AI is intrinsically connected to disability. AI takes in sensory input, engages in cognitive processing and produces generative output. When this is aligned with someone who struggles with cognitive processing or with generating communicative output, the technology can step in and provide a more seamless interaction for the individual. BCIs (brain-computer interfaces) are an astonishing example of technology reshaping interactions for people with disabilities, and they are being radically advanced by AI’s generative capabilities. BCIs are computers connected directly to the brain that analyse and decode neural activity, which can then be used to control personal devices through thought alone: answering a phone, typing on a computer or even moving a robotic limb. The introduction of AI to these interfaces is making it faster and more efficient for computers to predict the needs of users. The technology is currently being trialed exclusively on people with significant disabilities, and the majority of BCIs involve invasive procedures to reach the areas of the brain that produce the relevant neural signals. These procedures can lead to brain bleeds and haemorrhaging, but the patients signing up have such debilitating conditions that the benefits are considered to greatly outweigh the risks. Patients experiencing full-body paralysis and an inability to communicate verbally or through sign language can significantly increase their quality of life through these interfaces.

The ever-controversial Elon Musk has famously ventured into the world of brain-controlled tech; his company Neuralink has garnered notoriety for its BCI trials. Earlier this year, Neuralink successfully implanted its hardware into its first human patient, the most significant step yet towards its goal of becoming the first company to make these kinds of interfaces commercially viable. However, these human trials come after Neuralink attracted regulatory scrutiny during its animal-testing phase, in which a number of monkeys had to be euthanised after the technology was implanted into the primates. Another noteworthy company implanting interfaces into humans is Synchron, which has designed a minimally invasive procedure involving a tiny stent delivered through a blood vessel near the motor cortex to help individuals who have lost physical independence. As BCIs develop with the turbocharge of AI, thoughts themselves could be interpreted and uploaded to machines and clouds in the not-too-distant future, meaning we might have to start being careful about what we think as well as what we say. The monumental privacy issues that come with this technology call on companies like Neuralink and Synchron to ensure it is carefully audited and open sourced to protect the integrity of the people trialing it, and hopefully not sold to Facebook to influence the type of adverts we see while scrolling through memes.

The trend we’re seeing is clear: when people with disabilities are included in the design process and truly catered to, everyone can benefit. However, most design still favours traditionally abled people and leaves accessibility as an afterthought, so products often have to be awkwardly reworked late in their design cycle to accommodate the needs of people with disabilities, in ways that are sometimes costly and ineffective. BCIs are fascinating because this sci-fi-like technology is, for now, exclusively addressing disability. Its development is enabling people to participate meaningfully in a world that did not initially accommodate them, and it has implications for people without disabilities too. The future of this technology is personal AI generating output in real time, in whatever format you need, so you can experience the world around you in a frictionless way. For people who currently experience the world with debilitating friction, the impact will be profound.

At Spinning Fox, inclusivity is at the core of our design philosophy. We believe that every user, regardless of their abilities, should be considered when designing digital experiences that are intuitive, seamless and empowering. By embracing the principles of inclusive design, we strive to create solutions that cater to the diverse needs of all individuals. We aim to understand the unique challenges faced by users with disabilities and actively work to remove barriers to their participation. We develop interfaces that adapt and evolve to meet the specific needs of each user, and we aim to be part of the meaningful change that bridges the divide between able and differently abled communities.

Illustration by Marie-Christine Gervais

About the author

Jacob Pulley

UX Designer

Jacob is a UX Designer with a passion for solving customer problems. Jacob has a background in the retail and fashion industry where he worked with Louis Vuitton and was a winner of the British Airways 2023 hackathon.