AI - Friend or Foe for the Disability Community?

AI, or Artificial Intelligence, is the buzzword of 2024 in the tech world. It's quickly becoming an integral part of our lives, reshaping computer and mobile technologies. AI promises to make our lives easier by completing tasks faster, more efficiently, and with greater accuracy. From drafting emails from a short prompt, to creating text with voice commands, to turning basic sketches into professional-level art, its applications are vast. It's also being used to tackle complex global problems in fields like manufacturing robotics and machine learning, including questions that have stumped researchers for years.

But with AI's rapid advancement, Woven asks: Is AI a friend to individuals with disabilities, or does it risk perpetuating the biases and prejudices they already face?

AI as a Friend to the Disability Community:

Language Models: Many are familiar with tools like ChatGPT or Co-Pilot. These AI models can generate fluent, well-structured written responses to almost any prompt. For individuals with word-finding difficulties or restricted fine motor control, these tools can be invaluable time-savers. For instance, a person who would otherwise take 20 minutes to compose an email because of their disability can use Co-Pilot to generate an appropriate draft with a few clicks, make minor edits, and save significant time while still sounding professional. AI also handles tasks like number crunching and data analysis, which many people find cumbersome.

Smart Homes: As discussed in a previous Woven post, AI is enhancing smart home technology. It can automate tasks such as adjusting lighting and temperature or opening doors in response to specific inputs. Philips Hue lights, for example, can shift throughout the day to match the time and season, supporting natural circadian rhythms (a simplified sketch of this kind of schedule appears below). This makes homes more accessible and comfortable for individuals with disabilities.
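To make the idea concrete, here is a minimal sketch in Python of a time-based circadian lighting schedule. It is a hypothetical illustration only: it does not use Philips Hue's actual API, and the hour ranges and colour temperatures are assumptions chosen for readability rather than values from any real product.

```python
from datetime import datetime

def circadian_colour_temperature(hour: int) -> int:
    """Return a target colour temperature in kelvin for a given hour.

    Cooler (bluer) light during the day supports alertness; warmer light
    in the evening supports winding down. The breakpoints below are
    illustrative assumptions, not values from any particular product.
    """
    if 6 <= hour < 10:      # morning: brighten with a neutral white
        return 3500
    elif 10 <= hour < 17:   # daytime: cool, daylight-like light
        return 5000
    elif 17 <= hour < 21:   # evening: warm light to wind down
        return 2700
    else:                   # night: very warm, dim-friendly light
        return 2200

if __name__ == "__main__":
    now = datetime.now()
    print(f"At {now:%H:%M}, target colour temperature: "
          f"{circadian_colour_temperature(now.hour)} K")
```

A real smart-home system would layer sensor input, seasonal daylight data, and personal preferences on top of a simple time-based schedule like this.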

Health Monitoring: AI supports health in unexpected ways. Common applications include fall detection in personal alarms and smartwatches, as well as tracking seizures, fluid intake, and diabetic episodes. During the COVID-19 pandemic, AI demonstrated its potential on a larger scale: research by Fuller et al. in 2023 found that heart rate variability data from smartwatches could indicate COVID-19 infection before a formal test confirmed it. The ability to flag health issues early could be life-saving for many, provided they are comfortable sharing their data. A simplified illustration of the idea follows below.
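The sketch below shows the general principle behind this kind of early warning, not the method used in the cited review: a wearable can compare today's resting heart rate against a person's own recent baseline and flag an unusual rise for follow-up. The two-standard-deviation threshold is an illustrative assumption, not a clinical cutoff.

```python
from statistics import mean, stdev

def flag_unusual_resting_heart_rate(baseline_bpm: list[float],
                                    today_bpm: float,
                                    threshold: float = 2.0) -> bool:
    """Flag today's resting heart rate if it sits well above the
    person's own recent baseline.

    baseline_bpm: resting heart rates from the previous days/weeks.
    threshold: how many standard deviations above the baseline mean
    counts as unusual (an illustrative choice, not a clinical cutoff).
    """
    avg = mean(baseline_bpm)
    spread = stdev(baseline_bpm)
    return today_bpm > avg + threshold * spread

# Example: a fortnight of readings around 62 bpm, then a jump to 74 bpm.
baseline = [61, 63, 62, 60, 64, 62, 61, 63, 62, 60, 63, 61, 62, 64]
print(flag_unusual_resting_heart_rate(baseline, 74))  # True
```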

Concerns about AI in the Disability Community:

Ethics and Fairness: Can we trust that AI handles our data ethically and without bias? Individuals with disabilities already face discrimination. Developers must ensure that AI tools are free from bias, accountable, and transparent in their design and implementation.

Privacy Issues: Privacy is a major concern, especially with sensitive data such as medical history. AI tools store vast amounts of personal information, and there are real risks if that data is not properly de-identified and managed. Facial recognition technology, for instance, raises significant privacy concerns if the underlying data is ever leaked.

Biased Data: AI models like ChatGPT and Co-Pilot are trained on historical data. If that data is biased against individuals with disabilities, the models can perpetuate existing prejudices. AI lacks the nuanced understanding and reasoning of the human mind, so it is crucial that the data it learns from is fair and accurate.

Inclusivity in Design: A common mantra in the disability community is “nothing about us, without us.” There are concerns about the level of engagement individuals with disabilities have had in creating and testing AI tools. Excluding them from the design process increases the risk of leaving them behind in the rapidly evolving tech landscape.

-

As a self-confessed tech enthusiast, I see the potential benefits of AI for individuals with disabilities and am excited about the improvements this technology can bring to the daily lives of all users. These tools can offer creative and effective solutions to everyday occupational problems. However, I'm also aware of the significant concerns and the need for inclusive design and ethical safeguards. For now it's a case of wait and see, and Woven looks forward to following these developments closely.

-

References:

Wearable Devices to Diagnose and Monitor the Progression of COVID-19 Through Heart Rate Variability Measurement: Systematic Review and Meta-Analysis - PMC (nih.gov)

Artificial intelligence and disability: too much promise, yet too little substance? | AI and Ethics (springer.com)

Artificial Intelligence: The road ahead for the accessibility of persons with Disability - ScienceDirect

"Automating autism" by Os Keyes (odu.edu)

Disability Discrimination Using Artificial Intelligence Systems and Social Scoring: Can We Disable Digital Bias? - Assessing the Promises and Perils of Artificial Intelligence, 8 Journal of International and Comparative Law (2021) (heinonline.org)
