April 19, 2024


When the Americans with Disabilities Act (ADA) was enacted 32 years ago, there was optimism that technology could close the educational gap for students with disabilities and other special needs. The ADA went far beyond visible disabilities, promising life-changing protections for neurodivergent people.

As neurodivergent people, we know how educational technology can change lives, and how word processors, spell checkers, and self-paced learning can let our brains thrive in ways that traditional education never could. But we also see how emerging technology threatens to do the opposite, making school a harsher, less accessible environment.

Today, schools across the country are increasingly turning to technology tools that harm students with invisible disabilities. Crude risk assessment tools treat neurodivergence as a risk of harm to self and others. Social media monitors flag posts about mental health and penalize students who need psychological evaluations as part of their personalized learning assessments.

Remote proctoring and computer monitoring programs with biometric tracking capabilities became a mainstay during the COVID-19 pandemic. These programs flag students for cheating when they look away from their screens or make other “suspicious” movements. This poses a real risk to disabled people. A disabled student’s vocal and facial expressions may differ from the “normal” baseline against which a software program compares them, so the program mislabels their emotions and singles them out for discipline.

In many cases, remote proctoring programs make no attempt to accommodate disabilities, denying examinees bathroom breaks, time away from their computer screens, scratch paper, and dictation software. This exacerbates disabilities, causes stress, and forces examinees to rush through the most important tests of their lives.

This monitoring drives neurodivergent students into the shadows, discouraging them from sharing their feelings, stigmatizing their mental health, and reducing their willingness to seek help.


Seeking cognitive assessments and talking openly about mental health should be encouraged as healthy behavior, not punished. Like many with learning disabilities, we remember being driven from therapist to therapist, assessment to assessment, desperately trying to uncover the correct diagnosis. We remember the sting and stigma when teachers singled us out for our spelling, reading, or inability to sit still.

And we are not alone.

Over 20 percent of Americans have a mental illness, and around 10 percent have a learning disability. For nearly all of us, neurodivergence is nothing to be alarmed by, but school surveillance technology treats our differences as a threat. Just like the shame we felt when teachers singled us out, it harms students when surveillance technology targets neurodivergence.

Rather than being some magical crystal ball, the algorithms used by schools amount to little more than bias in a box. They crudely decide who is and who is not “normal,” punishing students simply because their brains work differently. But the injustice does not stop there.

Making matters worse, biometric policing technology has boomed over the past 30 years, and the same tools police use in public are making their way into classrooms.

For example, emotion recognition and attention detection software tracks students’ biometric information (e.g., motor and vocal tics) and compares it to a baseline of behavior considered “normal” or desirable, in a problematic effort to gauge students’ emotions and focus.
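To make the crudeness of that comparison concrete, here is a minimal sketch of the kind of baseline-deviation rule such tools rely on. It is purely illustrative and not any vendor’s actual code; the signal name, the numbers, and the threshold are all assumptions.

```python
from statistics import mean, stdev

# Hypothetical share of exam time each student's gaze was classified as
# "away from the screen" by a tracking model. Names and numbers are invented.
gaze_away_rate = {
    "student_a": 0.04,
    "student_b": 0.05,
    "student_c": 0.05,
    "student_d": 0.06,
    "student_e": 0.07,
    "student_f": 0.31,  # e.g. a student with a tic or an auditory processing habit
}

# The "normal" baseline is simply the group's average behavior.
baseline = mean(gaze_away_rate.values())
spread = stdev(gaze_away_rate.values())

# Any student far enough from the baseline is flagged as "suspicious".
# Nothing in this rule distinguishes cheating from disability-related
# movement; it only measures distance from the majority's behavior.
Z_THRESHOLD = 1.5  # arbitrary cutoff, assumed for illustration
flagged = {
    name: round((rate - baseline) / spread, 2)
    for name, rate in gaze_away_rate.items()
    if (rate - baseline) / spread > Z_THRESHOLD
}

print(flagged)  # {'student_f': 2.03} -- the outlier, not necessarily a cheater
```

The rule flags whoever deviates most from the group, which is exactly why students whose bodies or attention patterns differ are the ones who get singled out.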

Some EdTech software already includes this technology. In 2017, a French school introduced the Nestor EdTech platform, which is equipped with attention-tracking capabilities, into its classrooms. And in April, Zoom was reported to be considering adding emotion AI to its platform, which distance learning educators rely on heavily.

We are no strangers to the harmful effects of remote monitoring and computer surveillance.

We have rushed through exams because our proctoring software didn’t allow bathroom breaks. We’ve experienced heightened anxiety that biometric tracking software will flag our uncoordinated eye movements, auditory processing habits, fidgeting, and uncontrollable twitches as “cheating.” We have been made to sit exams for excessive stretches of time, up to 10 hours a day for two days. And we have chosen not to seek accommodations for important tests, or not to take those tests at all, because the accommodation process was too burdensome. In some cases, this has affected our educational choices and cost us job opportunities.

Thirty-two years later, the full promise of the ADA remains unfulfilled. Worse, civil rights protections only seem to be falling further behind.

As we look to the coming decades, lawmakers and regulators cannot simply rest on their laurels. Those in power have no excuse for ignoring the threat, and those who design technology have no excuse for ignoring how their tools can negatively impact people with disabilities. We need protections for the algorithmic age—a new set of ADA safeguards that protect students from the ever-evolving barriers to public life.
