Dark Patterns Hinder Accessibility for People with Disabilities

An annoying but seemingly unavoidable part of accessing most internet portals, websites and platforms is the perennial demand to prove that you are a human and not a ‘bot’. You are constantly interrupted with challenges like “spot the traffic lights in the pictures” or something similar, to show you are not a machine manipulating your way in. These standard processes, which we have accepted as the norm, may be a mild irritant to most of us, but what are they like for someone who is visually impaired?

CAPTCHA security tests, or “Completely Automated Public Turing tests to tell Computers and Humans Apart”, are systems that routinely fail people with disabilities, denying them access unless a helping hand is close by. They fall under “dark patterns”, a British-coined term for designs that manipulate users, and they are built with a certain insensitivity towards blind people, who are equally entitled to access. Accessibility advocate Chancey Fleet calls them “encoded inhospitality”.

As a result of this monolithic security feature, legally blind citizens do not enjoy full participation in the online economy, which is their right. Experts partly blame Australia’s Disability Discrimination Act: first passed in 1992, the Act is not explicit enough about technological accessibility.

A few inclusions and changes would benefit the visually impaired. Ms Fleet suggests that when such systems are devised and tested, people from every segment and diverse group should be consulted and included. “Tech culture — from platforms to procurement to education — must shift away from focusing on accessibility when a person with a disability presents a need, and shift toward treating accessibility as a consistently required part of every product,” she said.