Links

All links

AI, accessibility and fiction

This week, once again, someone suggested that “AI” could replace (paraphrasing) normative accessibility guidelines (in a mailing list post to AGWG, the group that produces WCAG).

Eric Eggert explains why this seems unnecessary:

The simple fact is that we already have all the technology to make wide-spread accessibility a reality. Today. We have guidelines that, while not covering 100% of the disability spectrum, cover a lot of the user needs. User needs that fundamentally do not change.

(From: “AI” won’t solve accessibility · Eric Eggert)

I cannot help but disagree with Vanderheiden and Nielsen. They suggest (again, paraphrasing) that we can stop making accessibility requirements, because those somehow “failed” (they didn't; WCAG is successful in many ways) and because generative AI exists.

Of course, I'm happy and cautiously optimistic that there are technological advancements. They can meet user needs well, like how LLMs “effectively made any image on the Web accessible to blind people”, as Léonie Watson describes in her thoughtful comment. If people want to use tools to meet their needs, great.

But it seems utterly irresponsible to let innovation reduce websites' legal obligations to provide basic accessibility. Especially while there are many unresolved problems with LLMs, like hallucinations (which some say are inevitable), environmental cost, bias, copyright and social issues (including the working conditions of people categorising stuff).