Is it ok to make AI responsible for inclusion?

Using AI for inclusion creates only a facade of inclusion

I often hear a popular argument in AI’s favour: that it can enable wider inclusion…

For example, in the world of accessibility, AI can generate automatic ‘alternative’ (ALT) text for images to help visually impaired readers interpret photos on a website. The thinking goes that, by using AI, we can make more of our content accessible. This is being applied both for external content on blogs and websites and also to internal content within enterprises, not an area known for good accessibility today.

On the surface, this all seems like a worthy endeavour. Surely it’s better than nothing, right? But I want us to pause and question this emerging common practice, not just from a practical ‘does it work’ standpoint but from a deeper ‘is it good for us’ point of view. The charity Scope argues for human intervention on the grounds of quality. They give an example of a t-shirt for sale whose ALT text tells you it’s an image of a t-shirt, but not what’s written on the t-shirt — the critical piece of information! A badly captioned image might be worse than no caption at all. 

I see AI advancing so rapidly that poor performance is unlikely to be a problem in the long term — AI is simply getting too good at the jobs it can do. If it can generate images from text, it is no great leap for it to reliably reverse the process.

That takes us on to the more thought-provoking question: ‘is it good for us?’

And I’m not so sure it is. Let’s consider what I’m really doing when I get the AI to tag the images on my website for me. Am I a better person for having delegated my care for those less visually able to a machine? Does it make me more attuned to the needs of the visually impaired or less so? 

I think less. I think about those people less, I think less of their needs. I fail to consider them in my content. In short, I increasingly neglect them. When I use AI, I am becoming less inclusive. There is this compound effect at play: when we delegate a task to our agents, whether machine or human, over time our own skills degrade — we forget how to do it well. When it comes to maintaining cars, that may not matter beyond costing me more in garage fees, but when it comes to real people, there is someone at the receiving end of my neglect. 

If we all do this, if we scale up AI for inclusion, then as a community we’re putting machines between ourselves and the people we should be including, and, in turn, we’re all becoming less inclusive. So by assuming ‘AI will take care of that’ when it comes to inclusion, I am, in fact, isolating the people already on my margins rather than bringing them closer. Using AI to do my inclusion work creates only a facade of inclusion — care outsourced entirely to machines so we can forget to care. AI used in this way deepens the gulf between us.

Attention brings us closer, forgetting drives us apart

The end of this path is no less than separation and making second-class citizens of others — in this case, those with disabilities. We must be careful not to let the way we choose to use AI inadvertently help us slip into the totalitarian ideologies that plagued the twentieth century. A root cause is a mistaken belief that it is possible, indeed that it is ‘normal’, to be fully able. In fact, if we look at ourselves and those around us, we find that we are all disabled in some shape or form; the differences between us are of degree and type. A dodgy hip, a tendency towards feeling low or even a lack of imagination are all small disabilities, but disabilities all the same.

This is a better starting point from which to provide the right support for everyone to participate as fully as possible. If someone’s disability tips over into the official definition (the 2.2 billion of us with a visual impairment, for example), then that just helps us prioritise our efforts where they are most needed. So inclusion is a worthwhile and important pursuit for all humans. It’s not something to delegate to machines; it should be something positive we embrace. Our humanity is best expressed in the way we love each other, and good inclusion is a practical way to show love.

You can’t delegate love to a machine, so don’t delegate accessibility and inclusion to AI.

Featured image: Mikhail Nilov / Pexels

Toby Beresford, Director of Digital Strategy at Bible Society

Toby Beresford is Director of Digital Strategy at Bible Society and is exploring how the Bible can positively influence our shared online culture.
