AI tools for inclusion. It’s not all bad!

Inclusion is a continuous work, by the majority for the minority

Last year I cautioned against delegating the responsibility for inclusion to AI. Good inclusion is a practice for humans to perfect, because by doing so we break down the barriers between all of us and build a more relationally healthy, equitable society. Handing this vital job to our AI agents can make us think about inclusion less and, as a result, make us less inclusive, not more. However, just because AI is not helpful in one direction (from the majority to the marginalised) doesn’t mean it can’t help the other way round. When AI tools give those on the edge the ability to participate fully with the majority at the centre, then we may yet welcome them.

In this follow-up article I want to highlight some examples of where AI can and should make a difference. These are just a selection but they illustrate the point.

AI tools, when owned and managed by the person at risk of exclusion, are a powerful leveller:

  • Parroton helps people with speech impediments by transforming what they say into fluent speech. Similarly, RogerVoice automatically captions your phone calls; for anyone with hearing difficulties it brings the world of telephone communication back into play. Speechify does something similar for any text, like a book or article, reading it aloud.
  • For the visually impaired, tools like Be My Eyes offer a similar service: instead of reading aloud, they describe for you what you cannot see.

It’s not just in communication that AI can help level the playing field; it can also be an aid in the physical world.

  • Mobility difficulties in navigating the home can be alleviated using smart home features. For instance, smart speakers like Alexa allow you to control curtains, heating and lights with your voice alone.
  • In healthcare too, many conditions can benefit from AI assistance. For instance, type 1 insulin-dependent diabetics can now use an automatic insulin pump, coupled with a continuous glucose monitor, to algorithmically correct high glucose levels. It’s really not the same as having a working pancreas, but it beats waking up in the middle of every night to take a blood test and inject yourself.

So it’s not the same, but it’s better than nothing at all. This, I think, is where our conversation about inclusion and AI needs to continue. AI tools, like the insulin pump, mitigate, but ultimately they still don’t fix. A type 1 diabetic will continue to need constant vigilance, both from themselves and from those who care for them.

If we generalise this: just because someone has a tool that helps them be less excluded, that doesn’t mean those around them are off the hook; they still need to find ways to work for that person’s benefit. Indeed, even the new tools often bring new requirements for supporters: an insulin pump cartridge needs replacing every two to three days, for example, and the pump must be kept charged.

How do we sum this up?

I think our song needs to be ‘Don’t Stop Includin’’, to mimic the seminal ‘Don’t Stop Believin’’ by Journey (perhaps one for the MediaCat playlist?): inclusion is a continuous work, by the majority for the minority, from the core to the edge, by the mainstream for the benefit of the marginalised. AI tools can help, but they can’t replace human care and attention. They can’t do our relational work for us without weakening the relationship itself. The benefit to the relationship is the crucial point we often miss when we put communication technologies, in particular, between us: the use of AI can get in the way of healthy relationships.

For instance, Instagram mediates relationships so that they skew towards primarily visual forms of communication. This can have devastating effects on some teenage girls, who might already be hyper-sensitive to how they look: the tool can make it appear that their friends, too, only care about their looks. Their character, their personality, the way they help others: all of this is ignored by Instagram and so goes unvalued by the teenagers. The medium prioritises the visual, so the relationships shaped within it skew towards this limited subset of human interaction: how we look. AI’s role, in the form of the newsfeed, is to reinforce this bias by picking out and prioritising communications with a strong visual component. These are, of course, the posts that get the most engagement on Instagram, simply because no other form of engagement can be tracked.

So we must be careful, but we can still move on from here. When we use AI tools to bring us together, we must examine them to make sure they are not, in fact, pulling us apart. And the best way to do that is to keep the value of fully present, human-to-human relationships front and centre.

Featured image: Ron Lach / Pexels

Toby Beresford, Director of Digital Strategy at Bible Society

Toby Beresford is Director of Digital Strategy at Bible Society and is exploring how the Bible can positively influence our shared online culture.
