Accessibility on iOS: Meet Machine Learning 🤖


Accessibility on iOS. We know it's important, we know it's the right thing to do - but how can you truly innovate here 🤔?

We'd thought about accessibility here and there, but it wasn't until users who were visually impaired, blind, or mute reached out to us that I truly took it seriously. It's much more than just setting the value of the accessibilityLabel property.
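For context, setting a label is just the starting point - VoiceOver also relies on traits and hints to tell users what an element is and what it does. A minimal sketch might look like this (the button and its strings here are hypothetical, purely for illustration):

```swift
import UIKit

// A hypothetical "favorite" button, used to show that good VoiceOver
// support involves more than accessibilityLabel alone.
let favoriteButton = UIButton(type: .system)
favoriteButton.setImage(UIImage(systemName: "heart"), for: .normal)

favoriteButton.isAccessibilityElement = true
favoriteButton.accessibilityLabel = "Favorite"   // what the element is
favoriteButton.accessibilityHint = "Adds this item to your favorites."  // what happens on activation
favoriteButton.accessibilityTraits = .button     // how VoiceOver announces it
```

Even this only covers the static case - dynamic content, custom views, and images all need more thought, which is where the rest of this post comes in.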

So as I set out to think about how we could help these users get more out of our apps, I never expected to find several ways to do it with... machine learning. Honestly, I never thought to look there because:

  • I still use a calculator to do the math for setting the frames of my UIViews
  • I thought "convolutional neural network" was an A.I. off of Star Trek
  • ... and the term "machine learning" is scary. To me it implied an exclusive club of brilliant developers who could recite Big O notation brain teasers in their sleep - a club I'd never be capable of joining.

Yet these worlds collided, and I wrote the code where they met. The result was a beautiful happenstance of automatic accessibility wins in our app. Long live the machines!

Suggestions