As part of a project that I am involved with, I am currently trying to design some hardware. It is mainly casing around the components, but it is a completely new experience for me. I have never tried building anything for a person with visual impairments before, let alone designing a user experience for one. It is certainly a steep learning curve.
I have been looking at the Royal National Institute for the Blind’s website and their range of accessibility training, which seems to cover everything from building accessibility to website and application accessibility. Yet it does not appear to include guides to designing buttons or sensors.
Searches for universal design for accessibility pointed me towards the Usability First website and its principles of design. The Centre for Excellence in Universal Design’s page on public access terminals is the best resource I have come across so far.
It seems paradoxical that the most useful guidance comes from such a specific domain, but there it is.
Having started this post a couple of weeks ago, the project has since moved to a pure software solution for various reasons. Yet the same issues apply: how does one test out the interface as a person with visual impairments would experience it?
So far we seem to be looking at manual testing followed by user feedback, perhaps with some live coding for rapid iteration. It would seem to be a “blind” spot in terms of development practice. Robotics comes up as an idea for automating such testing, but that raises issues of its own, and a sustainable approach is an important one.
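One small piece of this kind of testing can be automated: checking that an interface’s colours meet the WCAG contrast guidelines, which matter for many people with low vision. Below is a minimal sketch in Python of the WCAG 2.x contrast-ratio calculation; the function names are my own for illustration, not from any particular accessibility library.

```python
def relative_luminance(rgb):
    """Relative luminance of an sRGB colour given as (r, g, b) in 0-255."""
    def linearise(channel):
        c = channel / 255.0
        # Piecewise sRGB linearisation, as defined in WCAG 2.x.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearise(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two colours, ranging from 1:1 to 21:1."""
    lighter, darker = sorted(
        [relative_luminance(fg), relative_luminance(bg)], reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Black on white gives the maximum ratio of 21:1; WCAG 2.1 level AA
# requires at least 4.5:1 for normal-sized text.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

A check like this is obviously no substitute for testing with actual users, but it is the sort of thing that could run automatically on every build.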
This is something that may come up in my talk at the Research Software Engineer Conference. It is ongoing work, though.