Intuiface accessibility sandbox: an experimental framework to make experiences more accessible

As outlined in this blog post by @geoff, an accessible interactive experience (XP) is one that works for as many people as possible. Intuiface offers a variety of features for designing inclusive XPs, namely Text to Speech (TTS), Speech Recognition, Keyboard Events, and Gestures on Assets.

This experiment was inspired by Microsoft's accessibility guidelines, as well as by the iOS accessibility implementation. In essence, a touch experience needs to either:

  1. Support at least one alternative input interface that does not require the same senses to use it; or
  2. Provide feedback via various sensory modes.

This demo showcases the use of keyboard input, aural feedback via Text To Speech (TTS) and audio cues, and visual feedback for the elements in focus, all coexisting with standard touch and mouse interactions.

Challenges and technical implementation

In building this demo, I encountered a few design challenges:

  • Where to store the metadata that would be read by the Text To Speech (TTS) Interface Asset
  • How to read the changing status of certain assets (e.g. a toggle button that is checked) and communicate it back to the user via TTS
  • How to have the same keyboard keys perform different tasks
  • How to make keyboard events work seamlessly with mouse / touch input

Ultimately, I settled on a solution that keeps most of the data in Excel, the core functionality in buttons living outside the screen on an experience layer, and, inevitably, some hard-coded conditional triggers at the scene level.

To dynamically trigger object actions from Excel, I used formulas to generate remote commands that are invoked via the Call URL action. This feature is usually used to control third-party apps, but you can also target the XP itself.

(Screenshot: Triggers XLS)
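
To make the idea concrete, here is a minimal sketch of such a formula. The column layout, the object and action names, and the base URL in cell E1 are all made up for illustration; the actual endpoint and URL syntax depend on how remote actions are exposed to your Player, so treat this as a template rather than a working command:

    E1 (base URL, placeholder):     http://localhost:8000/your-remote-action-endpoint/
    A2 (target object name):        NextButton
    B2 (action name):               press
    C2 (generated remote command):  =CONCATENATE($E$1, A2, "?action=", B2)

The Call URL action then simply fires whatever string ends up in the command column for the row matching the element currently in focus.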

Along the same lines, the TTS string is formed dynamically to inform the user about the type of object in focus and its status.
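
As a sketch of that idea (again with hypothetical column names and sample values), a formula can stitch the label, the object type, and the current state into the sentence spoken by the TTS Interface Asset when the object gains focus:

    A2 (object type):  Toggle button
    B2 (label):        Subtitles
    C2 (state):        checked
    D2 (TTS string):   =CONCATENATE(B2, ", ", A2, ", ", IF(C2="checked", "checked", "not checked"))

For this row the result reads "Subtitles, Toggle button, checked"; when the state cell changes, the spoken feedback follows.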

This implementation is far from comprehensive, and I applied it to only a handful of Intuiface assets. That said, I hope the code and methods in this XP will save you some time in your own accessible Intuiface developments.

Special thanks to @Seb and the Intuiface product team for making these accessibility extensions available to all of us.

Hi Paolo @tosolini!

I took some time to look at how you built this experience, especially how you handled the notion of focus and the next/previous actions. This is brilliant!
The combination of Simple Counter / Excel filtering / Remote action is something I would never have thought of.

Thanks for teaching me about Intuiface! :slight_smile:

@seb thanks for the kind feedback :slight_smile: I’m glad you liked the solution. We are all constantly learning from each other and this is what I enjoy about the Intuiface community.

I’m pleased to announce v2 of the Intuiface Accessibility Sandbox.

Inspired by @Seb's Multi-Modal Experience and the recent announcement of the new Reference Designs, I updated the code in my experimental framework to support touchless interactions.

This version allows:

  • Sequentially moving focus across the various interactive elements of a page using the keyboard, speech recognition, or a mobile remote control
  • Visually highlighting the objects in focus
  • Reading the alternative text associated with objects in focus (e.g. button, image) via Text To Speech (TTS)
  • Interacting with key assets (e.g. normal and toggle buttons, asset flows, and video players)

Feel free to use this code in your own projects.
