Audio storytelling is an avenue into other worlds. So when the Guardian was approached to take part in an experimental project to make journalism more accessible to low-vision and blind users, it felt like an opportunity we couldn’t turn down. Audio has always been about making stories more accessible, and this was an opportunity to push that even further.
The result is a storytelling website called Auditorial, created to showcase the possibilities of accessible stories for blind and low-vision audiences. The story is our own, paired with Google technology and the invaluable accessibility user-testing and expert advice provided by the Royal National Institute of Blind People (RNIB) – an example of what can be done when inclusive design and thinking are at the forefront from the start.
The website, which was created over a seven-month period, was born out of an episode of our Science Weekly podcast from 2018. And the story, like the original podcast, centres on Bernie Krause, one of the founders of a field known as soundscape ecology. Over 15 or so minutes, we use his story to explore the devastating effects of the climate crisis and other human-induced environmental destruction on the sounds of the natural world, from coral reefs to Costa Rican rainforests.
The Auditorial platform uses an assortment of accessibility features and tools to tell the story, including audio description, multimodal films with video and audio speed control, high contrast, text-only mode, and scale and focus controls. Users can press play to start the story and adjust the audio, visual and written settings as they are taken through the story.
The final product is something we are really proud of. There have been many lessons learned along the way, and our ideas of what we would end up with have changed as the project progressed; we were trying to do something that had never been done before. The hope is that we can go on to apply some of its key tenets to more of our journalism – and encourage others to do the same.
Many low-vision and blind users currently access journalism online through screenreading software, which converts text into audio. This is often done in a synthetic voice and doesn’t always discriminate between essential text and other aspects, meaning the experience can be jarring.
But, as with podcasts, when a story is presented in audio, the result is a more immersive experience, where sound design and intonation can add emphasis and emotion, and characters are able to tell their story in their own words. While this won’t be possible for all of our online journalism, it’s something we should consider when thinking about things such as how we label images using something called “alt tags”.
For those using screenreading software, alt tags are essential for letting users know what an image shows. And while most websites – including the Guardian’s – do provide these, they are often written as succinct labels. This can lead to quite a disjointed narrative experience.
So a big lesson for our team was how to make alt tags more descriptive and more in line with the narrative. They should feel part of the reader experience and, if done correctly, should play a role in telling the story to a person using assistive technologies.
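To make the idea concrete, here is a minimal sketch of the difference between a terse label and a narrative alt tag. The image file and the wording are hypothetical examples, not taken from Auditorial itself:

```html
<!-- A terse label, typical of most sites. A screenreader user
     hears only the subject's name, breaking the story's flow: -->
<img src="krause-recording.jpg" alt="Bernie Krause">

<!-- A more descriptive alternative that carries the narrative
     for a person using assistive technologies: -->
<img src="krause-recording.jpg"
     alt="Bernie Krause crouches in a misty Costa Rican rainforest
          at dawn, holding a microphone up toward the canopy.">
```

The descriptive version is read aloud in sequence with the surrounding text, so it can do storytelling work rather than simply labelling the picture.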
An important part of the project was providing visuals to enhance the story for users with low vision or with light or colour sensitivities.
We addressed this by giving users the opportunity to choose between black and white, yellow and black, and blue and white, which are popular combinations. But Google was also able to introduce light and dark modes – a real game-changer for people who struggle with bright screens.
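Contrast themes and light/dark modes like these can be sketched in a few lines of CSS. The selectors and colour values below are illustrative assumptions, not Auditorial’s actual stylesheet:

```css
/* Default (light) palette, overridable by theme classes below. */
:root {
  --bg: #ffffff;
  --fg: #121212;
}

/* Respect the user's system-level dark-mode preference,
   helping people who struggle with bright screens. */
@media (prefers-color-scheme: dark) {
  :root {
    --bg: #121212;
    --fg: #f5f5f5;
  }
}

/* Hypothetical theme classes for the high-contrast pairings
   mentioned above: yellow on black, and blue and white. */
.theme-yellow-black { --bg: #000000; --fg: #ffee00; }
.theme-blue-white   { --bg: #ffffff; --fg: #003399; }

body {
  background: var(--bg);
  color: var(--fg);
}
```

Switching themes is then a matter of toggling a single class on the page, while the media query picks up the reader’s own operating-system preference automatically.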
The Guardian has always been dedicated to digital innovation. When new storytelling formats and platforms emerge, we try to consider how these technologies will work for our audiences, and experiment with them to bring Guardian journalism to life. Auditorial is just the latest iteration of that.
While we continue to try to make our journalism as accessible as possible, there are always going to be things we can improve on. Throughout this project, we have learned so much from our partners at Google and the RNIB about accessibility and inclusive product design – findings we are really excited to be able to share with our readers everywhere.