Ground-breaking Web Graphics Technology Launches at CSUN 2022, North America’s Largest Assistive Technology Conference
Anaheim, CA — March 16, 2022 —
Imagine trying to visit a web page full of interesting and informative graphics if you couldn't see. You would get the text read out to you with a screen-reader, but descriptions of the images would be limited, at best. That's the situation faced today by millions of blind Internet users. Until now.
At the CSUN Conference, taking place this week in Anaheim, California, the IMAGE project, a collaboration between researchers at McGill University, Gateway Navigation, and the Canadian Council of the Blind (CCB), is announcing the beta release of its free web browser extension. The extension provides users with a rich audio sonification and, optionally, a tactile representation of several types of web graphics.
The IMAGE project arose from challenges faced by blind users trying to navigate the Vancouver Convention Centre using an online floor plan of the building. As David Brun, founder of Gateway Navigation, explains: “The online plan was a photograph; so, the directional information it offered to sighted visitors was not accessible to me. This is a challenge faced by millions of people worldwide every day.” Now, with the initial release of IMAGE, an open-source tool designed and developed by over twenty researchers from McGill’s Shared Reality Lab together with over fifty co-design participants from across Canada who are blind, deafblind, or partially sighted, we are witnessing a first step toward removing the barriers to accessible internet graphics for everyone.
Jeff Blum, Technical Project Manager for the IMAGE project, explains, “For example, by using spatial audio, where the user experiences the sound moving around them through their headphones, information about the spatial relationships between objects in a scene can be conveyed quickly, without long descriptions being read out. In addition, rather than only passively listening to audio, the user can employ an optional haptic device to literally feel aspects of a graphic, such as the regions of a landscape, the objects found in a photo, or the trend of a line on a graph. This permits interpretation of maps, charts, and photographs, with the visual experience replaced by multimodal sensory feedback, rendered in a manner that helps overcome access barriers for users who are blind, deafblind, or partially sighted.”
Prof. Cooperstock, Director of the Shared Reality Lab, elaborates, “Our project is designed to be as freely available as possible, as well as extensible so that other technologists, artists, or companies can produce new experiences for specific graphical content that they know how to best render. For example, if someone has a special way of rendering stock market charts, they do not have to reinvent the wheel, but can create a module that focuses on their specific audio and haptic rendering and plug it into our overall system.”
Brun continues, “This collaboration is made possible through funding from Innovation, Science and Economic Development Canada, and through the guidance of the Canadian Council of the Blind (CCB), the voice of the blind in Canada, along with Dot Inc. from South Korea and Haply Robotics from Montreal, Canada.”
We look forward to collaborating with, partnering with, and supporting all users, researchers, developers, and organizations interested in building on IMAGE’s open-source design and its commitment to greater accessibility on the internet for everyone.
To learn more about IMAGE and how to financially support IMAGE’s ongoing work, visit: https://gnc3.com/go-image
For more information, contact:
PR Contact Name: Jeremy Cooperstock and Jeff Blum
Organization: McGill University’s Shared Reality Lab
Phone number: 514.558.3953
PR Contact Name: David Brun
Organization: Gateway Navigation
Phone number: 604.499.4818
For more information on IMAGE – Chrome Extension AI Accessibility at Work: