
PictureSensation - A Mobile Application to Help the Blind Explore the Visual World Through Touch and Sound

Michael Banf et al. J Rehabil Assist Technol Eng.

Abstract

We present PictureSensation, a mobile application for the hapto-acoustic exploration of images. It is designed to give the visually impaired direct perceptual access to images via an acoustic signal. PictureSensation introduces a swipe-gesture-based, speech-guided, barrier-free user interface to guarantee autonomous usage by a blind user. It implements a recently proposed exploration and audification principle, which harnesses exploration methods that the visually impaired are used to from everyday life. In brief, a user explores an image actively on a touch screen and receives auditory feedback about its content at the current finger position. PictureSensation provides an extensive tutorial and training mode, allowing a blind user to become familiar with the use of the application itself as well as the principles of image-content-to-sound transformations, without any assistance from a sighted person. We show our application's potential to help visually impaired individuals explore, interpret and understand entire scenes, even on small smartphone screens. Providing more than just verbal scene descriptions, PictureSensation presents a valuable mobile tool to grant the blind access to the visual world through exploration, anywhere.
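The exploration principle described above, combined with the opponent-colour sonification shown in Figure 3, can be illustrated with a minimal sketch. This is not the authors' implementation; the opponent-colour encoding and the pitch/timbre mapping below are simplified assumptions chosen only to make the idea concrete: the pixel under the fingertip is converted to opponent-colour channels, which are then mapped to complementary sound characteristics.

```python
# Illustrative sketch (assumed mapping, not the published model): the colour
# under the user's fingertip is sonified via a simple opponent-colour encoding.

def rgb_to_opponent(r, g, b):
    """Convert an RGB triple (0-255) to a crude opponent-colour encoding:
    luminance in [0, 1], plus red-green and blue-yellow axes in ~[-1, 1]."""
    r, g, b = r / 255.0, g / 255.0, b / 255.0
    luminance = (r + g + b) / 3.0
    red_green = r - g                 # >0: reddish, <0: greenish
    blue_yellow = b - (r + g) / 2.0   # >0: bluish,  <0: yellowish
    return luminance, red_green, blue_yellow

def opponent_to_sound(luminance, red_green, blue_yellow):
    """Map opponent channels to hypothetical sound parameters: brighter
    pixels get a higher pitch; each opponent axis weights a complementary
    timbre, so opponent colours never sound alike simultaneously."""
    pitch_hz = 220.0 + 660.0 * luminance      # black -> 220 Hz, white -> 880 Hz
    warm = max(red_green, 0.0)                # red-ish drives a 'warm' timbre
    cool = max(-red_green, 0.0)               # green-ish drives a 'cool' timbre
    return {"pitch_hz": pitch_hz, "warm": warm, "cool": cool,
            "blue_yellow": blue_yellow}

# Example: sonification parameters for a bright red pixel under the finger.
params = opponent_to_sound(*rgb_to_opponent(255, 0, 0))
```

In the actual application the finger position would first be mapped to a pixel via the touch-screen coordinates, and the resulting parameters would drive a synthesizer in real time as the finger moves.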

Keywords: Visually impaired; accessibility; computer vision; human computer interaction; mobile computing.

Conflict of interest statement

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Figures

Figure 1.
An illustration of our barrier free user interface design principle. Two-finger swipe gestures are used to navigate through the different modes of the application.
Figure 2.
(a) The PictureSensation application in exploration and (b) colour training mode in comparison with (c) the original desktop-based research prototype.
Figure 3.
(a) The audible colour space representation as first proposed by Banf and Blanz. MIDI instruments represent opponent colours. (b) The novel sonification model used in PictureSensation. Opponent colours are represented by complementary sound characteristics.
Figure 4.
(a) Predicted man-made structures (white squares). (b) Colour-coding of the sonified edge orientations.
Figure 5.
Object regions, audified using auditory icons.
Figure 6.
Image set used for evaluation. Results are given in Table 1.


References

    1. Adams D, Morales L and Kurniawan S. A qualitative study to support a blind photography mobile application. International Conference on PErvasive Technologies Related to Assistive Environments. Rhodes, Greece, 29–31 May 2013, pp.1–8. New York, NY: ACM.
    2. Banf M and Blanz V. A modular computer vision sonification model for the visually impaired. International Conference on Auditory Display (ICAD 2012). Atlanta, Georgia, 18–21 June 2012.
    3. Banf M and Blanz V. Man made structure detection and verification of object recognition in images for the visually impaired. International Conference on Computer Vision / Computer Graphics Collaboration Techniques and Applications. Berlin, Germany, 6–7 June 2013. New York, NY: ACM.
    4. Banf M and Blanz V. Sonification of images for the visually impaired using a multi-level approach. Augmented Human International Conference. Stuttgart, Germany, 7–8 March 2013, pp.162–169. New York, NY: ACM.
    5. BlindSquare. http://blindsquare.com (2015, accessed November 2015).
