Augmented Reality is usually associated with the overlay of digital images onto the user’s environment. However, this understanding of AR disregards the non-visual aspects of our experience and perception of reality. Sound, for example, is already an important channel in navigation systems and museum tour guides. As part of her PhD research into interaction models in interactive art, Hanna Schraffenberger investigates non-visual augmented reality art. Below are a few examples of interesting existing non-visual AR systems.
Audio Space (2005) is a 3D interactive sound environment by Theodore Watson. Wearing a headset, users can leave sound messages at any position within the augmented aural space and, as they move through it, listen to the sounds left by previous visitors. In later versions of the installation, the voices are transformed into sonic structures which “create a rich and layered sonic environment.”
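Positional playback of this kind is usually approximated with distance-based attenuation: a sound left at some point in space gets quieter the further the listener moves from it. The sketch below is purely illustrative (it is not Watson’s actual implementation); it uses the inverse-distance model common in 3D audio engines, with hypothetical `rolloff` and `min_dist` parameters:

```python
import math

def gain(listener, source, rolloff=1.0, min_dist=1.0):
    """Inverse-distance attenuation for a positioned sound.

    Returns full gain (1.0) within min_dist of the source and
    falls off smoothly with distance beyond it.
    """
    d = math.dist(listener, source)
    if d <= min_dist:
        return 1.0
    return min_dist / (min_dist + rolloff * (d - min_dist))

# A visitor at the origin hears a nearby message at full volume
# and a distant one much more quietly.
near = gain((0, 0, 0), (0.5, 0, 0))   # inside min_dist -> 1.0
far = gain((0, 0, 0), (10, 0, 0))     # 10 m away -> 0.1
```

Mixing every stored message with its own gain like this yields the effect of sounds “located” in the room, without any actual acoustics simulation.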
RjDj is an iPhone app that produces musical remixes of environmental sounds in real time. The user wears ordinary headphones or earbuds; the sounds of the environment are picked up by the phone’s built-in microphone and remixed on the fly. The user can furthermore select among many different scenes (comparable to musical genres), which determine how the sounds are remixed, and users can contribute by creating scenes of their own.
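The core idea of remixing a live input can be illustrated with one of the simplest real-time effects, a feedback delay: each incoming sample is mixed with a scaled copy of the output from a fixed number of samples earlier. This sketch is only an illustration of the principle, not RjDj’s actual signal processing, and the `delay` and `feedback` parameters are invented for the example:

```python
from collections import deque

def echo(samples, delay=3, feedback=0.5):
    """Feedback delay: each output sample is the input plus the
    output from `delay` samples ago, scaled by `feedback`.
    Processes one sample at a time, as a real-time effect would."""
    buf = deque([0.0] * delay, maxlen=delay)  # ring buffer of past output
    out = []
    for s in samples:
        mixed = s + feedback * buf[0]  # buf[0] is `delay` samples old
        buf.append(mixed)
        out.append(mixed)
    return out

# A single impulse produces decaying repeats every `delay` samples.
print(echo([1, 0, 0, 0, 0, 0, 0]))
# -> [1, 0.0, 0.0, 0.5, 0.0, 0.0, 0.25]
```

A scene in an app like this amounts to a particular chain of such effects, applied continuously to the microphone stream.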
Toozla is a mobile service that combines GPS positioning with audio tours and stories, user-generated content, and local information, providing an all-in-one travel guide that can be taken anywhere.
Antal Ruhl investigates the effects of head-movement-dependent galvanic vestibular stimulation (GVS) on daily activities. The device he constructed delivers low-current electrical pulses based on the orientation of the user’s head, altering the wearer’s sense of balance: the current makes people involuntarily sway to the left or right. The setup can improve balance performance or create the illusion of being in a different medium (e.g. under water or in outer space).