S.A.R.A. (Synesthetic Augmented Reality Application) is an app that explores the potential of a mobile device as a unique musical interface through a digitally augmented synesthetic experience.
S.A.R.A. translates the surrounding environment into sound based on the camera input and the movement of the device. It uses generative synthesis, driven by Pure Data (Pd) patches, to convert the device's pitch, roll, and yaw and/or the camera's color values into sound. You can also upload your own Pd file via iTunes and select it in the Pd Settings; see the help section for details on loading your own patches.
In 2012, //benitez_vogl (Margarita Benitez and Markus Vogl) were awarded a National Endowment for the Arts – New Media Artworks Grant to develop S.A.R.A. See benitezvogl.com and s-a-r-a.com for concept, history, source documentation and more info.
//benitez_vogl thanks all our supporters.
Pd value ranges:
- "pitch": -180.0 to 180.0 (degrees)
- "roll": -180.0 to 180.0 (degrees)
- "yaw": -180.0 to 180.0 (degrees)
- "color": 0.0 to 1.0
- "red": 0.0 to 1.0
- "green": 0.0 to 1.0
- "blue": 0.0 to 1.0
Use these value names as receive symbols in your Pd file to access the data.
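As a minimal illustrative sketch (not an official patch shipped with the app), a Pd file could pick up the "pitch" value with a receive object and map its -180.0 to 180.0 range onto an oscillator frequency of 0 to 720 Hz:

```
#N canvas 0 0 450 300 10;
#X obj 50 50 r pitch;
#X obj 50 80 + 180;
#X obj 50 110 * 2;
#X obj 50 140 osc~;
#X obj 50 170 *~ 0.1;
#X obj 50 200 dac~;
#X connect 0 0 1 0;
#X connect 1 0 2 0;
#X connect 2 0 3 0;
#X connect 3 0 4 0;
#X connect 4 0 5 0;
#X connect 4 0 5 1;
```

Here [r pitch] receives the value sent by the app, [+ 180] and [* 2] shift and scale it into an audible frequency range, and the result drives [osc~] into both channels of [dac~] at reduced volume. The same pattern applies to "roll", "yaw", "color", "red", "green", and "blue".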
Original concept, production and technology: //benitez_vogl
Margarita Benitez: Assistant Professor Fashion Design and Fashion Technologist - Kent State University - The Fashion School
Markus Vogl: Assistant Professor Graphic Design Interactive Media - The University of Akron - Myers School of Art
Erik Krupa (2012-2013)
Chris Yanc (2014-present)