The initial version of the prototype, called CymaSense, generates Cymatic shapes in response to notes triggered by the client. The shapes appear either in the centre of the screen or on either side, depending on whether the single-user or the two-user version is running. The aim of this multi-sensory, audio-visual feedback is to encourage the client to explore play within a musical context. Input can come from a microphone, a MIDI instrument, or a shared interactive surface designed to encourage musical and social interaction.
The prototype is currently being tested and evaluated in a number of ways within an ongoing study with autistic clients. One approach is to project it onto a wall: a microphone placed in the room picks up the sound of whatever instrument is being played, and that sound is translated into Cymatic shapes according to its pitch, volume and tone. Another popular approach uses an interactive table built by the Sensatronic Lab project (http://www.sensatroniclab.co.uk/): a contact microphone attached to a Perspex table top picks up vibrations, and the shapes are projected back from underneath, so the surface behaves like an interactive table.
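As a rough illustration of the kind of mapping involved, the sketch below shows how pitch, volume and tone might drive shape parameters. All of the parameter names and mapping rules here are hypothetical: the actual CymaSense mapping is not described in this post, and "brightness" is used as a stand-in measure of tone.

```python
# Hypothetical sketch only: illustrative mapping from audio features
# to Cymatic shape parameters, not the actual CymaSense implementation.

def audio_to_shape(pitch_hz, volume, brightness):
    """Map audio features to illustrative shape parameters.

    pitch_hz:    fundamental frequency in Hz
    volume:      normalised amplitude, 0.0 to 1.0
    brightness:  normalised spectral brightness, 0.0 to 1.0,
                 standing in for 'tone'
    """
    # Higher pitches yield more nodal rings, loosely echoing how
    # Chladni-plate patterns grow more intricate with frequency.
    rings = max(1, int(pitch_hz / 110))
    # Louder notes produce larger shapes.
    size = 0.2 + 0.8 * min(max(volume, 0.0), 1.0)
    # Brighter timbres add more edge detail.
    complexity = 1 + int(brightness * 7)
    return {"rings": rings, "size": size, "complexity": complexity}
```

For example, a concert A (440 Hz) played at half volume with a fairly mellow timbre would map to a mid-sized shape with four rings and low edge complexity under this scheme.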
Below are a few YouTube videos of the prototype in action. The first is a screen recording of some basic experimentation with a MIDI keyboard; the others demonstrate CymaSense being used with the interactive table. Videos with vocal experimentation will be added in due course.
The first links are brief demos of CymaSense used with a couple of simple keyboard sounds: monophonic, visualising one Cymatic shape at a time, and polyphonic, visualising many shapes simultaneously. For better resolution, please watch in HD.
Click the YouTube links below for clips of the CymaSense prototype used with the interactive table: