AudioCubes are a collection of wireless intelligent light-emitting objects, capable of detecting each other's location, orientation, and user gestures. They were created by Bert Schiettecatte as electronic musical instruments for use by musicians in live performance, sound design, and musical composition, and for creating interactive applications in Max/MSP, Pd, and C++.
The concept of AudioCubes was first presented by Schiettecatte in April 2004 at the CHI2004 conference in Vienna.[1] An initial prototype of AudioCubes was shown at the Museum for Contemporary Art (MUHKA) in Antwerp in December 2004. AudioCubes were also featured in an art installation created in collaboration with Peter Swinnen during the Champ d'Action Time Canvas festival.[2]
In January 2007, AudioCubes were released to the commercial market[3] and offered for sale on the website of Percussa, a company Schiettecatte founded in October 2004 to promote AudioCubes.
Each AudioCube is identical and has a small built-in computer that can detect the location and orientation of the other cubes in a network and measure the distances between them. AudioCubes also work without drivers and communicate using high-speed HID.
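Because the cubes enumerate as generic HID devices, they can in principle be opened with any cross-platform HID library. The following minimal C++ sketch uses the open-source hidapi library to illustrate the idea; the vendor/product IDs and the report layout are placeholders, not Percussa's actual values.

    #include <hidapi/hidapi.h>
    #include <cstdio>

    int main() {
        hid_init();

        // Placeholder vendor/product IDs -- the real AudioCube IDs differ.
        hid_device *cube = hid_open(0x1234, 0x5678, NULL);
        if (!cube) { std::fprintf(stderr, "AudioCube not found\n"); return 1; }

        unsigned char report[64];
        // Read one HID input report; its layout here is purely illustrative.
        int n = hid_read(cube, report, sizeof(report));
        if (n > 0)
            std::printf("received %d-byte report, first byte = 0x%02x\n", n, report[0]);

        hid_close(cube);
        hid_exit();
        return 0;
    }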
An AudioCube has four onboard infrared sensors (one on each horizontal face) to communicate with and measure distances to nearby objects, digital signal processors (DSPs), a USB-rechargeable battery, and a translucent housing.
The top and bottom faces of each cube do not have any such sensors. Any distance-driven arrangement of cubes is therefore limited to a single horizontal level of interactivity (e.g. a tabletop); that is, the cubes are not typically stacked on top of each other.
AudioCubes work wirelessly with MIDI-compatible software and hardware (e.g. FL Studio, Logic Pro, Reason, drum machines and Monome). A connection must be established using middleware such as MIDIBridge.
AudioCubes also come with an Open Sound Control (OSC) server to send and receive OSC data.[4]
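Since the OSC address layout of the AudioCubes server is not documented here, the C++ sketch below only illustrates the general pattern of exchanging OSC messages with such a server, using the liblo library; the ports and address paths are hypothetical.

    #include <lo/lo.h>
    #include <cstdio>

    // Handler for an incoming sensor message; the address and argument type
    // ("/cube/1/distance", one float) are assumptions for illustration only.
    int on_distance(const char *path, const char *types, lo_arg **argv,
                    int argc, lo_message msg, void *user) {
        std::printf("%s -> %f\n", path, argv[0]->f);
        return 0;
    }

    int main() {
        // Listen for OSC messages from the AudioCubes server (hypothetical port).
        lo_server_thread st = lo_server_thread_new("9001", NULL);
        lo_server_thread_add_method(st, "/cube/1/distance", "f", on_distance, NULL);
        lo_server_thread_start(st);

        // Send an OSC message back, e.g. to set a cube's color (hypothetical path).
        lo_address server = lo_address_new("127.0.0.1", "9000");
        lo_send(server, "/cube/1/color", "iii", 255, 0, 64);

        std::getchar();                   // keep the server thread alive until Enter
        lo_server_thread_free(st);
        lo_address_free(server);
        return 0;
    }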
Several applications have been created for use with AudioCubes, each with a different focus, such as sound design, music composition, live performance, or application development in Max/MSP, Pd, and C++.
In addition, a number of Max/MSP patches were created to work with AudioCubes.
The AudioCubes can be used to send MIDI notes to MIDI-compatible software and hardware using MIDIBridge. When two AudioCubes are placed next to each other, they detect each other and triggers are sent as MIDI notes. These triggers can then be used to control on/off-type signals, such as starting and stopping audio clips in a digital audio workstation (DAW) such as Ableton Live. A different audio clip can be assigned in the DAW to each face of the cube.
The AudioCubes can also measure distances to nearby objects or a performer's hands when configured as a sensor cube in MIDIBridge.[5] This sensor data is sent to the computer as continuous controller (CC) messages, which can be used to control parameters in the DAW. Since each cube has four sensors, up to four parameters can be controlled per AudioCube.
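Both mappings described above can be sketched in C++ with the RtMidi library: a face that detects a neighboring cube becomes a note-on trigger, and a normalized distance reading becomes a CC value. This is an illustration of the concept, not MIDIBridge's actual implementation; the note and CC numbers per face are arbitrary.

    #include <RtMidi.h>
    #include <algorithm>
    #include <vector>

    // Hypothetical per-face note and CC numbers -- not MIDIBridge's real mapping.
    static const unsigned char FACE_NOTE[4] = {60, 62, 64, 65};
    static const unsigned char FACE_CC[4]   = {20, 21, 22, 23};

    // Trigger: a neighboring cube detected on `face` becomes a note-on
    // (note-off when it is removed).
    void send_trigger(RtMidiOut &out, int face, bool on) {
        std::vector<unsigned char> msg = {static_cast<unsigned char>(on ? 0x90 : 0x80),
                                          FACE_NOTE[face], 100};
        out.sendMessage(&msg);
    }

    // Sensor: a normalized distance (0.0 = far, 1.0 = close) becomes a CC value 0-127.
    void send_distance(RtMidiOut &out, int face, double distance) {
        unsigned char value =
            static_cast<unsigned char>(std::min(1.0, std::max(0.0, distance)) * 127.0);
        std::vector<unsigned char> msg = {0xB0, FACE_CC[face], value};
        out.sendMessage(&msg);
    }

    int main() {
        RtMidiOut out;
        // Virtual output port a DAW can subscribe to (macOS/Linux builds of RtMidi).
        out.openVirtualPort("AudioCube MIDI");
        send_trigger(out, 0, true);             // face 0 detected a neighboring cube
        send_distance(out, 2, 0.75);            // a hand close to face 2
        return 0;
    }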
In addition, the RGB colors of the AudioCubes can be controlled and used as visual feedback during a live performance.
The sensors of the AudioCubes can also be used to shape sounds. Moving hands and fingers closer to or further away from the four sensors generates four different MIDI CCs, which can be sent to MIDI-compatible instruments. Used in this way, an AudioCube can be compared to a four-dimensional optical theremin.
The AudioCubes can also be linked to low-frequency oscillators (LFOs) using the software application Evolvor. The LFO waveforms are designed in Evolvor's graphical editors, and each AudioCube is automatically linked to an LFO through topology detection. LFO signals can be added, removed, and recombined by adding, removing, and rearranging AudioCubes.
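Evolvor itself is closed software, but the underlying idea, one free-running LFO per detected cube whose signals are mixed together, can be sketched conceptually in C++ (the struct, frequencies, and waveform below are illustrative, not Evolvor's code):

    #include <cmath>
    #include <cstdio>
    #include <vector>

    // One low-frequency oscillator per detected AudioCube (example frequencies).
    struct Lfo { double freq_hz; };

    // Mix the LFOs of all currently detected cubes at time t; removing a cube from
    // the vector removes its contribution, mirroring the idea of linking LFO
    // signals to the cube topology.
    double mix(const std::vector<Lfo> &cubes, double t) {
        const double pi = 3.14159265358979323846;
        double sum = 0.0;
        for (const Lfo &l : cubes)
            sum += std::sin(2.0 * pi * l.freq_hz * t);
        return cubes.empty() ? 0.0 : sum / cubes.size();
    }

    int main() {
        std::vector<Lfo> cubes = {{0.5}, {1.0}};   // two cubes detected on the table
        std::printf("mixed LFO at t=0.25 s: %f\n", mix(cubes, 0.25));
        cubes.pop_back();                          // one cube removed
        std::printf("mixed LFO at t=0.25 s: %f\n", mix(cubes, 0.25));
        return 0;
    }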
In the Improvisor application, a velocity pattern and a semitone pattern are automatically linked to every AudioCube, and each cube plays the melody created by combining both patterns. When cubes are placed next to each other they can follow the same melody, so music can be composed by mixing and rearranging AudioCubes.
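The pattern mechanism can be illustrated conceptually in C++ (this is not Improvisor's code): a semitone pattern and a velocity pattern are stepped in parallel, and each step yields one note of the melody assigned to a cube.

    #include <cstddef>
    #include <cstdio>
    #include <vector>

    int main() {
        // Illustrative patterns linked to one AudioCube: semitone offsets from a
        // base note plus per-step velocities. Improvisor generates its own patterns.
        const int base_note = 60;                              // middle C
        const std::vector<int> semitones  = {0, 3, 7, 12, 7, 3};
        const std::vector<int> velocities = {100, 80, 90, 110, 90, 80};

        // Stepping both patterns together produces the melody the cube plays.
        for (std::size_t i = 0; i < semitones.size(); ++i)
            std::printf("step %zu: note %d, velocity %d\n",
                        i, base_note + semitones[i], velocities[i]);
        return 0;
    }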
Several tools have been created for making applications for the AudioCubes in Max/MSP, Pure Data, and C++.
AudioCubes have been used by performers such as Mark Mosher,[6] Pearls for Swines, Richard Devine, Steve Baltes, Bostich from Nortec, Ilan Kriger, Arecio Smith, Julien Pauty, and the European Bridges Ensemble.[7]
AudioCubes are an example of a tangible user interface, a field that has attracted considerable research in recent years. The Reactable is another example of such an interface: a tabletop installation on which users move physical objects that are tracked by a camera, with visual feedback projected onto the surface.
For the creation of the AudioCubes, Bert Schiettecatte received an award at the Qwartz Electronic Music Awards in Paris in 2009. He was also invited to give a talk at TEDx Mediterranean in Cannes in September 2010.[8]