Abstract
In this paper we present a system for real-time computer music performance (live electronics) and live visuals based on the behavior of a fish in an aquarium. The system comprises (1) an aquarium with a fish; (2) a computer vision module; (3) a visual display of the fish overlaid with graphical elements controlled by the user; (4) a sound synthesis module; and (5) a standard MIDI controller. The musical expression and graphic generation are a combination of the fish's movements and decisions made by the performer in real time. By making use of a live animal, the system introduces indeterminacy and natural gestures into the sound being generated. The match between sound and image exhibits some semantic redundancy, aiming at a more narrative compositional approach in which the fish is the main character. The system is targeted at soundscape composition and electroacoustic music featuring a high degree of improvisation.