Thus far, feedback to the performer has consisted of the output sounds and the visible functioning of the Max/MSP patch on-screen. A further area of development is to provide the performer with a more comprehensive visual representation of the piece's sonic activity to respond to (e.g. via IRCAM analysis tools and Jitter). We considered this as one way of addressing the balance between the information flowing from performer to computer and vice versa. Visual feedback for an aural event raises many questions, but given that so many 'reifications' in Western music are visually oriented (e.g. the score and gestural cues), we consider it a relevant direction for further research. Kanno has performed James Wood's Autumn Voices (2001) for violin and live electronics and Alwynne Pritchard's To the Ground (2005) for e-violin and electronics, and so has relevant experience in this area.