Circuit Kisser at Juilliard (2019)

As part of the Center for Innovation in the Arts at Juilliard’s Beyond the Machine program, we invited Juilliard alum, composer, and jazz bassist Dan “Chimy” Chmielinski to bring his band Circuit Kisser to campus for a concert. The band uses a variety of vintage analog synthesizers to create their sound, including fiery EWI playing by saxophonist Chase Baird. I’d had a long association with the band (I mixed their first record, as well as one of their first live shows ever), so it was great to bring them to the festival and let them experiment with multimedia and surround sound.

For this program, the folks at Sunhouse generously lent us a set of their Sensory Percussion sensors, which allow a user to extract a lot of interesting data from drumming. Unlike conventional triggers, these can detect where the drummer strikes the drum head as well as what type of attack is used (e.g. cross stick, rim shot). Combined with data I could extract from the band’s audio (e.g. spectral envelope, amplitude, relative levels between sources), I had a rich variety of information at my fingertips, crucially including sharply defined onsets and strikes.

With that data, I built a multimedia performance environment for the band in Resolume. Using some customized Max/MSP patches, I created projection design that dynamically reacted to different elements of the musicians’ playing. The general architecture: the Sensory Percussion tools ingested and parsed the drumming, Max ingested and analyzed the band’s audio signals, and Max then performed any further analysis on that combined data and routed the results where they needed to go in Resolume. For example, drummer Diego Ramirez’s playing could trigger video clips, deform 3D objects projected on the screen, modulate color or saturation, or drive a variety of other processes depending on the song and section.

One of the challenges in creating a dynamic system like this is making it respond naturally to the performer’s playing, without requiring the performer to do unnatural, unmusical things to achieve a desired interactive effect. To that end, rather than relying too heavily on parameters like position on the drum head, I tended to compute more “musical” information. The relative density of snare and tom hits above a set threshold over a set period of time, for example, proved particularly useful for tracking the intensity of a drum solo.
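The hit-density idea can be sketched in a few lines. This is a minimal illustration in Python, not the actual Max patch: the class name, threshold, and window length are hypothetical, and it assumes each detected hit arrives as a timestamp plus an amplitude (as an onset detector would provide).

```python
from collections import deque


class HitDensityTracker:
    """Count drum hits above an amplitude threshold within a sliding
    time window -- a rough proxy for the intensity of a drum solo.
    All names and default values here are illustrative."""

    def __init__(self, threshold=0.4, window_seconds=2.0):
        self.threshold = threshold
        self.window = window_seconds
        self.hits = deque()  # timestamps of qualifying hits

    def on_hit(self, timestamp, amplitude):
        """Register one detected hit; ignore hits below the threshold."""
        if amplitude >= self.threshold:
            self.hits.append(timestamp)

    def density(self, now):
        """Qualifying hits per second over the trailing window."""
        # Drop hits that have aged out of the window.
        while self.hits and now - self.hits[0] > self.window:
            self.hits.popleft()
        return len(self.hits) / self.window
```

A control value like `density(now)` can then be smoothed and mapped onto a projection parameter (clip opacity, deformation amount, etc.), so the visuals track how busy the drumming is rather than any single stroke.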
Other derived parameters included the regularity of kick hits (a stand-in for syncopation), as well as the overall amplitude of individual members of the ensemble. Since I had clean, bleed-free access to analysis of each drum’s signal, I was able to use Diego’s playing of different drums to control different things on different projection surfaces. In a sense, for much of the show, Diego was not just playing drums, but playing the projection.
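One way to measure regularity of kick hits is the spread of the inter-onset intervals: evenly spaced hits score high, while uneven spacing (suggesting syncopation) scores lower. The sketch below uses the coefficient of variation of those intervals; it is an assumed formulation for illustration, not the method from the original patches.

```python
import statistics


def kick_regularity(onset_times, min_hits=4):
    """Score how evenly spaced recent kick-drum onsets are.

    Returns a value in (0, 1]: 1.0 for perfectly metronomic playing,
    falling toward 0 as the timing grows more irregular -- a rough
    stand-in for syncopation. Parameters are illustrative.
    """
    if len(onset_times) < min_hits:
        return 1.0  # not enough data to judge; assume regular
    # Inter-onset intervals between consecutive kick hits.
    intervals = [b - a for a, b in zip(onset_times, onset_times[1:])]
    mean = statistics.mean(intervals)
    if mean == 0:
        return 1.0
    # Coefficient of variation: spread relative to the average interval.
    cv = statistics.stdev(intervals) / mean
    return 1.0 / (1.0 + cv)
```

Fed with a rolling buffer of recent kick onsets, a score like this changes slowly and musically, which makes it better suited to driving visuals than raw per-hit data.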