Developed in collaboration with Arto Lindsay, with recent performances in Marfa, Texas and Brooklyn, New York, this distributed multichannel sound system takes live instrument input signals, processes them through Ableton/MaxMSP, and outputs them through 12 discrete channels for polyrhythmic distribution of sounds in time and space.
Design and programming of hard/firm/software for networked control, sensing, and sequencing of pneumatic sculptures by Chico MacMurtrie at the Muffathalle in Munich. Each piece has its own Raspberry Pi, which handles network communication and runs local control algorithms via an Arduino MEGA fitted with specially designed PCB shields that operate the valves and read the sensors. Puredata translates UDP network data to and from standard MIDI for master sequencing in Ableton Live through each piece’s custom Max For Live device.
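The UDP-to-MIDI translation can be sketched in a few lines. The three-byte payload layout, the channel mapping, and the function name here are assumptions for illustration only, not the actual Pd patch:

```python
import struct

# Hypothetical 3-byte UDP payload: (piece_id, sensor_number, value).
# Translating it into a MIDI Control Change lets Ableton record and sequence it.
def udp_to_midi_cc(payload: bytes) -> bytes:
    piece_id, sensor, value = struct.unpack("BBB", payload)
    status = 0xB0 | (piece_id & 0x0F)       # CC status byte on the piece's channel
    return bytes([status, sensor & 0x7F, value & 0x7F])

# Piece 2, sensor 10 (say, a pressure reading), value 64:
udp_to_midi_cc(struct.pack("BBB", 2, 10, 64))  # three MIDI bytes: 0xB2, 10, 64
```

The same mapping run in reverse turns sequenced MIDI from Ableton back into per-piece UDP messages.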
Electrical design, firmware, and a PCB breakout board for Arduino PID control of the 24V actuators that move the panels behind the kinetic wall.
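The control loop behind such firmware is a standard PID; here is a minimal sketch in Python of that textbook algorithm (gains, timestep, and class name are illustrative, not the tuned values in the actual firmware):

```python
class PID:
    """Minimal positional PID loop of the kind a panel-positioning firmware
    might run. Gains (kp, ki, kd) and timestep (dt) are placeholders."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        # Classic three-term sum: proportional + integral + derivative.
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

Each loop iteration, the output would be scaled to a 24V drive signal (e.g. a PWM duty cycle) for the actuator.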
Developed and built for the Tsai Art and Science Foundation, this ACU is an updated version of Tsai’s original design, which takes audio input from a microphone and translates it into a control voltage that precisely modulates the frequency of a strobe light synchronized to the RPM of the motor driving a kinetic sculpture.
After analyzing the original analog CMOS circuits, I developed a hybrid design that employs the same analog input and output stages but uses an Arduino to handle the analysis and timing stages of the circuit for improved range and stability.
Like Tsai’s originals, this version has its power supply built in, 1/4″ jacks for the microphone input (which bypasses the built-in microphone) and strobe output, two trimpots for fine adjustment, and a large knob for microphone sensitivity.
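The analysis stage amounts to measuring the frequency of the conditioned microphone signal so the strobe can track it. A zero-crossing counter sketches the idea; this is an illustration of the technique, not the actual Arduino code:

```python
def zero_cross_freq(samples, sample_rate):
    """Estimate signal frequency by counting rising zero crossings over a
    known duration -- a software stand-in for the analysis stage a
    microcontroller can perform on a conditioned audio signal."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a < 0 <= b)
    return crossings / (len(samples) / sample_rate)

# A 10 Hz alternating test signal sampled at 20 Hz for one second:
zero_cross_freq([-1, 1] * 10, 20)  # → 10.0
```

The measured frequency would then set the strobe's flash interval, holding it in sync with the motor's RPM.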
A custom microphone installation into Michael Evans’ amplified snare drum. The microphone can be fed back into the internal speakers through the resonating skins and combined with contact microphones for a wide range of sounds that can be manually manipulated.
Drums on debut EP from the Zolephants.
“The Zolephants are a protoplasmic-surf-spaghetti-western band from the foothills of Brooklyn. They combine deep-space synthesizers and electronics with silver-surfing baritone guitars and cathartic drumming.”
This is a pedal that I made to control the motor speed of the fans in a vibraphone. It’s designed to fit any Musser frame’s motor and pivots side-to-side so that the player’s full weight is supported, leaving the other foot to press the sustain pedal. The fans move slower when turned to the left, faster to the right, freeing both hands to play as the vibrato is increased or decreased.
A track on which I played percussion with Nettle last February has been released on this new Rebetika compilation from Soundeyet.
A suite of free audio software that I designed and developed with DJ/Rupture and Rosten Woo, dedicated to exploring non-western and poetic notions of sound in interaction with alternative interfaces. This video shows them in action. For more info and a free download, see is.gd/sufiplugins. We are working on porting these devices to the VST plugin format and will set to work on a piece of Sufi Hardware in the near future.
I’ll be playing the drums for this, a live, start-to-finish performance of Brian Eno’s brilliant 1974 album Here Come the Warm Jets.
A screenshot of the mute and fader automation software I developed for James Murphy’s Oram console (with a master section by Purple Audio) at DFA’s Plantain Recording House. This stand-alone automation sequencer gets its sample-accurate sync information from Logic, Pro Tools, or any supported audio sequencer via ReWire. The sample count provided by ReWire is downsampled (according to the sample rate of the session) to 30 frames per second, and a serial count of these integers provides the timeline on which events are recorded. The sequencer features both a static mode for simple snapshot recall and a dynamic mix mode with read, write, touch, and latch modes for fully automated analog mixing. Recent additions include graphical ‘pencil’ editing and subgroup/master fader automation trimming.
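The downsampling step is simple integer arithmetic; a sketch, assuming 30 fps and integer frame indices as described (the function is illustrative, not the actual Max code):

```python
def samples_to_frame(sample_count, sample_rate, fps=30):
    """Downsample a running ReWire sample count to an integer 30 fps
    frame index -- the timeline on which automation events are recorded."""
    return sample_count * fps // sample_rate

# At 44.1 kHz, one 30 fps frame is 1470 samples:
samples_to_frame(1470, 44100)   # → 1
samples_to_frame(44100, 44100)  # → 30 (one second in)
```

Because the count is derived from the host's sample clock rather than from MIDI time code, event positions stay sample-accurate regardless of the session's sample rate.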
After trying and trying to come up with a bow wheel that could maintain effective friction on a surface while isolating the noise of the motor and mechanics from the acoustic resonating chamber, I decided to investigate making an electromagnetic bowing device, like the E-bow. Pictured above is the prototype for such a device: a physical oscillator. There are two coils around AlNiCo (aluminum-nickel-cobalt alloy) magnetic poles, one a pickup and the other a driver. These are wired to the input and output of an audio amplifier IC and fed back into one another through the spring-steel tongue (pictured here with a piezo element under its bridge terminating at the 1/4” jack). A digital potentiometer regulates the amount of voltage driving the circuit for dynamic control. Staccato articulation can be achieved by instantly reversing the electrical polarity of the driver to stop the vibration. The gain of the audio amplifier goes quite high, all the way to the 12V rail supplying it, and reads as a square wave on a scope even at low gain settings. But because the steel tongue, like a speaker cone in a back-feeding guitar amplifier, is physically unable to jump to the +/- DC poles and has to ‘slide’ between them, a near-perfect sine wave results.
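The square-in, sine-out behavior is resonant filtering: the tongue's mechanical resonance passes the drive signal's fundamental and rejects its harmonics. A toy two-pole digital resonator driven by a square wave shows the effect numerically (all parameters are arbitrary; this is a cartoon of the physics, not a model of the actual tongue):

```python
import math

def resonator(x, f0, sr, r=0.999):
    """Two-pole resonator at f0 Hz -- a crude numerical stand-in for the
    steel tongue's mechanical resonance (r sets the sharpness)."""
    w0 = 2 * math.pi * f0 / sr
    a1, a2 = 2 * r * math.cos(w0), -r * r
    y, y1, y2 = [], 0.0, 0.0
    for xn in x:
        yn = xn + a1 * y1 + a2 * y2   # feedback through the resonance
        y.append(yn)
        y1, y2 = yn, y1
    return y

def dft_mag(x, f, sr):
    """Magnitude of a single DFT bin at frequency f."""
    n = len(x)
    re = sum(v * math.cos(2 * math.pi * f * i / sr) for i, v in enumerate(x))
    im = sum(v * math.sin(2 * math.pi * f * i / sr) for i, v in enumerate(x))
    return math.hypot(re, im) / n

sr, f0 = 8000, 100
square = [1.0 if math.sin(2 * math.pi * f0 * i / sr) >= 0 else -1.0
          for i in range(sr)]
out = resonator(square, f0, sr)
# In `out`, the 100 Hz fundamental towers over the square wave's odd
# harmonics (300 Hz, 500 Hz, ...): the output is close to a sine.
```

Comparing `dft_mag(out, 100, sr)` with `dft_mag(out, 300, sr)` confirms the third harmonic is attenuated by orders of magnitude relative to the fundamental.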
Batucada is a pattern-based percussion sequencer built in MaxMSP and designed for live performance using a PowerBook. Controlled mostly by the ‘qwerty’ keyboard, Batucada allows the performer to improvise with patterns and ‘grow’ new ones live. It features variable-division repeats with the right hand, while the left hand selects the instruments to which the repeats will be applied. Mute and Solo, like repeat, are applied with the right hand to instruments selected with the left. Signal-based timeline warping (‘swing’) and offset can be handled either globally or locally, allowing different instruments to have their own swing and/or offset settings. Batucada can run as a master sync device or be slaved to incoming MIDI time code, MIDI beat clock, ReWire, or a traditional audio click track. In addition to MIDI note commands, Batucada can send 7- and 14-bit MIDI LFO information as well as two ±5V control voltages for interfacing with analog synthesizers and processors. Within the software, I’ve implemented ‘VeLFOs’ (7-bit scalable triangle/sine/square-wave LFOs) synced to each track (with individual offsets) for governing velocity, but this data can be mapped to anything from filter cutoff to event-execution probability. Timing is coupled to the sample rate by the scaling of signals and is only subjected to Max’s scheduler when turned into MIDI note messages at the output.
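The VeLFO idea, an LFO sampled once per step with a per-track phase offset, can be sketched like this (shape names, value ranges, and the function itself are assumptions for illustration, not Batucada's actual code):

```python
import math

def velfo(step, period, shape="sine", offset=0, lo=1, hi=127):
    """7-bit LFO sampled once per sequencer step, with a per-track phase
    offset -- a sketch of the 'VeLFO' concept described above."""
    phase = ((step + offset) % period) / period
    if shape == "sine":
        v = (math.sin(2 * math.pi * phase) + 1) / 2
    elif shape == "triangle":
        v = 1 - abs(2 * phase - 1)
    else:  # square
        v = 1.0 if phase < 0.5 else 0.0
    return lo + round(v * (hi - lo))

# A velocity curve over one 16-step bar:
[velfo(s, 16, "triangle") for s in range(16)]
```

The same per-step values could just as easily drive filter cutoff or event-execution probability, as the entry notes.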
Development of the above-pictured MSP version is frozen, and I am currently working on a new build of Batucada in Pd, utilizing Pd’s sample-accurate event scheduling and driven by a clock that will also implement Euclidean GCD/LCM-based counters for creating rhythmically compelling math-beats. Almost all of the performance functionality that inspired development of Batucada has been eclipsed by recent versions of Live.
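Euclidean patterns of the kind mentioned can be generated with a simple accumulator that distributes pulses as evenly as possible across a cycle of steps; a sketch of that well-known technique (not the planned Pd implementation):

```python
def euclidean(pulses, steps):
    """Euclidean rhythm via an error accumulator: spread `pulses` onsets
    as evenly as possible over `steps` slots."""
    pattern, bucket = [], 0
    for _ in range(steps):
        bucket += pulses
        if bucket >= steps:          # overflow: place an onset here
            bucket -= steps
            pattern.append(1)
        else:
            pattern.append(0)
    return pattern

euclidean(3, 8)  # → [0, 0, 1, 0, 0, 1, 0, 1], a rotation of the tresillo
```

Chaining such counters with different (pulses, steps) pairs, and cycles whose lengths share GCD/LCM relationships, yields the interlocking "math-beats" the entry describes.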
Miniature, modular instruments designed to affix to virtually any structure, thereby allowing the composer musical control of anything from a battery of specially designed instruments to structural surfaces within pre-existing architectural space.
With an emphasis on simplicity, each of these mechanisms usually consists of only one electromechanical actuator (a rotary motor or a linear solenoid), which responds to varying degrees of supply voltage remotely regulated by a microcontroller. This single-actuator design philosophy demands that all mechanical movement within the instrument be subordinate to the physical capabilities of the lone motor or solenoid employed, and while this may sound like a limitation, such use of mechanical design (as opposed to more ‘intelligent’ electronic design) yields a reliability, mechanical consistency, and modularity that would otherwise not be possible.
Each device can be fitted with a variety of harnesses for mounting and is connected to the brain (a box containing the PIC microcontroller and DC power supply) via a single run of cable. Thus, the microcontroller administers the appropriate voltage to hit, shake, scrape, bow, spin, whip, or pluck sound from any sonorous object with the exact precision one would expect from digital control.
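One way to administer "the appropriate voltage" is to map a 7-bit control value (e.g. MIDI velocity) to a PWM duty cycle; the supply voltage, minimum threshold, and function below are assumptions for illustration, not the actual PIC firmware:

```python
def velocity_to_duty(velocity, v_supply=12.0, v_min=3.0):
    """Map a 7-bit MIDI velocity to a PWM duty cycle so the actuator sees
    an effective voltage between v_min (lightest audible strike) and
    v_supply (hardest). All constants here are illustrative."""
    if velocity == 0:
        return 0.0  # note off: no drive
    v = v_min + (v_supply - v_min) * velocity / 127
    return round(v / v_supply, 3)

velocity_to_duty(127)  # → 1.0 (full supply voltage)
velocity_to_duty(0)    # → 0.0
```

The floor at `v_min` reflects a practical detail of single-actuator designs: below some voltage a solenoid or motor won't move at all, so the usable dynamic range starts above zero.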
Installations and performances in New York City (Harvestworks, Chelsea Art Museum, Gigantic Art Space, Eyebeam Gallery, Angel Orensanz Foundation, P.S.1 Clocktower Gallery, and The Frying Pan), and worldwide: the 2004 NIME conference in Hamamatsu, Japan; the iMAL festival in Brussels, Belgium; the 2004 Audio Arts Festival in Krakow, Poland; and When The Time Traveller Kills His Grandmother (a survey of current sound art curated by Fritz Welch), London, England, 2005.