Most of my life has been divided between my two greatest passions: art and technology. I began engaging with music as a child, picking up the saxophone at age 7 and moving to drums at 12. Music has always been, for me, the best way to convey emotion, and in my own work I aim to recreate the sense of awe I feel when listening to a great piece of music or standing before an exceptional work of art. At the same time, I have always loved the sciences, and I constantly look for new ways to put my passion for technology at the service of art. I found the perfect synthesis of these two sides of my life in Berklee College of Music's Electronic Production & Design major. My work ranges across audiovisuals, generative and algorithmic composition, hardware hacking, custom software for sound processing and synthesis, live coding, audio post-production, mixing and mastering, and VJ systems.
A selection of personal software projects.
A live coding library written in Haskell that uses Csound's UDP server as its audio engine. This ongoing open-source project started as my thesis for the Bachelor's degree in Electronic Production and Design at Berklee College of Music. For more details, visit the GitHub page and watch the short video presentation showcasing the state of the project at the end of my senior year.
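The underlying transport is simple: Csound, started with its --port flag, listens for plain-text messages over UDP, and the live coding environment sends it events as they are evaluated. A minimal Python sketch of that idea (the library itself is in Haskell; the port number and the "&" score-event prefix are assumptions to check against your Csound version):

```python
import socket

def score_event(instr, start, dur, *pargs):
    """Format a Csound score event as a UDP message string."""
    fields = [f"i{instr}", str(start), str(dur)] + [str(p) for p in pargs]
    # Recent Csound UDP servers treat messages prefixed with '&' as score events.
    return "&" + " ".join(fields)

def send_event(message, host="127.0.0.1", port=7777):
    """Send one event to a Csound instance started with e.g. --port=7777."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(message.encode("ascii"), (host, port))
    sock.close()

# Example: instrument 1, starting now, 2 beats long, one p-field (440 Hz)
# send_event(score_event(1, 0, 2, 440))
```

In the real library, a scheduler in Haskell produces these messages on a musical clock rather than one at a time.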
A Max for Live granular delay built with Csound that granularizes audio in real time. Based on the partikkel opcode and inspired by Curtis Roads' "Microsound", this device is capable of an extremely wide range of effects and suits virtually any audio signal, from synthesizers to live rappers.
It can be downloaded from my GitHub page.
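At its core, the granular approach partikkel implements means slicing incoming audio into short, windowed grains and overlapping them at new positions and rates. A toy pure-Python sketch of that principle on a list of samples (grain size and hop values are illustrative, not the device's actual parameters):

```python
import math

def granulate(samples, grain_size=64, in_hop=32, out_hop=48):
    """Overlap-add windowed grains, re-spacing them to alter the texture."""
    out = [0.0] * (len(samples) * 2)          # generous output buffer
    # A Hann window smooths each grain's edges to avoid clicks.
    window = [0.5 - 0.5 * math.cos(2 * math.pi * n / (grain_size - 1))
              for n in range(grain_size)]
    pos_in, pos_out = 0, 0
    while pos_in + grain_size <= len(samples):
        for n in range(grain_size):
            out[pos_out + n] += samples[pos_in + n] * window[n]
        pos_in += in_hop                      # read pointer through the input
        pos_out += out_hop                    # larger output hop = time-stretch
    return out[:pos_out + grain_size]
```

partikkel adds per-grain control of pitch, shape, masking, and more on top of this basic scheme.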
This is a vowel synthesizer built for the multimedia performance "A Rose Out Of Concrete" (see multimedia).
The instrument is based on Csound's fof opcode and uses Max/MSP to receive data from sensors and MIDI controllers that modulate its sound.
Parameters have also been mapped to a ROLI Lightpad Block controller, so the patch can be played as a standalone instrument.
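The control flow behind a patch like this is straightforward: a sensor or MIDI value is normalized to 0..1 and interpolated between formant presets, which then drive the fof generators' frequencies. A hedged Python sketch of that mapping (the formant values below are common textbook approximations for two sung vowels, not the patch's actual data):

```python
# Approximate first three formant frequencies (Hz) for two vowels.
# Textbook values for illustration only, not the instrument's settings.
VOWEL_A = [800.0, 1150.0, 2900.0]
VOWEL_I = [270.0, 2140.0, 2950.0]

def blend_formants(sensor, v_from=VOWEL_A, v_to=VOWEL_I):
    """Map a normalized sensor value (0..1) to interpolated formant frequencies."""
    sensor = min(max(sensor, 0.0), 1.0)   # clamp noisy sensor readings
    return [a + (b - a) * sensor for a, b in zip(v_from, v_to)]
```

Each returned frequency would feed one fof generator, so sweeping the sensor morphs the voice smoothly between vowels.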
iOS development class projects
A number of simple iOS apps developed as class projects.
A selection of them is available on my GitHub.
Selected works in the field of multimedia art.
A performance in three acts by Joy Lee that explores the duality of emotions, from dark to light. My roles included performing on modular synthesizers and processing the other players' audio in real time through custom-made Max/MSP and Csound software.
The video showcases the first iteration of the project, presented on a 10.2-channel sound system with three screens for video projections.
A multimedia performance produced in collaboration with Nona Hendryx and Will Calhoun. I led and coordinated a team of students whose objectives were to create a website interface through which the audience could perform and affect the music during the show; to generate the sound effects controlled by that interface; to set up the network through which all the performance data was piped and distributed; and to define how the teams would share data over that network based on each team's needs.
A ROSE OUT OF CONCRETE
A multimedia performance piece conceived as an experimental manipulation of the body to produce and control sound, lights, and visuals through sensors applied to the dancers. Produced by Nona Hendryx and Hank Shocklee and choreographed by Duane Lee Holland Jr. I was part of the tech team led by Dr. Richard Boulanger, in charge of developing the devices that, reacting to the dancers' gestures, manipulated and transformed sounds, visuals, and lights. Some of the software I developed specifically for this piece can be found in the software section.
I collaborated on a three-movement multimedia performance realized by A2daC as his final capstone project for the Electronic Production and Design major at Berklee College of Music. My main role was to develop a series of custom Live devices for the artist to use in the performance. The principal one is a Max for Live looper that lets the performer record and manipulate music on the fly, shaping the composition in real time.
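A looper of this kind boils down to a circular buffer: a first pass defines the loop length, and subsequent passes sum live input into the stored audio. A toy per-sample sketch of that logic in Python (the actual device is a Max for Live patch; this is a conceptual illustration, not its implementation):

```python
class Looper:
    """Toy overdub looper: record a phrase, then cycle it, layering new input."""
    def __init__(self):
        self.loop = []
        self.recording = True
        self.head = 0

    def stop_recording(self):
        """Close the loop; its current length becomes the cycle length."""
        self.recording = False
        self.head = 0

    def process(self, sample):
        """Feed one input sample per audio frame; return the output sample."""
        if self.recording:
            self.loop.append(sample)          # first pass defines the loop
            return sample                     # monitor the input while recording
        out = self.loop[self.head] + sample   # overdub live input onto playback
        self.loop[self.head] = out            # write the mix back into the loop
        self.head = (self.head + 1) % len(self.loop)
        return out
```

A real device would add crossfades at the loop point and per-layer level control, but the record/overdub cycle is the same.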
A prototype audio visualizer application for augmented reality, realized during a class in collaboration with students from M.I.T., Harvard, Boston Conservatory, and Berklee. The goal was to build a visualizer for music festivals and concerts that would create a personalized experience of the event for each attendee.
LIVE VISUAL ART FOR MUSIC EVENTS
I have been working with live audiovisuals, primarily using Jitter from Cycling '74, in a wide array of settings and musical styles: from DJs to live bands to jazz trios.
Over the last few years I have become involved in the Eurorack modular synthesizer community and quickly got into DIY: building cases and synthesizer modules and circuit-bending old toys.
Selected post-production projects. My experience includes mastering, mixing, and sound design for linear media.
"PI" SOUND REPLACEMENT
Sound replacement of a scene from Darren Aronofsky's "PI" made for a class project.
DROIDX COMMERCIAL SOUND REPLACEMENT
Sound replacement of the DroidX commercial. This was a class project.