Leonardo Foletto
Audio Programmer | Sound Designer | Multimedia Artist


Italian-born, Brooklyn-based Leonardo Foletto offers custom software solutions for audio and multimedia applications, audio post production services, and creative programming consulting.

His areas of work include the development of digital signal processing algorithms, live coding, algorithmic and generative composition, computer-musician interaction, and multimodal approaches to music composition and performance.



Most of my life has been divided between my two greatest passions: art and technology. I began engaging with music as a child, picking up the saxophone at age 7 and moving to drums at 12. Music has always been my best way to convey emotion; in my own work, I aim to recreate the sense of awe I feel when listening to a great piece of music or witnessing an exceptional piece of art. At the same time, I have always loved the sciences and strive to find new ways to turn my passion for technology into art. I found the perfect synthesis of these two sides of my life in Berklee College of Music's Electronic Production & Design major. My work spans audiovisuals, generative and algorithmic composition, hardware hacking, custom software for sound processing and music synthesis, live coding, audio post production, mixing and mastering, and VJ systems.



A selection of personal software projects.



A live coding library written in Haskell, using the Csound UDP server as the audio engine. An ongoing open-source project that started as my thesis for the Bachelor's degree in Electronic Production and Design at Berklee College of Music. For more details, visit the GitHub page and watch the short video presentation showcasing the state of the project at the end of my senior year at Berklee.

I had the honor of presenting a paper about Kairos at the 5th International Csound Conference. You can read it here.
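The transport behind Kairos is simple: a Csound instance started with the `--port` option listens for plain-text UDP messages, and, per Csound's UDP server conventions, a message prefixed with `&` is read as a score event. Kairos' Haskell internals and actual message scheme are not reproduced here; the sketch below is a minimal Python illustration of that idea, and the port, instrument number, and event format are assumptions, not the library's protocol.

```python
import socket

# Hypothetical endpoint: a Csound instance started with `csound --port=10000 ...`
CSOUND_HOST, CSOUND_PORT = "127.0.0.1", 10000

def make_event(freq: float, amp: float, dur: float) -> str:
    """Build a score-event message for a hypothetical `instr 1`.

    The leading '&' tells Csound's UDP server to treat the rest of the
    message as a score line (an assumption based on Csound's UDP docs).
    """
    return f"&i 1 0 {dur} {amp} {freq}"

def send_to_csound(code: str) -> None:
    """Fire-and-forget: send one message over UDP; UDP never waits for a reply."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(code.encode("utf-8"), (CSOUND_HOST, CSOUND_PORT))

event = make_event(freq=440.0, amp=0.3, dur=2.0)
send_to_csound(event)  # silently dropped if no Csound server is listening
```

A live coding client is essentially a scheduler that emits such messages on a musical clock.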



A Max for Live granular delay built with Csound that granularizes audio in real time. Based on the partikkel opcode and inspired by Curtis Roads' "Microsound", this device is capable of an extremely wide range of effects and is well suited to processing any kind of audio signal, from synthesizers to live rappers.
It can be downloaded from my GitHub page.
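The device's Csound/partikkel internals are not shown on this page, but the core mechanism of any granular delay, reading short windowed grains from behind the write position of a delay line and overlap-adding them, can be sketched in plain Python. The function names and parameters below are illustrative, not the device's own.

```python
import math

def hann(n: int) -> list[float]:
    """Hann window of length n, used to shape each grain's envelope."""
    return [0.5 - 0.5 * math.cos(2 * math.pi * i / (n - 1)) for i in range(n)]

def granular_delay(signal: list[float], delay: int,
                   grain_size: int, hop: int) -> list[float]:
    """Toy granular delay: every `hop` samples, take a windowed grain
    read `delay` samples behind the current position and overlap-add it."""
    out = [0.0] * len(signal)
    win = hann(grain_size)
    for start in range(0, len(signal) - grain_size, hop):
        read = start - delay
        if read < 0:
            continue  # delay line not yet filled
        for i in range(grain_size):
            out[start + i] += signal[read + i] * win[i]
    return out
```

Per-grain randomization of pitch, position, and window shape (which partikkel exposes directly) is what turns this basic structure into the wide effect range described above.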

Vowel Synth

This is a vowel synthesizer built for the multimedia performance "A Rose Out Of Concrete" (see multimedia).
This instrument is based on the Csound fof opcode and uses Max/MSP to receive data from sensors and MIDI controllers that modulate the instrument's sound.
Parameters have also been mapped to a ROLI Lightpad Block controller so the patch can be used as a standalone instrument.
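The fof opcode implements FOF (fonction d'onde formantique) synthesis: at every period of the fundamental, one short exponentially decaying sine burst is triggered per formant, and the bursts are overlap-added. As a rough illustration of that idea only; the formant frequencies, amplitudes, and decay rate below are guessed placeholder values, not the ones used in the actual instrument:

```python
import math

# Rough first three formants (Hz, relative amplitude) for an "a"-like vowel;
# illustrative values, not taken from the instrument's tables.
FORMANTS_A = [(650.0, 1.0), (1080.0, 0.5), (2650.0, 0.25)]

def fof_vowel(f0: float = 110.0, dur: float = 0.5, sr: int = 44100,
              formants=FORMANTS_A) -> list[float]:
    """FOF-style sketch: trigger one decaying sine burst per formant at
    every fundamental period and overlap-add the bursts."""
    n = int(dur * sr)
    out = [0.0] * n
    period = int(sr / f0)
    grain_len = period * 2  # each burst overlaps the next onset
    for onset in range(0, n, period):
        for freq, amp in formants:
            for i in range(min(grain_len, n - onset)):
                t = i / sr
                env = math.exp(-t * 200.0)  # guessed bandwidth/decay
                out[onset + i] += amp * env * math.sin(2 * math.pi * freq * t)
    return out
```

In the real instrument, sensor and MIDI data arriving through Max/MSP would modulate values like `f0` and the formant set continuously.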




The first granular engine I ever built; this device is a granular sampler made with Max/MSP.
The patch is available on my GitHub page.

iOS development class projects


A number of simple iOS apps developed as class projects.
A selection of them is available on my GitHub.



Selected works in the field of multimedia art.



A performance in three acts by Joy Lee that explores the duality of emotions, from dark to light. My roles included performing on modular synthesizers and processing the other players' audio in real time through custom-made Max/MSP and Csound software.

The video showcases the first iteration of the project, presented on a 10.2-channel sound system with three video screens for projections.


A multimedia performance produced in collaboration with Nona Hendryx and Will Calhoun. My role was to lead and coordinate a team of students tasked with creating a website interface through which the audience could perform and affect the music during the show. I also generated the sound effects controlled by this web interface, set up the network through which all the performance data was piped and distributed, and coordinated how the teams used this network to share data with each other based on each team's needs.



A multimedia performance piece conceived as an experimental manipulation of the body to produce and control sound, lights, and visuals through sensors applied to dancers. Produced by Nona Hendryx and Hank Shocklee and choreographed by Duane Lee Holland Jr. I was part of the tech team led by Dr. Richard Boulanger, in charge of developing the devices that, reacting to the dancers' gestures, manipulated and transformed sounds, visuals, and lights. Some of the software I developed specifically for this piece can be found in software.


I collaborated on a three-movement multimedia performance realized by A2daC as his final capstone project for the Electronic Production and Design major at Berklee College of Music. My main role was to develop a series of custom Live devices for the artist to use in the performance. The principal device I created is a Max for Live looper that lets the performer record and manipulate music on the fly, shaping the composition in real time.
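The looper's Max patch is not reproduced here, but the underlying idea, a circular buffer that records or overdubs while simultaneously playing back, is simple enough to sketch as a toy Python model (not the device itself):

```python
class Looper:
    """Minimal fixed-length looper: a circular buffer that overdubs
    incoming audio while playing back what is already stored."""

    def __init__(self, length: int):
        self.buffer = [0.0] * length  # loop length in samples
        self.pos = 0
        self.recording = True

    def process(self, sample: float) -> float:
        """Advance one sample: play back the loop, optionally overdub."""
        out = self.buffer[self.pos]
        if self.recording:
            self.buffer[self.pos] += sample  # layer on top of the loop
        self.pos = (self.pos + 1) % len(self.buffer)
        return out
```

A performance version adds per-loop controls (erase, reverse, speed, feedback), which is where most of the real device's complexity lives.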

visuals: Jacob Johnson | dancer: Erica Codd | videographer: Fen Rotstein



A prototype of an audio visualizer application for augmented reality, realized during a class in collaboration with students from M.I.T., Harvard, Boston Conservatory, and Berklee. The goal was to prototype an audio visualizer for augmented reality to be used at music festivals and concerts, creating a personalized experience of the event for every attendee.


I have been working with live audio visuals, primarily using Jitter from Cycling '74, in a wide array of settings and musical styles: from DJs to live bands to jazz trios.





Over the last few years I have become involved in the Eurorack modular synthesizer community and quickly got into DIY, building cases and synthesizer modules and circuit-bending old toys.

The first Eurorack case I've built.


Performing at Berklee's Modular on the spot with Stefano Genova.


Circuit bent Speak & Math. Started as a class project.


Noise box synth built by circuit-bending a 555-based soldering kit. This was a class project for EP-391 Circuit Bending & Physical Computing.


Performing Joy Lee’s composition “The Fog” for the Society of Composers Spring 2019 Concert at Old South Church in Boston.


Post Production

Selected post production projects. My experience includes mastering, mixing, and sound design for linear media.



Sound replacement of a scene from Darren Aronofsky's "Pi", made for a class project.



Sound replacement of the Droid X commercial. This was a class project.


