A year ago, I had the pleasure of working with the great jazz pianist Rachel Z in my studio in Gowanus, Brooklyn. I showed her my live set-up and complained that, as a bass player, I really didn’t like to perform electronic music while standing up behind a table. She suggested that I should take the Ableton Push and mount it on my body and play it like a bass. We chatted about it and I realized it was a really cool idea.
Soon afterwards, I spent several weeks touring in Costa Rica and Colombia and, in my spare time, I drew pictures of this Push-guitar. By the time I got back to New York, I knew exactly how I would put it together. First, I took a bass apart. Then I went to Home Depot and bought $18 worth of supplies – some clamps and some Velcro – and “Blinky” was born.
Blinky consists of an Ableton Push 2, a Korg nanoKONTROL 2, a Numark Orbit controller and a Keith McMillen Instruments 12 Step MIDI foot controller. I use the Push 2 for making the music – playing it like a keyboard. The nanoKONTROL 2 handles effects – filters and such – plus mutes, sound changes and loop recording. The Orbit filters the lead sounds and launches beats and clips, and the KMI pedal also launches loops when my hands are busy.
The programming in Ableton Live 10 is done using a lot of ClyphX Pro. On July 9th, I’ll drop another video with a more detailed explanation of the live set.
I think that we live in a really exciting time when laptops offer us unlimited sound possibilities, but the physical interfaces that we use to perform with laptops are still being developed. Blinky is just my attempt to create a digital performance interface that works for me as a bass player. I’m hoping that I can take it further than just a janky and slightly gimmicky mash-up of some controllers and prove to myself that I can create a musical experience with it that’s fun for me and, hopefully, for the audience as well. I also hope that it may inspire people to create their own music-making interfaces that make sense for their own musical performances.
This video was recorded live on June 14, 2019 at King Killer Studios in Gowanus, Brooklyn, NYC. It’s completely built from loops created on the fly, with the exception of an ‘Amen break’ beat that I triggered at certain points in the song. The song is called ‘Prelude 4’ and it will be the first track of my upcoming record “Red Hook Sun” which will drop at the end of the summer. “Red Hook Sun” will have 5 songs composed on Blinky.
A big thanks to everyone who helped with this video: Alejandro Vega on drums and for the mix of the live version of the song ‘Prelude 4’ that we did here, Ian Elkind at King Killer Studios for engineering the session, Edgardo Parada of Shake Up Productions for shooting and editing the video and Bob Power for the mastering of the audio.
Hope you enjoy. Amor y Paz!
This past year, I had the pleasure of collaborating on a dance remake of the classic track “Be My Baby” with DJ E.M. and the amazing singer Nina Blue. All vocals and sound design were done in my studio, the Play Room in Gowanus, BK. Enjoy!
Happy to announce the release of a short E.P. of disco/house tracks that I’ve produced in the studio over the past several years. To read more about the background of the record and the great folks involved, click here. You can also check it out on Spotify here.
On Sunday, September 11, I’ll be giving a presentation on Sampling with Ableton Live at Brooklyn’s ShapeShifter Lab.
On August 26, 2016 I had the pleasure of adding generative electronics to the music of master jazz players: bassists Matthew Garrison and Massimo Biolcati, and drummer Nate Smith. The concept was to create music using only beats and sounds generated from Fodera basses (we were celebrating the release of Fodera’s new Imperial Mini MG bass). I started with an empty Ableton session. I had an audio feed where I was sampling Matt’s bass and creating sequences and loops with the sounds, using the Push’s loop-pedal function and ClyphX. The sequences were then chopped, effected, randomized and performed by the laptop – a kind of mechanical improvisation – while the humans used it as the basis for more improvisation. We used Ableton Link to sync my laptop with Matt’s. Throughout the fall, I’ll be developing these techniques further at ShapeShifter Lab here in Brooklyn and hopefully performing more with some of these fantastic improvisors. ShapeShifter Lab is an amazing place to explore new techniques and sounds – one of NYC’s most creative musical spaces.
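For readers curious what “chopped and randomized” means in practice, here is a toy sketch of the idea – not the actual Ableton/ClyphX setup, just an illustration (the function names are mine) of how a sampled loop can be sliced into chunks and mechanically reordered by the machine:

```python
import random

def chop(samples, n_slices):
    """Split a list of audio samples into n_slices equal chunks."""
    size = len(samples) // n_slices
    return [samples[i * size:(i + 1) * size] for i in range(n_slices)]

def mechanical_improv(samples, n_slices=8, seed=None):
    """Return a new loop built from randomly reordered slices –
    the 'mechanical improvisation' idea in miniature."""
    rng = random.Random(seed)
    slices = chop(samples, n_slices)
    rng.shuffle(slices)
    # Flatten the shuffled slices back into one loop
    return [s for sl in slices for s in sl]

# Example: a fake 16-sample "loop" stands in for sampled bass audio
loop = list(range(16))
print(mechanical_improv(loop, n_slices=4, seed=1))
```

In the real set the same move happens on audio buffers in Ableton, with effects applied to the slices before the laptop plays them back for the humans to react to.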
I’ll be doing a live set on Saturday, July 30 in San Jose, Costa Rica as part of the two Ableton courses I’m teaching down there. At the Feria Verde, San Jose, CR.
It’s been a while since I’ve posted here on this page. In the past year, most of my postings have been as part of the Brooklyn Digital Conservatory which I founded a year ago. I wanted to share my latest project which I’m beginning to perform live with.
Generative music is a term coined by Brian Eno in the mid-1990s; it refers to music generated by an algorithm in a system. The artist then grabs what the machine generates and does their best to make art from it. I find the concept fascinating because I see it as a metaphor for life and improvisation. Time and chance throw events and people at you, and each day you essentially figure out ways to take what you’re given and make the best of it. Sometimes it’s beautiful, many times it’s not, but it is always truly original.
The instrument I’ve come up with (pictured above) I call a digital electronium in honor of one of the great musical geniuses of the 20th century, Raymond Scott. Scott’s electronium was an analog machine designed to generate patterns that the artist could capture and alter, creating a kind of duet between the composer and the music. This is my goal for this live set – to compose electronic music so rapidly that to any listener it sounds like a live performance, and yet every single sound is generated either by random algorithms in the machine, by synth parts that I play live and loop, or from external audio (my own voice or that of other musicians) that I sample into the computer.
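To make the pattern-generation idea concrete, here is a purely illustrative sketch (my own toy example, not the actual electronium patch): the machine random-walks over a minor-pentatonic scale and emits MIDI note numbers, and the artist’s job is to capture and shape whatever comes out.

```python
import random

PENTATONIC = [0, 3, 5, 7, 10]  # minor-pentatonic intervals in semitones

def generate_pattern(length=8, root=48, seed=None):
    """Random-walk over the scale, returning MIDI note numbers."""
    rng = random.Random(seed)
    degree = 0
    notes = []
    for _ in range(length):
        # Step up, down, or stay, clamped to the scale
        degree = max(0, min(len(PENTATONIC) - 1,
                            degree + rng.choice([-1, 0, 1])))
        octave = rng.choice([0, 12])  # occasional octave jump
        notes.append(root + PENTATONIC[degree] + octave)
    return notes

print(generate_pattern(length=8, root=48, seed=7))
```

A pattern like this, routed to a synth, is the kind of raw material the duet starts from: the machine proposes, the composer disposes.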
The rules I made in creating this set are the following:
1. Absolutely no pre-recorded audio can be used at all.
2. I must begin with an empty Ableton set with no clips containing either audio or MIDI.
3. All sounds can only come from the following: (1) sounds generated by random algorithms on computer-based synthesizers; (2) sequences on computer-based synthesizers that I play live into the computer; (3) external audio, including vocals, electric bass, any other live instrument, a transistor radio, etc.
I have two purposes in creating this set.
First, to push the boundaries of live electronic music performance and move away from pre-recorded audio toward sound created for the place and time of the performance. I say this without any disrespect for DJs; in fact, I would like my set to sound at times like a great DJ set. DJs are some of my biggest inspirations and theirs is an incredible art, but I am and have always been a live musician and have been making music on instruments since the age of 4.
Second, to learn to produce and make music so rapidly with Ableton that I can compose in real time. I’ve always wanted Ableton to become a true instrument – just like my bass – one that I can use to rapidly create music on my own and with collaborators.
I will be doing a series of performances at Brooklyn’s ShapeShifter Lab this summer and fall and hopefully getting ninja on this new instrument.
I’ll be doing a live set tomorrow night at 9:00 with Comandante Zero at the MediaLab Dance Party in downtown NYC. Live electro-funk and some PUSH.