Using Python to Control Ableton Live with MIDI

10 Jul 2023

python · ableton · mapping


Designing live performances for The Holy Mountain (my band) has always been challenging. During our live shows, we have to manage looping, sample triggering, complex tempo changes, and fast navigation between synthesizer presets. For our upcoming liveset, I have been trying to figure out how to play drums and synthesizers simultaneously: synth with my right hand, drums with my remaining limbs. This problem inspired me to develop a clever way to navigate Ableton Live (Live) using very few midi controls.

In this post, I demonstrate how you can use some Python code to achieve more complex midi mapping capabilities between your midi-controller and digital audio workstation (DAW).

Contents

  1. Overview and Requirements
  2. Python Setup
  3. Live Setup
  4. Concluding Thoughts
  5. Download Link and Source Code

Overview and Requirements

When searching for a solution to my "playing drums and synth at the same time" problem, I naturally wanted to use Ableton Live as my primary DAW software. One of the greatest things about Live is its rich midi capabilities and visual design, enabling control of just about anything from anywhere. However, Ableton alone does not give users much freedom to customize their mapping schemas; the platform is fairly restricted in that particular area. So to achieve what I was looking for, I had to use additional code and software to manipulate and control Ableton from "afar".

The result was a mapping system built with Ableton and some Python software that enabled me to skip up and down scenes (vertical axis) and between audio tracks (horizontal axis) using very few midi controls. Navigating Live's visual interface like a 2D matrix is not uncommon; in fact, most midi-controllers afford this kind of behavior for traversing sample spaces and audio effects. What makes my system unique is that I use Python to associate custom collections of audio tracks with individual scenes in Live. With this mapping approach, it's possible (and easy) to jump between different mapping presets for each scene, dramatically reducing the number of controls needed to navigate fairly complex Live sessions.

Live's session view is an interface designed for live performance. Highlighted in red are the scenes and tracks that together can be perceived as a 2D matrix structure.

Below you'll find important details about the design, code, and general Live specs I used to build this custom navigation system for my liveset. To be able to replicate my system, you only need a few key items:

  • A USB midi keyboard or controller
  • A Python environment with the rtmidi library installed
  • A virtual midi driver
  • Ableton Live 11

For the record, I use the MIDIPLUS X4 as a midi-controller and Tobias Erichsen's loopMIDI as my virtual device. For Mac/OSX users, I recommend setting up an IAC virtual midi bus. No third-party programs required.

Python Setup

The Python program sits in-between the controller and Live, and has two primary jobs: intercepting midi messages from the midi-controller and passing them on to Live, and associating different collections of audio tracks with individual scenes. I start by representing each scene and audio track as a scene_config dictionary in Python. The dictionary key-value pairs represent Live's session view as a 2D matrix structure. Each key represents a scene, and each value represents a collection (array) of tracks (integers).

# An example of a simple scene_config dictionary.
scene_config = {
    1: [0],
    2: [0, 1],
    3: [0, 2],
    4: [0, 1, 4]  # scene nr.4 skips between tracks 0, 1 and 4.
}

To handle the track skipping process, I iterate and loop over the scene_config arrays using Python's yield keyword in response to messages from my midi-controller. A function that contains yield is a generator: a special kind of function that returns a lazy iterator. With these objects, you can step through array items one by one without having to explicitly keep track of the previous or current index. The generator below also handles looping, returning to the start (0th index item) after yielding the last item of an array.

In the example below, I pass a Live scene from scene_config into a function called nextTrack. The function returns a generator object I can use to iterate over the audio tracks using the next() function.

# ...
# Use yield to lazily iterate and loop over arrays with next()
def nextTrack(scene):
    count = 0
    while True:
        yield scene[count]
        if count < (len(scene) - 1):
            count += 1
        else:
            count = 0

current_scene = 4
track = nextTrack(scene_config[current_scene])
print(next(track))
print(next(track))
print(next(track))
print(next(track))
# outputs:
# 0
# 1
# 4
# 0 - back to start

The second job of the Python script is to handle the midi communication between the controller and Live. I have to use two midi devices for this to work: one physical USB controller (the input) and one virtual midi driver (the output going to Ableton Live). Fortunately, it's straightforward to initialize communication between multiple midi devices using the Python rtmidi library. In the example below, a simple midi input initialization process is defined using rtmidi. The program lists the available midi devices and asks which device it should use as its input. Here, I would choose the physical controller as my input device.

import rtmidi

# initialize midi input communication with a simple UI
def midi_INIT():
    print('Available MIDI input ports:')
    midi_in = rtmidi.MidiIn()
    for port, name in enumerate(midi_in.get_ports()):
        print(port, ': ', name)
    input_port = int(input('which port should I get MIDI from?: '))
    midi_in_opened = midi_in.open_port(input_port)
    print('Receiving MIDI from port: ', input_port)
    return midi_in_opened

midi_in = midi_INIT()
# outputs:
# Available MIDI input ports:
# 0 : Physical USB controller
# 1 : loopMIDI Port 1
# which port should I get MIDI from?: USER INPUT...

To initialize a midi output connection, I repeat the above code, only substituting rtmidi.MidiIn() with rtmidi.MidiOut() and using the virtual midi driver as my device.
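For reference, here's a minimal sketch of what that output-side substitution could look like. The helper name midi_OUT_INIT is just illustrative, not from my actual script:

# A minimal sketch of the output-side initialization, mirroring midi_INIT()
# but opening the virtual driver as the destination (illustrative helper name).
def midi_OUT_INIT():
    print('Available MIDI output ports:')
    midi_out = rtmidi.MidiOut()
    for port, name in enumerate(midi_out.get_ports()):
        print(port, ': ', name)
    output_port = int(input('which port should I send MIDI to?: '))
    midi_out_opened = midi_out.open_port(output_port)
    print('Sending MIDI to port: ', output_port)
    return midi_out_opened

midi_out = midi_OUT_INIT()  # pick the virtual midi port (e.g. loopMIDI or the IAC bus) here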

To send messages, initialized rtmidi output objects provide a send_message() method. It takes a list of raw midi bytes: a status byte (the message type, such as note-on, combined with the midi channel), the note number, and the velocity, which here doubles as the ON/OFF value.

# Send a note-on midi message on note 100 to the virtual midi driver
midi_in, midi_out = midi_INIT()        # the full midi_INIT() also opens the virtual output port
midi_channel = 1
note_on = 0x90 + (midi_channel - 1)    # status byte: note-on on the chosen channel
midi_note = 100                        # coming from your controller (+ offset)
velocity = 127                         # 127 for ON, 0 for off
message = [note_on, midi_note, velocity]
midi_out.send_message(message)

Pro tip: If you're unsure about what kind of messages or values your midi-controller is sending out, you can print out raw messages from an rtmidi connection. I use this script to orient myself with new midi devices:

# src/keyboard-test.py
# Prints out the midi messages from your device to the console.
midi_in, midi_out = midi_INIT()
with midi_in:
    prev_message = []
    while True:
        message = midi_in.get_message()  # returns (message, delta_time) or None
        if message and message != prev_message:
            print(message)
            prev_message = message
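For completeness, here's a rough sketch of how the two jobs can come together in a single forwarding loop: read a message from the controller, advance the generator when the "skip track" button is pressed, and pass everything on to Live through the virtual port. The note numbers, offsets, and button assignment below are illustrative assumptions, not the exact logic from my script:

import time

# Rough sketch of a forwarding loop combining both jobs (illustrative values only).
NEXT_TRACK_NOTE = 60      # hypothetical controller button used to skip tracks
NOTE_ON = 0x90            # status byte: note-on, channel 1
TRACK_NOTE_OFFSET = 100   # assumed offset for the notes Live's tracks are mapped to

current_scene = 4
track = nextTrack(scene_config[current_scene])

while True:
    event = midi_in.get_message()            # (message, delta_time) or None
    if event:
        message, _ = event
        if len(message) == 3 and message[1] == NEXT_TRACK_NOTE and message[2] > 0:
            # skip to the next track in the current scene's collection
            midi_out.send_message([NOTE_ON, TRACK_NOTE_OFFSET + next(track), 127])
        else:
            midi_out.send_message(message)   # forward everything else to Live untouched
    time.sleep(0.001)                        # avoid a busy loop

The real script also reacts to scene-change buttons, re-creating the generator for the newly selected scene, but it follows the same pattern.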

Live Setup

In Live, I only need to do three things to enable the proposed 2D navigation. First, ensure that the scene_config dictionary corresponds to the number of scenes and tracks in my Live session. Second, enable the correct midi settings, and third, provide the correct midi mapping.

In response to points one and three, I wrote an additional Python script to guide me through the midi-mapping in Live based on my scene_config. Having a single source of truth is always a good programming strategy. The script calculates and iterates through all the possible midi values that need to be mapped to a parameter in Live. With a simple command line UI, I can select parameters in Live before triggering the Python script to send messages to Live's midi-mapping context, as seen in the image below.

To see the full code, visit the source code section below.

Making a `daw-midi-mapping.py` script is recommended when your Ableton setup is tightly coupled with Python variables.
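To give a rough idea of what such a helper does, here's a minimal sketch. The note layout (one note per scene, track notes offset by 100) and the prompt text are illustrative assumptions, not the exact scheme from daw-midi-mapping.py:

# Minimal sketch of a mapping helper: derive every midi note the system will send
# from scene_config, then fire them one at a time so each can be midi-mapped in Live.
# The note layout below is an assumption for illustration only.
NOTE_ON = 0x90

scene_notes = list(scene_config.keys())                                            # one note per scene
track_notes = sorted({100 + t for tracks in scene_config.values() for t in tracks})

for note in scene_notes + track_notes:
    input(f'Enter midi-map mode in Live, click the target parameter, then press Enter (note {note})...')
    midi_out.send_message([NOTE_ON, note, 127])   # note-on, so Live registers the mapping
    midi_out.send_message([NOTE_ON, note, 0])     # matching note-off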

Finally, after midi-mapping, I have to set the correct midi settings. By default, Live will recognize and enable any compatible midi device it detects for Track, Sync and Remote. However, we don't want Live to listen directly to our midi controller; we only want to intercept messages from Python on the virtual midi port we have initialized. To open the midi preferences in Live, hit "ctrl" + "," and navigate to the midi section. Here, uncheck every device except for incoming messages on your virtual midi port.

Concluding Thoughts

So there you have it. We can easily expand the capabilities of our Live projects with some creative coding and a virtual midi driver. By using the free and open-source Python rtmidi library, bridging Live with Python is practically effortless. From there, I recommend optimizing your state management and asynchronous parameter control with Python Generators and lazy iterators. Together, this is a force to be reckoned with.

On a side note, there are alternatives to writing custom software for midi communication between Live and elsewhere. For instance, AbletonOSC is a MIDI remote script that provides an Open Sound Control (OSC) interface to control Live. Using OSC will eliminate the need for a virtual midi driver and provide more advanced remote control options.

Download Link and Source Code

The full source code of this project is open source and available on my GitHub.

If you want to contribute, you are free to do so. Simply fork the repo and make a pull request.
