Mike Slinn

Controlling Ableton Live From LLMs

Published 2025-09-08.
Time to read: 3 minutes.

This page is part of the av_studio collection.

Large Language Models (LLMs) can be integrated with Ableton Live for tasks like controlling the DAW with natural language, generating MIDI patterns, or automating production workflows, often using technologies like the Model Context Protocol (MCP), AbletonOSC, and sometimes Max for Live.

Let the LLM do the dirty work

Creative people need to find effective ways to collaborate with LLMs. LLMs support natural-language interaction with Ableton users as they generate and edit creative works in Ableton Live.

Overview

  1. An MCP server is set up that uses AbletonOSC to communicate with Ableton Live. AbletonOSC is a MIDI remote script that exposes Live's API over Open Sound Control (OSC), an open protocol for control; Live treats it like a virtual MIDI controller.
  2. An LLM like Claude or ChatGPT is used as the MCP client.
  3. The user provides natural language commands to the LLM (e.g., "create a new MIDI track" or "add more energy to that bassline").
  4. The LLM interprets the prompt and invokes the matching MCP tools; the MCP server translates those calls into OSC commands sent to AbletonOSC inside Ableton Live.
  5. Ableton Live receives the commands and performs the requested action, such as creating a track, generating MIDI, or adjusting parameters.
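Under the hood, the OSC commands in steps 4 and 5 are small UDP packets. The sketch below is a minimal, dependency-free illustration of that wire format; the `/live/...` addresses come from the AbletonOSC README and may differ in your installed version, and 11000 is AbletonOSC's default listening port.

```python
import socket
import struct

def osc_message(address: str, *args) -> bytes:
    """Encode a minimal OSC message supporting int, float, and str arguments."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated, then padded to a 4-byte boundary
        return b + b"\x00" * (4 - len(b) % 4)

    tags, payload = ",", b""
    for a in args:
        if isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)
        else:
            tags += "s"
            payload += pad(str(a).encode())
    return pad(address.encode()) + pad(tags.encode()) + payload

def send(address: str, *args, host: str = "127.0.0.1", port: int = 11000) -> None:
    """Fire-and-forget UDP send to AbletonOSC's default listening port."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(osc_message(address, *args), (host, port))

# Addresses are from the AbletonOSC README; -1 appends the track at the end.
send("/live/song/create_midi_track", -1)
send("/live/song/set/tempo", 110.0)
```

In practice an MCP server wraps calls like these in tools the LLM can invoke, and typically uses the python-osc package rather than hand-rolling the encoding.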

Examples and Applications

  • Generative MIDI: LLMs can generate melodies, chords, and rhythms based on custom constraints provided in text prompts.
  • Workflow Automation: Offload repetitive tasks, like changing instrument routings or adjusting BPMs, using simple natural language commands.
  • Creative Collaboration: Use the LLM as a virtual collaborator, giving general creative directions to generate variations or ideas.
  • Building Sets: Ask the LLM to construct a basic setup for a chamber piece, including tracks and scenes.

Ancient History

This technology is evolving rapidly and changes month to month. In 2024, we were amazed by what AI assistants could do. In 2025, we moved on to MCP applications.

This is an excellent overview video of the previous level of integration (pre-MCP) between Ableton Live and an LLM:

Assistants

These ‘old’ assistants do not use MCP; they run from a command line and do not directly interact with Ableton Live.

ableton-live-assistant

ableton-live-assistant controls Ableton Live 11+ with GPT-4 using Node.js.

Ableton Live Ultimate Assistant

Its promotional description: “The most powerful and trained Ableton Live Assistant, designed for all software versions. Our model is finely tuned for top-notch guidance and troubleshooting, providing an interactive and user-centric experience. Now includes updates and tool recommendations.”

Modern MCP Servers for Ableton Live

The Model Context Protocol (MCP) is an open standard, open-source framework introduced by Anthropic in November 2024 to standardize the way artificial intelligence (AI) systems like large language models (LLMs) integrate and share data with external tools, systems, and data sources. Within six months, MCP had become commonly used technology.

MCP servers improve the integration between Live and LLMs, so running commands in a terminal next to Live is no longer required. Instead, users interact through a chat dialog next to Live, or with voice commands.

Install these MCP servers in any MCP Host.
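For Claude Desktop, installation means adding an entry to claude_desktop_config.json; other MCP hosts use similar JSON stanzas. The entry below is a hypothetical sketch for the Ableton MCP project — the command and package name differ per server, so copy the exact entry from each project's README:

```json
{
  "mcpServers": {
    "AbletonMCP": {
      "command": "uvx",
      "args": ["ableton-mcp"]
    }
  }
}
```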

Below are all known MCP (Model Context Protocol) servers for Live, listed in order of popularity. These servers facilitate communication between Large Language Models (LLMs) or AI assistants and Live, typically using OSC (Open Sound Control) or socket-based transports for music production automation and control.

Ableton MCP
Description: Enables AI assistants to control Ableton Live via a Remote Script for two-way communication, enabling music production tasks like MIDI manipulation and session control. This GitHub project has 1,900 stars and 218 forks. It seems to work only with Claude Desktop, and it uses Docker for no reason.
Key features:
  • Track/clip creation
  • MIDI editing
  • Playback control
  • Instrument loading
  • Library browsing
Requirements:
  • Live 10+

Ableton Live MCP Server
Description: Implements MCP to connect LLMs with Ableton Live, using AbletonOSC to send and receive messages. This GitHub project has 317 stars and 44 forks; however, it is not packaged, so users must clone the git repository and build the project. The documentation makes it seem that Claude is required. 75% of the issues have no response.
Requirements:
  • Python 3.8+

Ableton MCP Extended
Description: A robust MCP server for natural-language control of Ableton Live via AI assistants like Claude or Cursor. This GitHub project has 53 stars and 6 forks, but it is only 4 months old. Its support seems better than that of the other servers mentioned in this table.
Key features:
  • Session/transport control
  • Track management
  • MIDI clip/note manipulation
  • Device/parameter control
Requirements:
  • Python 3.10+
  • Live 11+

Ableton Vibe
Description: Not well documented; seems to require Claude. 6 stars, 2 forks. Seems like an amateur hack that will be quickly forgotten.
Key features:
  • Task automation
Requirements:
  • Python 3.8+

Most servers require Python 3.8 or higher, the uv package manager, and dependencies such as python-osc, AbletonOSC, and fastmcp. Live versions 10 or 11 are common prerequisites; I have not seen any mention of Live 12 yet with respect to these servers, even though Live 12 was released 17 months ago.
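Replies from AbletonOSC come back as OSC packets too (on UDP port 11001 by default), which the python-osc dependency decodes for these servers. As a rough, stdlib-only sketch of what that decoding involves — `decode_osc` is a hypothetical helper, not any project's API, and it handles int, float, and string arguments only:

```python
import struct

def decode_osc(data: bytes):
    """Decode a simple OSC message into an (address, args) pair."""
    def read_padded_string(offset: int):
        end = data.index(b"\x00", offset)
        # Strings are null-terminated, then padded to the next 4-byte boundary
        return data[offset:end].decode(), (end + 4) & ~3

    address, offset = read_padded_string(0)
    tags, offset = read_padded_string(offset)  # type-tag string, e.g. ",if"
    args = []
    for tag in tags.lstrip(","):
        if tag == "i":
            args.append(struct.unpack_from(">i", data, offset)[0])
            offset += 4
        elif tag == "f":
            args.append(struct.unpack_from(">f", data, offset)[0])
            offset += 4
        elif tag == "s":
            value, offset = read_padded_string(offset)
            args.append(value)
    return address, args
```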
