ElevenLabs and the new logic of licensed AI music
For the past year, the AI music debate has been staged as a fight over permission.
Did the models train on copyrighted work without consent? Would rightsholders sue? Could a licensing framework be built that made generative music commercially legitimate rather than legally radioactive?
ElevenLabs is betting that the answer is yes. Its Music Marketplace, built on licensing deals with Merlin and Kobalt, is an attempt to do AI music the proper way: opt-in, compensated, guarded, legible. In an industry still dealing with the fallout of lawsuits against Suno and Udio, that alone is enough to make the launch significant.
But it also makes the launch more revealing.
Because once an AI music platform starts to solve the consent problem, the industry runs into the harder question underneath it: what happens when generative music is not just legally cleaner, but actually usable at scale?
That is the real significance of ElevenLabs’ move into music. Not that it has built another text-to-music model, but that it is trying to build a governable one.
Founded in 2022, ElevenLabs became one of the dominant names in AI voice by turning text-to-speech and voice cloning into infrastructure used across media, publishing, entertainment, and enterprise. Its move into music was presented as a natural extension of that broader audio identity, but it is also something more consequential than adjacent product expansion. It is an attempt to apply the company’s existing marketplace logic — already visible in its voice products, where creators can license synthetic voice replicas in exchange for royalties — to music generation itself.
The timeline has been unusually compressed. Eleven Music launched in August 2025. By March 2026, Music Marketplace was live as a commercial layer on top of the generation engine. This is not a speculative side project. It is capital flowing toward the construction of a licensed AI media stack.
The obvious comparison points are Suno and Udio, whose rise helped define AI music as a copyright confrontation. ElevenLabs is trying to arrive at a similar destination through a different route. Where others launched first and litigated later, it has licensed first and wrapped the model in contractual architecture.
That architecture is the story.
The most important deals are with Merlin and Kobalt. Merlin, which represents 30,000 independent labels and distributors, gives participating members an opt-in pathway to allow their works to be used in AI training in exchange for royalties. Kobalt brings publishing into the structure. Together, the two agreements create something the AI music sector has mostly lacked: a framework that treats music generation not simply as an extraction event, but as a rights market.
That does not mean the framework is comprehensive. It is not. But it does mean ElevenLabs has moved the conversation.
The Kobalt deal in particular contains two provisions that matter well beyond this product.
The first is parity. Revenue from Eleven Music is split 50/50 between compositions and recordings used to train the model. That may sound technical, but it cuts directly against one of the defining economic distortions of the streaming era, where publishers and songwriters have routinely ended up with worse terms than labels. AI licensing could easily have reproduced that imbalance. Instead, Kobalt appears to have forced a different principle into the centre of the deal: that the song itself is not a secondary input into the model, but a co-equal source of value.
The second is the most-favoured-nation clause. If any recorded-music rightsholder later negotiates better terms than Kobalt, Kobalt’s deal automatically upgrades to match them. That is not just a defensive measure. It is a pressure point. It means any future dealmaking with larger rightsholders happens under the shadow of publishing parity.
And that is where this gets interesting.
The music business has tended to treat AI as a threat when it arrived without permission and as an opportunity the moment it became licensable. Litigation and licensing have increasingly become part of the same negotiation cycle.
ElevenLabs, by contrast, does not yet have the majors inside its system. That absence is both a limitation and a signal.
It is a limitation because, without major-label catalogues, the training base is narrower and the product remains less powerful for mainstream commercial use cases. Users can prompt by style rather than artist name, but the absence of the biggest repertoires still matters. If you want to become the default infrastructure for commercially useful AI music, rights breadth matters.
But the majors’ absence is also a signal because it exposes the tension inside ElevenLabs’ approach. The company may have built a cleaner and more principled licensing architecture than some rivals, but precisely because of clauses like parity and MFN, it may also be offering recorded-music rightsholders less room to dominate the economics. A system designed to be more balanced may prove less attractive to the biggest players used to extracting imbalance.
That is why the legal structure matters so much. It does not just reduce risk. It redistributes leverage.
For independent labels, this is where the story is easiest to read as good news. Merlin members get a new revenue stream, opt-in participation, and a framework built around formal consent rather than retroactive damage control. For a sector that has spent the past two years watching AI companies build on copyrighted culture while insisting that licensing could come later, that is not nothing. It matters that one of the first scaled attempts to commercialise generative music through licensing has given independents a way in.
But good news at the level of rightsholders is not the same thing as safety at the level of working musicians.
That is the contradiction that keeps surfacing across the AI music economy. A platform can become more ethical in the narrow sense of permission and compensation while still intensifying the broader conditions that make creative work more precarious.
The most obvious pressure point is saturation. Even if ElevenLabs is more carefully licensed than its rivals, it still contributes to a market in which music becomes dramatically easier and cheaper to produce. For independent artists already competing inside an oversupplied attention economy, a flood of generative music does not need to be artist-specific to be destructive. It only needs to be abundant.
The first part of the market likely to feel that pressure most acutely is production music: stock audio, background cues, commercial-use tracks, functional music made to specification rather than fandom. This is exactly the terrain ElevenLabs is targeting. Brand-safe background music, game soundtracks, social content, advertising — these are not marginal use cases. They are the practical centre of the music licensing economy. They are also the categories where speed, price, and flexibility often matter more than authorship.
That exposure is not abstract. An economic study by CISAC (the International Confederation of Societies of Authors and Composers) projected that, under current conditions, 24% of music creators’ revenues could be at risk by 2028 as generative AI scales into streaming and music-library markets.
In other words, the most commercially plausible use cases for properly licensed AI music map almost perfectly onto the sectors where human composers have historically made a living by being reliable, adaptable, and fast.
If the benchmark is whether rightsholders consented, ElevenLabs is making a far more serious effort than much of the field. If the benchmark is whether songwriters get treated as equal contributors to model value, the Kobalt structure may even represent progress relative to streaming. But if the benchmark is whether musical labour as such is being protected, the picture is much less reassuring.
Session musicians are the clearest example. They often work on a work-for-hire basis and do not own copyright in the performances they create. That means even when labels and publishers negotiate compensation frameworks for AI training, a large class of working musicians remains outside the protection perimeter. Their labour may help make the recorded ecosystem valuable, while the rights structure routes payment elsewhere. The platform can be fully licensed and still leave them exposed.
For all the focus on whether AI companies are stealing from artists, the deeper shift may be that they are learning how to pay the people with legal rights while still destabilising the people whose labour sits underneath those rights. Once that happens, the argument changes. It is no longer mainly about infringement. It becomes a question of substitution, bargaining power, and which kinds of music work remain economically defensible once generated supply becomes cheap, licensable, and endless.
Even the copyright issue around AI-generated outputs points in that direction. Value accrues around the system, but not necessarily around a creator in the conventional sense.
That is why ElevenLabs matters: it forces a more mature version of the argument.
It is easy to denounce AI music when the companies involved appear cavalier about copyright. It is harder when one of them starts building the system critics said they wanted: licensed inputs, royalties, guardrails, restrictions on outputs, commercial clarity. Once that exists, the industry loses the comfort of treating ethics as a simple matter of consent.
Consent matters. Compensation matters. A licensed model is better than an unlicensed one. But a licensed model does not solve the economic shock of generative media. In some respects, it may accelerate it by removing the legal friction that kept mainstream commercial adoption slower and messier.
That is the hinge point here. ElevenLabs may be building the most legitimate version of AI music so far. It may even help establish precedents that songwriters and publishers should want to defend, especially if publishing parity holds. But legitimacy is not the same as protection.
Licensing may settle the copyright argument without settling the labour argument.
And once AI music can be licensed, monetised, and folded into the ordinary rights economy, the real question is no longer whether it belongs inside the market. The question is what that market now rewards, what it makes disposable, and whether “responsible AI” is really a safeguard or just a smoother path to the same underlying displacement.
ElevenLabs is the clearest sign yet that the argument is moving on, not away from copyright, exactly, but past the point where copyright alone can explain what is at stake.
-Maureen
Image created using OpenAI