Why Can't I Listen to the Duruflé Requiem in the Car?

One of the side effects of my teaching position is that I spend a fair amount of time in the car. It gives me time to think and to react to the media I consume. I've also been working on implementing more robust recording archives for my program and figuring out some of the metadata questions that come with that. Recently, instead of thinking about recording, I was in the mood to listen to some Duruflé, particularly his Requiem and the whole "Quatre Motets..." This should have been a simple task: pull up Amazon Music and ask.

Well, beyond speech-to-text having no idea who Duruflé was, it decided "Ubi caritas" was "duraface covy caritas" (don't get me started on the larger work, apparently called "duraflake California motet"). When I finally stopped and manually typed "Duruflé Requiem" into Amazon Music, it found a couple of versions that were fine, but when my listening was interrupted, it resumed with a completely different piece.

I couldn't find what I wanted, even though the tracks obviously exist. When I shifted my thinking from consumer to engineer, I realized that the barrier to getting my clients' music out is less about distribution and more about discovery. To understand the friction in our modern listening experience, we have to look back at the industrial and technical foundations that brought us here.

The introduction of the 12-inch LP in 1948 allowed the album to become the industrial center of the music business. Everything from the cost of studio time to the scale of manufacturing and marketing made anything less than 40 minutes of product financially inefficient. It's important to remember that classical music still held a significant market share, and popular music was generally backed by big bands or orchestras; every recording involved 30 to 60 highly skilled musicians at a significant cost in labor. And a single three-minute song simply could not justify the shelf space a retail outlet had to give it.

Multiple songs on an album also made the album more administratively efficient than a single song; it was the central filing cabinet for the legal and data life of the music. Labels used the album as a clearinghouse to manage licenses and royalties for an entire collection at once. The customer might justify the expense of an album for a single hit, but the "deep cuts" would still be paid for. Relating today's "metadata" to the traditional album model, the liner notes were the immutable database: credits for composers, performers, and engineers were physically printed on the product, so the context stayed with the audio because it was physically glued to the widget. Album catalog numbers, and eventually UPC codes, allowed all of the individual songs to be tracked as a single asset through the global economy.

When the 45 RPM record was introduced on the heels of the LP in 1949, it developed a symbiotic relationship with the LP that lasted nearly five decades: the LP was the archival home for "The Work," while the 45 lived in the world of broadcast and discovery. In this ecosystem, the 45 functioned as a loss leader, a promotional probe sent out to test the market. By the time a single was selected, the money for the recording sessions had already been spent; even so, there was a significant disparity in production costs between the two formats.

An LP was a heavy, capital-intensive object. Beyond the raw weight of the vinyl, the cost of the cardstock, high-quality color printing for the jacket, and the labor of manual assembly created a high barrier to entry for a full-scale manufacturing run. If a label pressed 50,000 LPs that failed to sell, they were left with an investment in a warehouse of expensive assets with no viable return.

In contrast, the 45 was a lean industrial tool. Its light weight reduced shipping costs, and because it was directed toward broadcasters and jukebox operators rather than retail display, "shelf appeal" was irrelevant. The format dispensed with the expensive 12-inch album cover and the album credits that today we might call metadata; it often shipped in a generic paper sleeve. The lower physical cost per unit allowed labels to test the market and then promote the full album. This manufacturing efficiency was offset by a high administrative burden, but labels accepted the labor of managing a separate lifecycle of contracts, licensing, and marketing for each single in order to drive interest back to the album. The profitability of a successful album justified the initial recording investment, while the single minimized the risk of a premature, full-scale manufacturing run.

This model remained prevalent through the rise of Rock and Roll and well into the 1980s. Larger-scale works were common, and many pop artists used the classical multi-movement structure as a model for deeply constructed albums of their own. Groups like The Beatles, Pink Floyd, and Yes used the album as a cohesive canvas, treating the collection as a single artistic statement rather than a group of singles. For decades, the industry infrastructure protected this; because the listener had no way to "extract" a song from its physical container, the "Work" remained an indivisible unit of commerce.

A progression of consumer technology in the 1970s challenged the monolithic nature of the record production industry. For decades, labels held a physical monopoly on the "Administrative Anchor"; if you wanted the music, you had to accept the 12-inch gatefold or the 7-inch sleeve that came with it. The introduction of high-fidelity home cassette recorders in the mid-70s provided the first crack in that monopoly.

The 1979 release of the Sony Walkman capitalized on the home cassette recorder and took the audio out of the living room. It gave the consumer the power to decide where and how the music was heard. The culmination of this technological revolution was the rise of the all-important mixtape. For the first time, the listener acted as the primary curator, pulling movements or songs out of their original context and re-indexing them into a new, custom sequence. This freedom, however, came at a high data cost. When a listener recorded a track onto a cassette, the liner notes—the credits, the composer, the movement numbers—were left behind. The mixtape turned the "Work" into a collection of free-floating audio files, and the listener became a database administrator who frequently didn’t bother with (or didn’t care about) the album’s metadata.

The industrial logic of the "Single" was officially codified into technology in 1986 with the finalization of the ISRC (International Standard Recording Code). Originally intended to automate royalty tracking, the ISRC gave every recorded track a unique serial number. This turned the "mixtape" logic into industrial law: for a computer to "see" a piece of music, it had to be an independent, addressable unit.
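You can see how atomized this identifier is from its structure alone: twelve characters encoding country, registrant, year, and a serial designation, with no field anywhere for the parent work. Here is a minimal sketch of a parser (my own illustration, not any industry tool), using the commonly cited example code "US-RC1-76-07839":

```python
import re

# An ISRC is 12 characters: a 2-letter country code, a 3-character
# registrant code, a 2-digit year of reference, and a 5-digit designation.
# Note what it does NOT contain: any reference to a parent "Work".
ISRC_PATTERN = re.compile(
    r"^(?P<country>[A-Z]{2})"
    r"(?P<registrant>[A-Z0-9]{3})"
    r"(?P<year>\d{2})"
    r"(?P<designation>\d{5})$"
)

def parse_isrc(code: str) -> dict:
    """Split an ISRC into its four fields, ignoring optional hyphens."""
    match = ISRC_PATTERN.match(code.replace("-", "").upper())
    if not match:
        raise ValueError(f"Not a valid ISRC: {code!r}")
    return match.groupdict()

print(parse_isrc("US-RC1-76-07839"))
# {'country': 'US', 'registrant': 'RC1', 'year': '76', 'designation': '07839'}
```

Every field identifies who registered the recording and when; nothing ties it back to the Requiem it belongs to.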

While the Compact Disc (1982) was designed to hold "The Work," the ISRC ensured the digital infrastructure behind it only cared about the "Track". The consequences of this standard exploded in the late 1990s. As CD-Rs became affordable and "ripping" software became mainstream, the "mixtape" logic went digital. The ISRC might travel with the ripped file, but the "Administrative Anchor"—the physical liner notes of the CD—was left behind. The "Singles" logic was no longer just a marketing tool for labels; it became the primary way consumers interacted with music. The track now had a digital address, but the "Work" had become invisible to the system.

In the physical era, the industry solved its financial risk by building a massive, one-way distribution machine. This system was designed for a world where a handful of all-powerful labels acted as the sole gatekeepers between the recording session and the retail shelf. It was a "One-Stop Shop" model: the label provided the capital, the production, and the manufacturing. Most importantly, they handled the licensing and royalty tracking, the business logic that was once printed directly on the physical copy of the record.

Today, while the creation of music is decentralized, the distribution remains locked in the legacy system. Even though you can now record a Requiem in a cathedral with a laptop, the music still has to flow through "pipes" built in the image of the major labels. (To my fellow musicians: we are talking industrial pipes, not the ones in a Casavant organ—though the digital variety can be just as temperamental.)

In order to find where we need to tap into the distribution pipeline, we have to look at the Aggregator. For the independent artist or media engineer, the aggregator—companies like DistroKid, TuneCore, or CD Baby—is the only way to reach the retail "locks" of the major streaming services. They act as the digital post office that handles the logistics of formatting your work and managing the complex licensing and royalty transactions required now that the "pre-pay" physical model is dead.

The aggregator is the mandatory bottleneck where the "1%" is forced to translate their art into the language of the "99%". Because these platforms are designed for high-volume efficiency, their intake systems are built to be as rigid as possible. If the aggregator's form doesn't have a field for "Work Title" or "Movement Number," that contextual DNA effectively ceases to exist the moment you hit "Submit".
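To make the data loss concrete, here is a hedged sketch of what that "Submit" moment does. The field names and values are hypothetical, not any real aggregator's schema; the point is that anything outside the intake form's whitelist simply vanishes:

```python
# Hypothetical intake schema for a high-volume aggregator.
# Field names are illustrative, not any real service's API.
INTAKE_FIELDS = {"track_title", "artist", "album", "isrc", "duration_sec"}

def submit(track_metadata: dict) -> dict:
    """Keep only the fields the intake form knows about; drop the rest."""
    return {k: v for k, v in track_metadata.items() if k in INTAKE_FIELDS}

rich = {
    "track_title": "Introit",
    "artist": "Example Choir",        # hypothetical ensemble
    "album": "Duruflé: Requiem",
    "isrc": "US-XXX-25-00001",        # placeholder code
    "duration_sec": 212,
    # The contextual DNA below has no field on the form:
    "composer": "Maurice Duruflé",
    "work_title": "Requiem, Op. 9",
    "movement_number": 1,
}

delivered = submit(rich)
# "composer", "work_title", and "movement_number" never reach the platform.
```

Nothing malicious happens; the schema just has nowhere to put the context, so the context is gone.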

If your project fits the 99% mold, popular aggregators work perfectly. But if you fall into the 1%—the world of multi-movement works and complex ensembles—you don't just need a different service; you need a different map. As media engineers and educators, we become the Navigator between the artist and the distribution pipes. Knowing the standard path is easy, but it’s our job to understand the alternatives.

When we prioritize the specialized consumer experience demanded by classical music, we realize that the mainstream platforms (iTunes, Amazon Music, Spotify, etc.) are often the worst place to experience it. As we have discussed, the industry's preference for track-based music has limited the available metadata fields, and it is precisely this information that drives the customer experience. For a service to enhance that experience, it needs to get the data from somewhere; once we find a destination, we need to make sure the aggregator supplying the platform can support the enhanced fields.

As I have discussed in previous posts, I use AI to help me write and do research. When I used Gemini for this post, she found me a list of platforms and aggregators that cater to classical music. When I asked about her sources, she explained something called the ERN (Electronic Release Notification), a standard managed by the industry body DDEX. This is the (rather limited) data format the industry has settled on to supply information about songs being released.

Ignoring Apple Music Classical—which, despite its slick interface, hasn't fully delivered on its promise of fixing the underlying data disconnect for those outside the Apple ecosystem—Gemini identified four primary platforms that recognize extended data from the ERN, allowing works to be surfaced appropriately for the "1%": IDAGIO, Presto Music, Qobuz, and Medici.tv. It then identified the aggregator services Naxos of America, iMusician, Symphonic, and TuneCore as supporting the ingestion or interfaces these specialized platforms need to receive the correct information.

Navigating these destinations requires looking for specific features that affect both the listener's experience and the artist's sustainability. This involves evaluating how a platform handles relational search, whether it supports logical movement grouping to keep a work cohesive, and if it utilizes more equitable payment models—like per-second royalties—that don't penalize long-form recordings. Ultimately, we are looking for a destination that preserves the archival context of the music while providing a functional interface for the user.
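A toy sketch of what "logical movement grouping" means in practice: given the flat, track-oriented list a singles-era pipeline delivers, a work-aware platform reassembles it by (composer, work) so movements stay together and in order. The track data here is illustrative, not pulled from any real catalog:

```python
from collections import defaultdict

# Flat track list, the way a singles-oriented pipeline delivers it.
tracks = [
    {"composer": "Maurice Duruflé", "work": "Requiem, Op. 9",
     "movement": 1, "title": "Introit"},
    {"composer": "Maurice Duruflé", "work": "Quatre Motets, Op. 10",
     "movement": 1, "title": "Ubi caritas"},
    {"composer": "Maurice Duruflé", "work": "Requiem, Op. 9",
     "movement": 2, "title": "Kyrie"},
]

def group_by_work(track_list):
    """Rebuild cohesive works from a flat track list."""
    works = defaultdict(list)
    for t in track_list:
        works[(t["composer"], t["work"])].append(t)
    # Keep movements in their intended order inside each work.
    return {k: sorted(v, key=lambda t: t["movement"]) for k, v in works.items()}

for (composer, work), movements in group_by_work(tracks).items():
    print(composer, "-", work, "->", [m["title"] for m in movements])
```

This only works, of course, if the composer, work, and movement fields survived the trip through the aggregator in the first place.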

For those looking for a deeper review of how these specialized platforms compare in the real world, the July/August 2025 issue of Strings Magazine has an article by Pat Moran that provides a decent discussion and review of some of the top specialized platforms for classical music. I’m also including a summary table of some top platforms and their participating aggregators at the end of this post.

The modern media engineer—and even today's classical artist—is now tasked with the stewardship of not just the performance, but also the data about it. If we rely exclusively on the standard industry pipes designed for the 99%, the context of our work will inevitably be stripped away, leaving the listener with a fragmented and frustrating experience.

By identifying specialty destinations that respect the "Work" and choosing aggregators that support relational metadata hooks, we ensure that the archival integrity of the music survives the transition to the digital shelf. Understanding this infrastructure is what allows us to bridge the gap between a recording session and a listener who just wants to find the Duruflé without the system losing its way or hearing a voice assistant turn a masterpiece into "duraflake California motet."


| Platform (Destination) | Specialized Aggregators | Key Feature Considerations |
| --- | --- | --- |
| IDAGIO | iMusician, Naxos of America | Relational Search: built for Composer/Work/Movement search. Fair Payout: uses a "per-second" royalty model rather than "per-play" to support long-form works. |
| Presto Music | Naxos of America, Symphonic | Contextual Anchor: prioritizes the "liner note" experience with high-resolution audio, digital booklets (PDFs), and integrated press reviews. |
| Qobuz | iMusician, TuneCore (Pro) | High-Fidelity Discovery: focuses on high-resolution audio delivery and extensive editorial context for the discerning audiophile. |
| Medici.tv | Naxos of America, Symphonic | Visual Archive: specializes in high-fidelity video distribution for concerts and operas, maintaining detailed artist and movement credits during playback. |


