Music Research: Measuring Music Burn is Redundant
by Tracy Johnson
Programmers want to know everything possible when making music decisions. That’s understandable. The more detail and insight, the more we can fine-tune the playlist. It’s part of the pursuit of a perfect radio station.
However, in chasing that data, we often miss the forest because we’re focused on the trees. That’s why you don’t need to track fatigue ratings on songs. Asking questions about music burn is redundant.
At worst, including a separate question about burn may be damaging the scores of hit songs. At best, you don’t need the question because you already have the information.
Five Reasons Music Burn Is Redundant
When testing music online, survey fatigue is normal. Streamlining the response process improves the listener experience. The easier it is to navigate your survey, the better your results.
Yet some stations ask for three separate responses for each song. With a survey of 20 or 30 songs, that’s an eternity for a music respondent.
They ask for 1) familiarity, 2) to rate the song and 3) whether they’re tired of it.
In focus groups, it’s common for listeners to say they “used to respond” to a station’s music surveys. Why have they stopped? The most common response is it “takes too long” or “it’s boring”.
How long is too long? About five minutes, they tell us. That’s an eternity when completing an online survey.
Eliminate Useless Questions
So why ask questions you don’t need? Asking three questions per song adds length and survey fatigue to the process.
Not only does it slow the survey, it’s already baked into listener response when they rate the song. Think about it: If I’m tired of a song, won’t I rate it lower than if I love it? Of course!
Even if burn is starting to rise on the occasional song that still scores well, does it matter? As long as they love the song, play it! And play it often. When popularity fades and burn rises, you’ll know. It’s reflected in lower scores.
You have the burn data, but it’s not shown as its own value. Asking a separate burn question may be causing you to drop some of your most valuable songs.
Music Burn is Relative
Whether a listener feels tired of a specific song is relative. What does “tired” even mean? One respondent may have a lower tolerance for burn than another. The question is subjective, so it’s not actionable.
Some folks are black or white respondents. They either love it or hate it. They are either tired of it or not tired at all. Others see the world in shades of grey.
Further, though I’m tired of a song, it doesn’t mean I don’t like it anymore. Nor does it mean I’ll turn it off. Today, in a world where so many stations share titles and music is everywhere, fatigue sets in quickly. Reacting to what was once considered high burn can lead to churning through new songs faster than we should.
Another factor is the composition of the music tested. It gets even more cloudy when recurrent or gold titles are next to newer or current songs. When an older song follows a recent one, burn perception increases. The fatigue score will rise more than when testing all gold or recurrent titles.
Music Burn is Irrelevant
As you can see, it’s hard to measure whether the audience is actually tired of a song to the point of tuning out. Some respondents have an attitude that all stations play all songs too often. When scoring burn, they aren’t reacting to the song as much as protesting repetition. This distorts the data and skews results.
However, if the structure of the song rating system is solid, the information is already in your hands. If they’re tired of the song to the point that they’d tune out, it’ll score lower. How many times have you seen a top-testing song with a burn score of 40 or 50? And how often do you see a poor-testing familiar song with a low burn score? Sometimes, but not very often.
If I don’t like the song, hearing it once is too much. If I love it, you can’t play it too much.
Music Burn Happens With Or Without You
Most stations view music research in a vacuum, through their own station’s lens. Some even ask the burn question of “Should we play this song more or less?”. That’s a weak question. It doesn’t take into account the variables that go into answering such a question.
Burn doesn’t happen because your station plays a song more often. Nor is it reduced if you play it less. Burn is station agnostic. Burn is cumulative, building every time an individual is exposed to a song. They may be hearing it on Spotify, on YouTube, on your competitors, and on television. It doesn’t matter. Each contributes to listener perceptions.
There’s nothing you can do to manage how often other outlets play those songs. Don’t let rising burn scores cause you to reduce airplay on otherwise strong titles. Moving out the hits to make room for new, less familiar and less popular music is a bad trade. Guess what happens then? Yep, ratings decline.
Music Burn: What’s Your Number?
It’s impossible to determine the point at which a song becomes too burned to play. When does the burn percentage tip a song from power to secondary? Or from on the air to off?
Is your burn tolerance 20%? 25%? 30%? 50%? Why do you choose that percentage? It’s fueled by your gut. There’s nothing wrong with using your intuition. But isn’t the purpose of a music survey to get data to feed that gut? So how can you apply the burn score? At best, it’s an inexact science.
Again, if the score is holding up and favorites are high, does burn matter?
Asking About Burn Increases Burn
Finally, you may be creating your own burn problem by asking the question. There may not be a problem until you ask them to think about it.
Asking “Are you tired of this song?” invites the respondent to go looking for fatigue, when in reality it may not be an issue. They may think, “Hmm, now that I think about it, yes I am tired of that song”.
So they click the box that says they’re tired of it. Or they may be a little tired, but they exaggerate how they feel and click on VERY TIRED.
It may have nothing to do with behavior, taste or feelings about the song, but asking a separate question distorts the burn measurement.
In a perfect world, we could ask many questions about each song. We’d compile a definitive profile that guarantees it’s in the perfect rotation. The station would play nothing but popular, familiar hits with no burn and high favorites.
But music, research and radio are messy. And the methods most stations use to gather music data are far from perfect.
Is it worth introducing burn fatigue in a music survey by asking an extra question? Is burn even actionable data? Are you wasting respondents’ time and driving them away from your survey?
In the final analysis, what would you rather play:
A song that a third of the audience says they’re tired of, but still loves? Or a song that everyone rates as average, but has no burn?
Don’t over-think it. Play the hits. Ignore the burn.
Tracy Johnson specializes in radio talent coaching, radio consulting for programming and promotions and developing digital strategies for brands.