Lori Kaplan, NPR Director, Audience Insight & Research

Once upon a time, NPR was an acronym for National Public Radio. That was then. Now, the company refers to itself as "a thriving media organization at the forefront of digital innovation." The formerly pure-play broadcast organization is now as likely to distribute its content via live streams, iPad apps and social media as over the airwaves.

We caught up with Lori Kaplan, NPR's director of audience insight and research, to discuss how NPR is measuring and acting on data from increasingly disparate sources, and to learn how non-profit metrics might differ from those used by traditional media organizations.

Can you talk a little bit about how you’re set up to do measurement at NPR and some of the metrics that are the most critical to you?

Sure. Just to clarify, are you speaking solely about our digital space?

Your digital space, but also how it relates to the larger broadcasting entity, because I’m assuming one informs the other. But you can correct me on that.

Tying the two together is a challenge. But certainly I can speak to the tools and methods we are using presently. For our day-to-day metrics tracking, we use Omniture to track all of our activity on the web. But we also subscribe to Nielsen NetRatings, so we can see how we fit in the competitive space. We also use ForeSee to track overall resonance and satisfaction with the sites specifically.

What are some of the key metrics and the KPIs you’re looking at, given you're non-profit?

I don’t know that they’re particularly unique to the non-profit space, but maybe they are. On a weekly basis, we produce a dashboard internally. It has pretty wide distribution across the organization. We’re looking at page views and visitors and media requests and mobile page views. The dashboard’s pretty wide-ranging, trying to understand where our top sources of traffic are and our top stories. So it seems pretty typical for media organizations, as opposed to non-profits. We’re trying to track and understand engagement.

Do you have a definition of engagement?

Yes, exactly [laughs]! So, the items we list under the engagement metrics are page views per session, sessions per visitor and bounce rates, just some pretty typical online measurements. But we’re not necessarily satisfied with that, so I want to describe efforts we engage in outside of standard tracking, because that’s really important to us. Obviously, NPR formed as a radio entity first and foremost. Just for a little bit of history, back in the '80s there was a lot of resistance in the system internally to even track our performance because that was "too commercial". No one wanted to know how the programs were doing internally, because that’s not "us".
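For readers less familiar with these terms, the engagement figures Kaplan lists reduce to simple ratios over session-level data. The sketch below is purely illustrative (hypothetical field names, not NPR's actual analytics setup) and shows how page views per session, sessions per visitor and bounce rate might be computed:

```python
# A minimal sketch, not NPR's pipeline: hypothetical session records,
# each representing one visit with a visitor ID and a page-view count.
sessions = [
    {"visitor_id": "a", "page_views": 5},
    {"visitor_id": "a", "page_views": 1},
    {"visitor_id": "b", "page_views": 3},
]

total_sessions = len(sessions)
total_page_views = sum(s["page_views"] for s in sessions)
unique_visitors = len({s["visitor_id"] for s in sessions})
single_page_sessions = sum(1 for s in sessions if s["page_views"] == 1)

page_views_per_session = total_page_views / total_sessions  # 3.0
sessions_per_visitor = total_sessions / unique_visitors      # 1.5
bounce_rate = single_page_sessions / total_sessions          # share of one-page visits

print(f"Page views per session: {page_views_per_session:.2f}")
print(f"Sessions per visitor:   {sessions_per_visitor:.2f}")
print(f"Bounce rate:            {bounce_rate:.0%}")
```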

So, the heavy lifting had taken place during that time period in the mid to late '80s to get people to even pay attention and understand that it’s not a public service if people aren’t listening. I mean honestly, if we say that we’re a public service organization and we have a mission, in order to fulfill the mission we need people to listen and we need to track that and understand that.

You need a public.

Yes, we need a public, an audience. Right. So, we climbed that hurdle, and we do have attention being paid now to the audience. But the larger question that we ask ourselves is, "How is what we’re offering resonating with the audience?" To that end, we engage in monthly program resonance tracking. What does that mean? We have a panel of about 35,000 NPR listeners who’ve opted into participating in online research with us.

This is the panel I understand you set up.

Exactly. NPR Listens. We began it – oh gosh – was it 2006 or 2007? So, I helped launch that panel, and it certainly has grown considerably. We had a number of limitations regarding how we could recruit for that panel, because NPR, while we are a program producer and distributor, is a member station organization; we do not own any radio signals. We can’t recruit directly on-air, so we partnered with stations to make that happen.

Program resonance tracking is simply a survey process we go through for each of our programs to understand the change and the relationship with what we’re offering, trying to understand why people are tuning in. Are the reasons for tune-in changing over time? Are there any changes in listening patterns? Then we ask a number of questions [such as] if there have been changes, what are the causes of those changes?

What form does this take? Is it an online panel?

Yes, it’s an online panel, and we use pretty standard surveys.

How much does the research that you do at NPR integrate with or trickle down to the member stations?

We’ve tried to create venues to share the information, but as with any complex organism, it’s kind of challenging. Even within a department, information-sharing isn’t perfect. When you think about it within an organization and then within the larger entity, it’s challenging. We have an extranet site in which we share information. We present at various regional conferences. We do webinars. There are a number of different ways in which we share information. But it’s not a perfect sharing of information.

Any plans to improve on that?

We’ve just actually as of this week launched an internal – and internal meaning NPR and NPR stations – blog. We have an external blog called Go Figure that’s open to anyone. But we have license limitations in terms of what we can share externally to the full public. With stations, we’re able to share more broadly. We’ve just opened that up this week, and we’ve had new people join. We want to share and be much more open than we have in the past.

I take it you’ll measure that as well.

Yes, of course.

NPR has been an early and avid embracer of social media. Can you talk about those efforts?

We have worked with our social media team to understand that space better. We’ve conducted a number of pieces of research for them on Facebook and on Twitter to understand what’s driving their interest, how often they want to hear from us and what they want to hear from us, and to gauge their interest in a variety of different types of communication. I’d be happy to share the results of both of those studies with you.

In addition to measuring your web sites and, obviously, broadcasting, I imagine you’re collecting downloadable information as well. You’ve been very, very active in podcasting. Can you talk about those metrics and how you’re putting them to use?

I’m looking at our dashboard. We do track on a weekly basis how we’re doing in terms of downloads for each of our podcasts, though we tend to look at a monthly average. We also track our app downloads on a weekly basis.

It’s [data] that’s shared throughout the organization. But while it’s something that’s shared, it’s not tied to—how do I say this? It’s not tied to an overall assessment. I don’t know that we’ve internally set goals of achieving X by X period of time. We want to do well in the space. But there’s not a target goal we’ve set to achieve.

Can you give me some examples of how metrics have been acted on within the organization?

There’s a moment of silence, because I’m not in the best position to describe that. I know that the audience research department has a metrics component and a primary research component. The primary research is what has really been driving our actions over the past year or two, more than metrics tracking. When I think about all of the product development that’s taken place over the last 18 months – whether it’s been the development of the iPad app, the news app, the music app or the revisioning of the NPR.org web site – a lot of that has been driven by primary research and the feedback we were getting as we tried to understand the marketplace, our listeners and what their expectations were. Once we acted on those, we saw—I mean, we saw outstanding uptake in what we were offering, so we [regarded that] as success. But with the apps we didn’t necessarily have any benchmarks to compare them to outside of NPR.

One of the big problems any organization has is: We’ve got piles of data. What do we do with it?

Kinsey Wilson, who’s our VP of digital media, said, "You know, we’re getting all of this data, this monthly data, this weekly data. But as an organization, we need to step back and understand what the trends are, what are the key thematic issues emerging, so that we can get a bigger picture and understand what kind of decisions we should be making or thinking about going forward." I don’t feel like we’ve really cracked that nut.

A lot of the data you’re looking at is really so new. You’re talking about data from your iPad app, and the iPad has been on the market, what, five, six months? Isn't it very hard to look at that in a historical perspective and start making decisions?

Absolutely, and we don’t have a lot of other competitive data, which is why I think we’ve relied a lot on additional primary research. When I say we’re resource-stretched, I mean how we’ve gone about doing that primary research is very, very scrappy. We recruit people off our panel and ask them to come in and speak with us, do sort of one-on-one usability testing or one-on-one interviews. For other items, we’ll take something out to our local coffee shops, sit down and recruit people there and give them $5 gift certificates at the coffee shop just to get them engaged in providing some feedback.

What are some of the more surprising findings you’ve uncovered in your tenure: data points, trends or insights that really were unexpected?

I’m continually surprised. I’d rather talk about a challenge than a surprise, which is that we see everything in silos at NPR. We measure our broadcast audience through Arbitron. We measure our online audience through Nielsen and Omniture. We have download information about our podcasts and about the iPad and iPhone applications. But we have no understanding of how these platforms interact with each other and how an individual is really using our content across platforms. It’s such a challenge for us to try to understand how to move forward without a better understanding of that cross-platform usage and the expectations—and how those expectations differ.

I think we’re starting to get a handle. We just completed a market segmentation study, and we have some clues and some hints about different—in general different expectations for different platforms. But we don’t really understand how people are using all these platforms together. I sound almost grim, because it just feels like it’s such a barrier to overcome. If we could understand that better, then we could—you just have a better sense of how to move forward. Right now, we’re taking some—we’re just taking some educated guesses.

Do you see programming consumption or programming trends broken down for, say, different shows across platforms? I can tell you anecdotally, as a listener, there are programs I only listen to via podcast, because they’re on at inconvenient times locally or they’re not on at all.

Right, oh definitely. So just to speak to that a little bit, our two biggest, highest performers on broadcast are "All Things Considered" and "Morning Edition". But in terms of podcasts, "Fresh Air" and "Talk of the Nation," maybe "Wait, Wait" top the list, because people might not be listening on the weekends, or people might not be able to listen in midday, but they still want to hear the content.

So there’s definitely a different pattern. Then in terms of social media, that’s very granular. It’s not at the show level; it’s at the level of whatever Andy Carvin has decided to post and whether it seems to resonate. So that’s more of a story level. We see all of these things, but we don’t know whether the same person who’s listening to "Morning Edition" is also doing the podcast listening and is also clicking on the stories that have been posted to Facebook or to Twitter. That’s where the challenge comes in for us.

Then, of course, podcasts aren’t that measurable. You can measure downloads, but unless there are streams you never know.

Oh, don’t even start! Exactly. We know downloads, but we don’t know listening. We had been approached three years ago by Arbitron, and they indicated they may be able to work with us to measure actual listening to podcasts.

But we’ve kind of moved on, at this point, to wanting all of mobile audio. But they’re not quite there yet. Their tool, the PPM measurement device, which is rolled out in almost all the top 50 markets, or will be by December, doesn’t necessarily measure people using headphones very well. You’d have to attach the tool to a headphone measuring device and then put your headphones on. People just aren’t going to do that. We haven’t pursued that option because we feel we’re not going to get good data. The point is we don’t have an adequate measurement system for actual listening to podcasts.

You’re hardly alone.

Right. When we look ahead, we’re thinking about a few different trends that are taking place. We know about the 12 to 24 year old group—a recent Arbitron study found that back in 2000, 79 percent of 12 to 24 year olds were listening to radio in the morning. As of 2010, that number has dipped to 41 percent. So there’s just a huge change in behavior. They’re listening on their smartphones or their mobile devices.

And they don’t have television anymore.

Television - who needs it? You can stream what you want, right? So, there’s just a whole sea change in behavior. We know Pandora’s going to be in those cars real soon. Really soon. So what does that mean for us? We’re just trying to figure out how we ensure our space, not just in the morning, but in the car as well, because we know that when people—it’s somewhat anecdotal. But when people put those apps on their mobile devices, Pandora shows up there. We need to make sure we’re in that space as well.

In the best of all possible worlds, what is the next tool or capability you would have for measurement?

There would be some way to understand audio listening and engagement across all of our distribution platforms.

What didn’t I ask you about your job or the tasks at hand that I should have or that you’d like to share in terms of best practices?

I kind of laughed when you were talking about best practices, because I felt like we’re still making our way. When our goal is really public service, how do we measure that? We had a number of ways we were thinking about measurements of public service in broadcast, and we had developed a number of metrics for that. But we don’t quite know what those corollaries are in the digital space. That’s another item we’re working on. How does that translate? How does the experience of the dedicated public radio listener translate in the digital space? What does that look like? It’s a challenge!

Rebecca Lieb

Published 11 November, 2010 by Rebecca Lieb

Rebecca Lieb oversees Econsultancy's North American operations.



Comments (1)


Will

I want to know the future of local stations. With a few exceptions local stations have little to offer except a weather forecast (and I get that on the Internet). Will you have to continue beating the hell out of these stations for money? Why not just one big NPR fund raiser each year?

The future of local stations is online. A good outlook for them. They would have to look for underserved potential listeners and unique themes that the public could take or leave. I could see an annual or semi-annual directory of the 100 most interesting stations.

I despise my local station. I use it only to get NPR news, and all the chopping up of the news for what passes as local interest has me turning off NPR altogether. This happens regularly. Once I'm on the oldies station it's rare to return to public radio.

