By Nicole Blanchett Neheli
The use of metrics and analytics is having a significant impact on news practice. I know this because I’ve spent years researching the use of audience data for my PhD. This included spending time in six different newsrooms within four different media organizations in three different countries, interacting with 69 participants, and completing more than 43 hours of interviews.
If you want the scholarly version of my findings, including a comprehensive literature review, details of the methodology, and outlines of practice from each site of study, you can access my thesis online. In this post, I offer an extremely truncated version to make the research more accessible to a wider audience.
Before getting into detail about how metrics and analytics are being used in newsrooms with a variety of funding models and audiences, it’s important to define just what I mean by metrics and analytics. In the process of my research, I noticed that scholars, journalists, and other newsworkers would often use the terms metrics and analytics interchangeably, even though they do not mean the same thing, or they might refer to a platform’s dashboard that displayed metrics (in other words, the analytics system itself) simply as “analytics.” With input from study participants, I developed these definitions*:
- Metrics are units of measurement that reflect a specific element of audience behaviour (e.g., pageviews, the number of “views” an article received)
- Analytics encompass the analysis of audience data as a means of performance appraisal of existing content and the development of hypotheses to improve audience engagement in the future (e.g., if a lot of people like a particular story, that can be perceived as a measure of success, a type of performance appraisal; interpreting what it was about the story that people liked is the hypothesis, a prediction that the specific element can be replicated in another story to build engagement/traffic)
- Analytics systems are platforms specifically designed to aggregate, display, and assist in the reporting and analysis of audience data (e.g., Chartbeat, Google Analytics, Omniture)
Now to the primary findings, which I’ll present in a list, followed by some context.
1. Promotional and developmental gatekeeping
Metrics and analytics are used in two very distinct ways in newsrooms. Specific metrics like pageviews are often used in real time to help figure out where to position content on a website, what I term promotional gatekeeping. So, for example, if something is generating a lot of views or a big response online, it will likely stay in a prime position on the website and be highly promoted on social media. If it is not attracting readers or attention, it will be taken off or moved farther down the website, something called “de-selection.”
Taking a longer, deeper view of a specific metric or a combination of metrics, or using analytics, is what I call developmental gatekeeping. This happens when audience data are used to assist in editorial decisions like what stories to cover, how those stories will be formatted, what time of day they will be posted, and what platform(s) they will be published on. There are a number of factors that determine how metrics and analytics are used in an individual newsroom, but it often comes down to economics. Less money means less time and fewer people to perform deeper analysis, and factors such as traffic targets tied to advertising can increase the chance of an organization putting clicks over quality.
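To make that distinction concrete, here is a minimal sketch in Python. It is purely illustrative: the story fields, the one-hour and 30-day windows, the placement threshold, and the topic labels are my own assumptions, not features of any newsroom system I observed.

```python
# Hypothetical sketch of the two kinds of gatekeeping.
# Story fields, time windows, and thresholds are invented for illustration.

from collections import defaultdict

stories = [
    {"id": 1, "topic": "courts",  "pageviews_last_hour": 1200, "pageviews_30_days": 45000},
    {"id": 2, "topic": "weather", "pageviews_last_hour": 90,   "pageviews_30_days": 2100},
    {"id": 3, "topic": "courts",  "pageviews_last_hour": 300,  "pageviews_30_days": 38000},
]

# Promotional gatekeeping: a real-time metric decides placement.
# Anything under the (arbitrary) threshold is "de-selected".
def homepage_order(stories, threshold=200):
    promoted = [s for s in stories if s["pageviews_last_hour"] >= threshold]
    return sorted(promoted, key=lambda s: s["pageviews_last_hour"], reverse=True)

# Developmental gatekeeping: a longer view of the data feeds a hypothesis,
# here the average 30-day views per topic, which might shape future assignments.
def average_views_by_topic(stories):
    totals, counts = defaultdict(int), defaultdict(int)
    for s in stories:
        totals[s["topic"]] += s["pageviews_30_days"]
        counts[s["topic"]] += 1
    return {topic: totals[topic] / counts[topic] for topic in totals}

print(homepage_order(stories))          # stories 1 and 3 stay; story 2 is de-selected
print(average_views_by_topic(stories))  # {'courts': 41500.0, 'weather': 2100.0}
```

The first function mirrors promotional gatekeeping, where a single real-time metric decides placement and de-selection; the second mirrors developmental gatekeeping, where a longer view of the data informs a hypothesis about what kind of coverage engages readers.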
2. Changing formats and routines
In an effort to expand reach and build scale, newsrooms are using metrics and analytics in ways that change and shape routines and story formats, and such practice has the potential to impact public discourse in both negative and positive ways, what I describe as typical and positive media logic.
For example, if an organization is just shoveling whatever it can on its homepage to cycle ad impressions, that could be described as typical or even negative media logic. If an organization is using analytics to determine how best to format stories in a way that will get readers to absorb important information and pull them through a story, that’s positive media logic.
3. What you do has more impact than where you are
Differences in journalistic practice related to metrics and analytics, or how much the use of audience data impacts editorial decision-making, are often more pronounced when comparing position to position than when comparing different organizations in different geographic locations.
For example, a digital editor and a reporter working in the same newsroom might have entirely different views on the value of metrics and analytics and, although the digital editor might use them consistently, the reporter may never look at them. However, digital editors managing websites in two different countries are likely both using metrics to position content on news websites. I observed this in everyday practice, including the use of terms like “doing well,” “enhancement**,” or references to riding a “wave” of traffic that were repeated on various digital desks at my sites of study. This idea was affirmed further by other research, including a recently published article on African newspapers where observed practice, and even some quotes from participants, were strikingly similar to my own data. This isn’t surprising, though, when you consider how universally shared analytics systems and delivery platforms might foster shared practice and language.
4. The impact is bigger than acknowledged
Most journalists/newsworkers acknowledge their work is impacted, to varying degrees, by the use of metrics and analytics, whether that be in the way the content they create is promoted or the actual process of story selection and development. However, they sometimes understate the influence of audience data.
For example, a reporter might acknowledge that where their story gets placed on a website is largely based on metrics and/or analytics, but then say the actual content they produce is not impacted by audience data. At the same time, they might share how they’re being asked to brand themselves and create more versions of stories throughout the day, two strategies often related to increasing traffic.
5. Little time for reflection
It takes reflection for journalists/newsworkers to evaluate the influence of metrics and analytics on their work and how they might best use audience data; however, there is little time for such reflection. At one of my sites of study, reporters were doing an average of 10 stories a day, serving multiple platforms and publications, and using templates, rather than production workers, to lay out their stories. At all sites of study, most participants were working at a pace that left little time for deep thinking about anything.
6. Journalistic standards matter…when there’s time
Journalists/newsworkers still agree that traditional journalistic standards have value, but they often can’t live up to those standards because of time, resources, or required routines. Each newsworker operates on a spectrum of practice directly shaped by factors like the way their organization is structured, which is generally tied to economics.
For example, one digital editor talked about posting content that made her cringe but recognized that posting it meant the digital desk “did well” because of the number of pageviews the article garnered. Another reporter talked about strategies for quickly editing press releases so the publication still looked like a “real paper.” Several participants talked about the fact that speed trumped accuracy on things that did not damage the context of the story, such as spelling. Older journalists were often more keenly aware of slipping standards, perhaps because many younger journalists had never worked in an environment where there was a hope in hell of upholding such standards.
7. Professional boundaries are breaking down
Professional boundaries are breaking down in newsrooms as a result of changing routines, expectations, and external influences. Analytics companies directly influence production by placing importance on certain types of audience data, sales teams and news staff share the same audience data and sometimes develop content together, and the audience, through collected data, directly impacts decision-making.
At all of the organizations that took part in my study, an effort was being made, to varying degrees, to figure out what the audience wanted and then deliver it. But, as I discuss further in my thesis, a lot of this figuring out involved using data that were misinterpreted, metrics that could be “gamed” or taken out of context, like pageviews, and little to no direct consultation with the audience, either in person or online.
8. Protecting territory
Change often results in discord and differences in relation to the acceptance and implementation of new practice. This is true anywhere, not just in a newsroom. However, in this study discord was often most evident between distinct groups: for example, reporters upset that digital editors weren’t promoting their original work, journalists who did not want to take advice from technologists/data analysts on how best to format work for better engagement, and older versus younger generations of journalists.
9. Who is a journalist?
New roles and the increasing tasks that come with changed boundaries are shifting not only what are considered acceptable methods of journalistic practice but also who is considered a journalist. Again, that doesn’t come without discord. One participant in my study referred to digital editors using metrics to manage the website as “technicians,” not journalists, and those digital editors often did more curation than creation of content.
Here are a few other important notes based on what I saw in newsrooms.
Swiftly changing practice
Practice is changing quickly. Over the time I was gathering data, significant transformations were happening in the newsrooms I studied, some of which were in line with what is being described as an “audience turn” or, in other words, focusing on engagement with the audience versus just shoveling content at them. However, this is easier said than done, and how you’re engaging with your audience isn’t as easy to explain to advertisers or executives as something like a pageview, which resembles more traditional measures of success like circulation numbers or TV ratings.
That means that, despite acknowledgment that relying on them is rife with pitfalls, and regardless of whether the newsroom is publicly funded or privately financed, traffic metrics still play a primary role, something identified in this Twitter thread in response to an article in the Columbia Journalism Review. Often, though, that’s not readily acknowledged by those working in a newsroom because it conflicts with journalistic values. However, more transparent discussion of how metrics and analytics are used, and why, would serve everyone better. You need to honestly examine practice before trying to improve it.
The importance of the long tail
You might be familiar with the term “the long tail,” which basically means the long-term performance of a story/piece of digital content. Something that receives negligible attention when it’s first posted might slowly build significant views, or something that received a lot of attention at a certain point can resurface due to a second wave of sharing on social media.
There were multiple examples of this at my sites of study, and it means three things. First, there is really more of an “endless tail” than a long tail, and newsrooms need strategies for dealing with a story that might pop back up in public discourse and be taken out of context. Second, ensuring accuracy the first time around is especially important because you never know how or when information will resurface. Third, looking at audience data in the short term provides limited information; the most valuable data come from comparative context over longer time periods. What appears to be widely popular in the moment may not even make the top 20 most popular stories when looking at a month’s or year’s worth of data.
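A toy example, using invented numbers rather than data from my study, shows why that comparative, longer view matters:

```python
# Toy illustration of the "endless tail": 30 days of pageviews for two
# hypothetical stories. The numbers are invented, not data from my study.

viral_day_one = [20000] + [50] * 29        # big day-one spike, then little attention
slow_burner   = [300] * 20 + [4000] * 10   # modest start, resurfaces through social sharing

print(sum(viral_day_one))   # 21450 views over the month
print(sum(slow_burner))     # 46000 views over the month

# Ranked on day-one views, the spike looks like the clear winner;
# over the full month, the slow burner earns more than twice as many views.
```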
Importance of the homepage
As I have talked about previously, although social is important for drawing casual clicks, the homepage is often where loyal readers go to find content. As such, promotion and curation on the homepage should be a priority. However, that doesn’t mean the homepage needs to be continually curated as if there were breaking news every second. More experimentation and study are needed in this area to find a better balance between the time spent curating, the reward for that curation, and audience expectations of how often a site needs to be updated.
Make video count
Another clear finding, supported by other research, is that hardly anyone goes to a news site to watch video. People prefer to read their news online and watch it on TV. That means if you’re going to spend time and resources putting video on a news website, make it worthwhile. At one site of study, a reporter was asked to shoot video of a book to accompany a story. Just the book. Sitting there. That’s a waste of time and resources. Another site of study found that 2.6% of the videos it posted garnered more than 50% of overall views, and that the best strategy was to post fewer but really strong videos: things that were better to see than to read.
Support from the top down
Something else that was very clear: the use of metrics and analytics had to be supported by everyone in management in order to change the culture of the newsroom to one accepting of audience data. One of my participants said, “turning a metric into a goal…corrupts a number” because people will try to “cherry-pick” or interpret metrics in a way that best serves their own or their department’s goals. Evidence from my research shows that does happen. However, it also shows that news organizations that are candid about what the numbers are saying, and how they are being used, are more likely to reevaluate their goals in a productive manner.
This includes being willing to try everything and anything in terms of story formatting that might better engage the audience, looking at metrics on a deeper level to form a better understanding of how to connect with the audience, and then consulting the audience to confirm data are being interpreted correctly.
Talk to the audience!
Although there is a certain amount of audience participation in the basic use of data to determine what the audience wants, what I call participative gatekeeping, it’s often surface level. Institutionalizing better feedback mechanisms, whether that be through social or even in-person interactions, and, therefore, building relationships, will increase the value of audience data and lead to conversations that will ensure coverage of what matters, and what should matter, to a specific community.
Metrics and analytics can lead to an obsession with attracting eyeballs over information sharing or they can help advance inventive storytelling and a better connection with the audience. Like so many other tools, their impact can be positive or negative…it’s how you choose to use them.
_______________________________________________________
*This definition was first published in this paper.
**Enhancement refers to the strategies used to promote a story on the website or in social in order to boost traffic/engagement; these commonly include changing the headline, the picture, or where the story is physically positioned on the website.