Adjusting strategies and tactics in newsrooms used to take months of work (and a lot of faith in limited info). Now, AI and data tools can help you learn and make changes in minutes. It’s time for your newsroom to catch up.
By Tim Wolff, Vice President of TV and Digital Publishing Innovation, Futuri
It was the big moment in every newsroom: the day the sweeps ratings arrived. Early in my career, “diary markets” were prevalent. A small sample of viewers would be given books to write down what they watched each day. At the end of the sweeps month, they would mail those books to the ratings company, which would compile them, turn them into ratings spreadsheets, and send them to the stations. Newsrooms wouldn’t find out how many people watched their newscasts until weeks or months after the newscast aired (assuming, of course, people religiously wrote down exactly what they watched and when they watched it, which was far from a sure thing).
That was why, every day, I couldn’t wait for the overnight ratings, generated by meters placed on some viewers’ televisions. As a news producer, I wanted to see what kind of impact the hundreds of little decisions I made had on the ratings. Getting a spreadsheet the next day that showed how the newscast performed in 15-minute increments was the best I could do in trying to figure out whether my strategy and tactics had worked.
In truth, it didn’t help much on any individual day. A tenth-of-a-point change in a quarter hour could stem from any number of factors, from the quality of the newscast to the weather outside to an exciting baseball game on another channel. Occasionally the numbers revealed a dramatic mistake (like the country music interviews at 10pm that dropped our second quarter hour by 10 points), but those moments were rare. Overnight ratings were valuable for long-term assessments, but not for the hundreds of decisions that go into a live television newscast.
Then I began to get really excited about minute-by-minute reports. These were detailed reports showing each minute of the newscast with ratings increases or decreases, and they included all the other channels, too. That way, I could see whether my B-block tease kept viewers around or sent them flipping to the competition. It gave some idea of which stories were retaining viewers and which were turning them off. It wasn’t perfect, since many stories run shorter than a minute or overlap across minutes. It also wasn’t available until three days after the newscast aired, by which point I’d usually forgotten what was even in that newscast. Still, with a lot of tracking and work, I could use those reports to learn what did or didn’t make viewers go away.
The digital metrics rush
When I first came to digital in the mid-2000s, the availability of metrics was like an adrenaline rush. I could see how well each individual story performed. It wasn’t in real-time then, with a lag of a few hours and a full report not ready until the next morning. Still, it was amazing to me, as I could track all these individual stories to learn the trends of what performed well on our website (there were no apps or real social media then).
I moved back to broadcast for a while, and metrics tools improved. Eventually, we had a tool that showed real-time performance for individual stories on a monitor right above my desk. I’ll never forget how shocked I was when I glanced up and saw a relatively minor story getting 600 live page views (about 10 times our typical top story then). I asked one of our digital team members what was happening. It was Facebook, she told me; the story had gone viral, and it appeared there wasn’t much rhyme or reason to it.
Over time, as I returned to focusing on digital, the measurement tools got faster and helped us focus on which stories to spend our limited resources perfecting and promoting. But they were all really doing the same thing: tracking metrics. All these tools could tell us what our consumers were consuming, but the tools couldn’t tell us why, or what might work next. For that, we needed teams of smart people analyzing and tracking what worked so we could refine our strategies and help our producers figure out what they should post—and when they should post it. Given the mercurial nature of which stories take off on social and which ones fail, it was still a guessing game.
Data helped define strategy…slowly
As we gathered more data on our social brands (and even on our talent’s social brands), we were able to continuously refine strategies, and help give everyone a better chance of creating winning social posts. But even with our team focused on the metrics, it was a ton of data to manage, especially as we tried to be competitive with posting 24/7. What I really wanted to be able to do was track data across our entire market, not just on our brands. Of course, there was no way our team was going to be able to effectively do that. Having people dive in to generate reports just on our pages and our competitors’ pages was already an onerous task—and accounted for just a slice of the social media usage among our consumers. More and more, the data was becoming available, but there was no way for us to track and analyze it all.
That’s where “big data” processing and machine learning really began to come into play. We needed tools that could ingest data from all the millions of posts our consumers were seeing, then present that data in a way we could learn from and use. The next game-changer was machine learning, the artificial intelligence that could spot the trends in all that data. As more and more data came into the machine, the AI could learn not only what was working, but what was likely to work. This created an amazing opportunity.
Futuri’s TopicPulse is at the forefront of this AI-in-the-newsroom revolution. With years of data from more than a hundred thousand sources, the tool has insights from the performance of trillions of posts. With AI, TopicPulse uses those insights to predict which posts are going to go viral, and which ones are about to lose all their momentum. This gives newsrooms actionable, immediate information that digital producers and social teams can use to truly engage their consumers, drive traffic, and make sure they are spending their time on the stories the community wants.
Even with our great team using human eyeballs to pore over a (relatively) small data set, we would never be able to use data like that. Instead, our team could focus on writing better posts rather than staring at spreadsheets and hunting for trends.
AI is coming…AI is here
AI has a growing presence in newsrooms in other ways, too. A number of projects have studied how journalists write stories, and AI now allows computers to take a set of facts and produce stories that read as though journalists wrote them. These systems are common in financial reporting and sports (some video games, like FIFA and Madden, were early adopters). You’ve probably read these stories without even realizing they were written by computers.
Recommendation engines on websites have been powered by data and personalization for years (what did Lilly do before those AT&T commercials?). There are even AI-powered broadcast anchors on television, like China’s Xin Xiaowei.
For local newsrooms, these kinds of advances in AI are coming from many directions. The great benefit is that the data helps newsroom leaders make better, more informed decisions. They can also spend more time on creating great newscasts instead of slogging through limited and late-arriving data.
We have come a long way from waiting a month to find out whether anyone wrote down that they watched.
Tim Wolff is Vice President of TV and Digital Publishing Innovation at Futuri. He has 20+ years of experience as a digital and broadcasting leader and has led top-performing teams across the country at companies including Gannett, Belo, and Cox Media Group Ohio, which includes three daily newspapers, three radio stations, WHIO-TV, and more. Wolff, who holds a Master’s in Journalism from the University of Missouri, also makes a mean green chile stew.