Today, large newsrooms are introducing artificial intelligence (AI) into their work, and smaller newsrooms are interested in the technology too, even if they can’t implement it yet. Some predictions say that 90% of news will be written by AI by 2025; in fact, you’ve likely already read a sports story or election rundown that was at least partially authored by an AI.
AI can be broadly understood as any technology that simulates human intelligence: extracting patterns from data, predicting future events or adapting its performance based on past mistakes. Not all AI is futuristic: transcription software, for example, uses AI to recognize and generate words from an audio file.
AI isn’t meant to replace the work of journalists. Instead, AI takes over repetitive, simple or data-intensive work so that human journalists can focus on stories that require creative insight, multifaceted analysis and good judgment.
In 2019, Polis, the London School of Economics’ media think tank, and the Google News Initiative partnered to create the JournalismAI initiative to promote the use of artificial intelligence among journalists. The JournalismAI Fellowship Program began this year, with the goal of innovating new tools that assist the work of journalists.
To learn more about how AI is influencing journalism, I interviewed initiative manager and team lead Mattia Peretti and fellowship program manager Lakshmi Sivadas about the fellowship, the initiative and what JournalismAI’s projects mean for the future of newsrooms.
A global network
The fellowship originated in a series of “Collab Challenges” that the JournalismAI staff held between 2020 and 2021. According to Peretti, the Collab Challenges arose “organically,” with no application process or formal organization for people interested in participating. Plenty of useful AI-based projects were completed during the challenges, many of which are still available online. The following year, the process was formalized and reshaped into the fellowship.
While the JournalismAI initiative focuses on educating journalists unfamiliar with artificial intelligence, the fellowship program goes a step further by developing the skills of journalists already using AI in the newsroom.
“What we can do for them, through the fellowship, is connect them with a global network of people at the same level,” said Peretti. “By getting them to collaborate with each other, we can help them accelerate the adoption and implementation of AI, and show everyone in our community what’s possible.”
Forty-six journalists from 16 countries across six continents were selected for the program. With AI systems already shown to develop racial and gender biases and to racially profile people of color, the JournalismAI staff heavily encouraged diversity when accepting fellows.
“Our idea was that if we bring in people who are representative of major populations around the world, they could recognize the kind of biases that exist in current data sets,” said Sivadas. “Then, in the systems that they are building or developing right now with the fellowship, they would be able to figure out where bias enters the development process, and mitigate that as well.”
Benefits of AI
The main goal of the fellowship is to create AI-powered software that benefits the teams’ newsrooms and newsrooms globally. Unlike OpenAI or Google’s DeepMind, whose research focuses on creating artificial general intelligence (software that functions like an independent human brain), JournalismAI’s projects are all tools that require the input or supervision of human journalists.
Most of these projects aim to assist with one of the three areas of news work that the 2019 JournalismAI report outlined: gathering information, producing content or distributing the finished content to an audience.
Each of these areas holds exciting potential for journalism. Newsgathering AI can identify trends, monitor mentions of issues or events and source information, for example by collecting and citing articles from various news outlets that all discuss the same issue. News production AIs, which handle content creation, can write bullet-pointed articles or reformat stories for different audiences in a fraction of the time it would take a human. Finally, news distribution AIs take input from consumers to make news more impactful: finding likely audiences for an organization’s content, tracking readers’ behavior and personalizing news feeds so readers see what they’re most interested in.
“There is not one single journalism student that decided to take this career path because they were dying to sift through PDF documents day after day,” said Peretti. “That’s something machine learning does very well, and I think we should be excited that we can have the support of software doing all these things for us.”
Some of the mentors for the teams this year include Ines Montani, co-founder and CEO of the software company Explosion; David Caswell, former BBC News Labs product manager; and various members of the Knight Lab at Northwestern University. The mentors fill needs for fact-checking expertise, advanced technical skills and more.
“We didn’t prepare a roster of mentors and tell [the fellows], ‘These are your mentors, work with them,’ because there would have been no point when we didn’t know yet what the teams would want to work on. So we tried to find subject matter experts that could help them for the specific case that they are exploring,” said Peretti. “We start from the needs of our teams.”
Ensuring responsibility
Ten projects are coming out of the fellowship this year. Among them are Attack Detector, which aims to detect hate speech towards journalists and environmental activists in Spanish and Portuguese, and Parrot, which identifies and measures the spread of state-manufactured media. These two, along with all the other projects, will be showcased at the JournalismAI festival in early December.
Peretti said that all of these projects are made with ethical AI use in mind and that none are meant to run without human supervision, adding that unsupervised use would be “extremely dangerous” at this time.
“The word we use again and again is ‘responsible’,” said Peretti. “I’m encouraged by what I’m seeing in the industry and I want to presume that a little bit of that is due to the work we do. But we need to continue to stress [responsible use of AI] if we really want AI to be a force of good for journalism.”
Sivadas believes that AI is becoming more prevalent in global newsrooms and will soon be inescapable. She quoted 2020 Collab Challenge participant Michaëla Cancela as saying, “You can either choose to be a part of the people who are making decisions about how it’s going to be used, or you can sit back and watch it destroy the systems and ethical practices that journalism was built on.”