March 6, 2023 | Artificial Intelligence, Registration
ISOJ panelists to explore use and impact of generative artificial intelligence on journalism
For journalists, now is the time to experiment with generative artificial intelligence. Gain first-hand experience, determine the possibilities, find the guardrails.
That’s according to Marc Lavallee, director of technology product and strategy for the journalism program at the Knight Foundation.
“A big part of what’s happened in the last couple of months is that there is now this massive opportunity for hands-on experimentation,” he said. “These underlying AI models have existed for some period of time, but the idea that anybody can go to a website and put words into a box and get a response has really opened the doors to learning and experimentation.”
However, Lavallee clarified that this experimentation should be safe and low-risk.
“This is not a call to publish a bunch of semi-garbled prose from these tools. There are a bunch of other low-risk use cases to explore like summarizing text and suggesting tweets,” he said. “Journalists can use this as a kind of ‘co-pilot.’ The creators of these tools don’t know all the ways they can be useful so we are in this period of massive co-discovery. We get to figure out how they’re useful to us.”
Lavallee and an esteemed panel will explore the possibilities of artificial intelligence tools for journalism in the opening panel at the 24th International Symposium on Online Journalism (ISOJ).
“How can journalism incorporate AI, including generative tools like ChatGPT and Bard, to improve production and distribution of news?” will take place on April 14, 2023, virtually and in person at the University of Texas at Austin. Registration is now open for ISOJ 2023.
Lavallee will moderate a discussion with Jeremy Gilbert, Knight Chair in Digital Media Strategy for Northwestern University’s Medill School; Sam Han, director of AI/ML and Zeus Technology at The Washington Post; Aimee Rinehart, program manager for the Associated Press’ Local News AI initiative; and Sisi Wei, editor-in-chief at The Markup.
So, what is generative AI anyway?
To find an answer, I decided to go to one of the sources: OpenAI’s ChatGPT.
“Generative artificial intelligence (AI) is a type of machine learning that enables computers to create new content or generate new information, such as text, images, or audio…
Generative AI typically involves the use of deep neural networks, which are algorithms that are modeled after the structure of the human brain. These networks are trained on large amounts of data and are capable of learning complex patterns and relationships within that data. Once trained, they can be used to generate new content by making predictions based on the patterns they have learned.”
People are using generative AI to write articles, essays or social media copy, and to generate images or videos.
“I’m really interested in the creative ways journalists use generative AI to help us investigate different (and other) types of technology,” Wei said.
For tools like ChatGPT, Google’s Bard, or Microsoft’s AI-powered Bing, users input text and the tool will respond in a conversational format. Users can give feedback, which will help to refine the tool.
Then there’s OpenAI’s DALL-E, in which a user types a prompt and the tool will generate an image based on that text.
“New generative AI tools give journalists the opportunity and the obligation to rethink how we approach our profession,” Gilbert said. “For too long journalists have described our work as ‘mass media.’ Why? Because that’s what the technology allows. Generative AI should enable us to pivot away from ‘one to many’ and toward ‘one to one, for many.’ This new model will let us serve each news consumer with the customized information they need at the time and in the form they need.”
Han also sees opportunities in this kind of news personalization.
“Generative AI allows articles to be summarized differently based on user segments, preferences, location, device, etc.,” he said. “Generative AI can generate different article summaries based on the user’s familiarity with the topic of an article. For example, for an article about conflict in Ukraine, a summary for a reader with good knowledge of the event will be different from that of readers with less exposure to the topic.”
Han, who is an artificial intelligence and machine learning practitioner, also sees opportunities for user engagement and translation of articles.
“This will truly open up international audiences for journalists,” he said.
Research by Rinehart and colleague Ernest Kung points to the smallest newsrooms as the biggest beneficiaries of generative AI.
“A tool like ChatGPT could help these small newsrooms produce marketing materials, social media posts and summarizations, just to name a few benefits,” she explained. “Of course, none of these items should be published without review, but just having that half-set of extra hands could be beneficial to very small newsrooms.”
However, generative AI comes with its own limitations and concerns, as it will tell you itself.
“As with any new technology, there are also ethical and social considerations that need to be taken into account, such as the potential for AI-generated content to be used for misinformation or to replace human creativity and labor,” ChatGPT told me.
Whether it’s professors or journalists sounding alarm bells about plagiarism, or researchers warning about the spread of disinformation, there are real questions about what the use of generative technologies might mean for communication.
“There are loads of risks and caveats involved with using tools that are powered by AI,” Rinehart said. “At the top of the list is the black box of the AI itself – even the developer teams on these technologies cannot explain how the AI arrives at the conclusions or renderings that it does. That is a big problem for journalism.”
“For now, the best use of these tools is for sorting, summarizing, translation, A/B headline testing and the like, with always a human-in-the-loop before publishing anything,” she added.
And as with many other technological advances, generative AI has some journalists worrying about the impact on their own jobs.
“The best thing for any individual to do, and I think we need to really think through this as an industry, is really slicing, in a very fine-tuned way, the difference between things that are skills worth continuing to develop and things that are skills worth finding ways to delegate,” Lavallee said.
Think reaching for a calculator instead of attempting long division, or using your phone to remember your friends’ numbers.
But when it comes to making sense of all the information we’re being bombarded with, journalists still play a necessary role.
“It’s a ton of change, but when it comes to overall jobs gained or lost, I’m not sold on the idea that this is going to be yet another catalyst for decline in the number of people doing this work,” Lavallee said. “If anything, I could see it going up because what I think we’re starting to see is the idea that a journalistic process applied to information has a value to consumers, I think is a widening phenomenon.”
For now, those involved in the tech side of the industry are encouraging journalists to experiment and learn as much as they can about the tools.
“Generative AI is only going to improve,” Rinehart said. “So, it’s incumbent upon all newsrooms that want to be here in three years’ time to understand how the technology works and determine what to use and what to avoid.”