AP Issues AI Guidelines for News Reporting and Content Creation
Brief news summary
The Associated Press (AP) has established stringent policies against AI-generated content while promoting AI literacy among its staff. This initiative reflects a larger movement among news organizations to ensure responsible AI use, highlighted by a specialized chapter on AI reporting in the AP Stylebook and a comprehensive glossary. Amanda Barrett, AP's vice president of news standards, emphasizes the importance of cautious experimentation paired with safeguards. Given the rapid evolution of AI technology, the Poynter Institute urges media outlets to clearly articulate their AI usage policies. Although generative AI can produce a wide range of content, it frequently struggles to distinguish between fact and fiction. As a result, AP applies rigorous scrutiny to AI-generated materials, permitting their use only when it meaningfully enhances a story. Other outlets, such as Wired and Insider, publish only content created by journalists to preserve accuracy and trustworthiness. While AI can support non-publishing tasks, AP's guidelines reaffirm its commitment to journalistic integrity amid ongoing worries about job security in the changing media environment.

The Associated Press (AP) has released guidelines regarding artificial intelligence, stating that AI tools must not be used to produce publishable content and images for the news agency, while also encouraging employees to familiarize themselves with the technology. AP is among a select group of news organizations beginning to establish protocols for incorporating rapidly evolving technological tools like ChatGPT into their operations. On Thursday, the service will launch a chapter in its influential Stylebook, which will guide journalists on how to cover these developments and include a glossary of relevant terminology. "We aim to provide a solid understanding of how we can experiment with this technology safely," remarked Amanda Barrett, vice president of news standards and inclusion at AP.
In a statement regarding what it calls a "transformational moment," the Poynter Institute has urged news organizations to develop standards for AI usage this spring and to communicate these policies to their audiences. Generative AI can produce text, images, audio, and video on demand, but it is not yet fully capable of differentiating between truth and fabrication. Consequently, AP stated that any material generated by artificial intelligence requires thorough vetting, akin to content from other news sources. Additionally, AP emphasized that AI-generated photos, videos, or audio pieces should not be used unless they are the focus of a story. This stance aligns with tech magazine Wired, which has declared that it does not publish AI-generated articles "unless the fact that it is AI-generated is the crux of the story."

"All stories must be entirely authored by you," stated Nicholas Carlson, editor-in-chief of Insider, in a communication to staff that was made public. "You bear the responsibility for the accuracy, fairness, originality, and quality of every word in your articles." Prominent instances of AI-generated "hallucinations," where incorrect facts are fabricated, underscore the necessity for consumers to be assured that standards exist to "ensure the content they read, watch, and listen to is verified, credible, and as fair as possible," Poynter remarked in an editorial. News organizations are exploring ways in which generative AI can be beneficial without leading to publication. For instance, AP could use AI to compile digests of ongoing stories for its subscribers.
Additionally, it might assist editors in crafting headlines or generating story ideas, according to Wired. Carlson suggested that AI could offer possible edits for improved clarity and readability or generate interview questions. AP has been experimenting with simpler AI applications for a decade, using them to produce concise news articles from sports scores or corporate earnings reports. "This experience is invaluable," Barrett acknowledged, "but we are committed to proceeding with caution in this new phase to safeguard our journalism and maintain our credibility."

Recently, OpenAI, the creator of ChatGPT, and the Associated Press signed a deal allowing the AI firm to license AP's archive of news stories for training purposes. There are concerns among news organizations about their content being used by AI companies without proper authorization or compensation. The News Media Alliance, representing hundreds of publishers, has issued a set of principles aimed at safeguarding its members' intellectual property rights. Some journalists worry that artificial intelligence could ultimately replace human jobs, a significant concern during ongoing contract discussions between AP and its union, the News Media Guild. Vin Cherwoo, the union's president, noted that the union has not yet fully assessed the implications of these developments. "There are provisions we find encouraging, but we have questions regarding others," Cherwoo stated. With proper safeguards established, AP encourages its journalists to learn about the technology, understanding that they will need to report on these issues in the future, Barrett remarked.