Journalist Exposes AI-Generated Content Scandal at Cody Enterprise
Brief news summary
Wyoming reporter CJ Baker of the Powell Tribune noticed something was off when he read quotes from Wyoming's governor and a local prosecutor that seemed robotic and artificial. He later discovered that a journalist at the Cody Enterprise, a competing outlet founded in 1899, had been using generative artificial intelligence (AI) to help write his stories, and that the paper had published articles with AI-generated content, including fake quotes. Baker met with the reporter who had used AI and with the newspaper's editor, who apologized for the incident and promised it would not happen again. While AI has found a role in journalism by automating certain tasks, chatbots can produce plausible yet spurious articles with minimal input, raising concerns about the technology's pitfalls and dangers to the industry. Many newsrooms, including The Associated Press, do not allow their staff to use generative AI to create publishable content, and transparency about its use is crucial to maintaining journalistic integrity.
The Associated Press, for example, uses AI for tasks such as translating stories and writing sports reports, but its journalists are not permitted to use it to create publishable content. Transparency about the use of AI is essential, as demonstrated by Sports Illustrated's earlier controversy, in which AI-generated product reviews were presented as the work of non-existent reporters. The Cody Enterprise has since implemented a system to identify AI-generated stories and is working on a policy governing their use.