Samuel Len, head of AI Contents Team 2

Not long ago, a newsroom sounded like a kind of music — clattering keyboards, ringing phones, arguments over a lead. Today, in many digital outlets, that noise has faded to the hum of servers as large language models churn out articles with clinical efficiency.

We are witnessing, in real time, the industrialization of the written word.

Artificial intelligence (AI) is no longer confined to transcription or spell-checking. It has become a central architect of what might be called the commodity news ecosystem. As algorithms claim the terrain of the factual — the “what” and the “when” — a deeper question presses in: What becomes of journalism when its pulse is automated, and what, if anything, remains the province of the human reporter?

For years, organizations like The Associated Press and Reuters have used automation to cover corporate earnings, turning structured financial data into publishable briefs in seconds. But generative AI has expanded the scope from data processing to narrative construction. A 2024 study by the Reuters Institute for the Study of Journalism found that more than half of newsroom leaders were already deploying AI tools to improve efficiency, whether in drafting, editing or headline testing.

The benefits, at least on paper, are substantial. AI systems can scan vast troves of documents in seconds, flagging anomalies or patterns that might take a human reporter days to uncover. They can transcribe interviews with near-perfect accuracy, translate across languages almost instantly and personalize content for different audiences. In a media economy battered by shrinking advertising revenues, such tools promise to reduce costs while expanding output.

Yet efficiency, as a guiding principle, has its limits. When outlets like CNET and Sports Illustrated experimented with AI-generated stories in 2023 and 2024, the results were uneven. Some articles were serviceable; others contained fabricated details, the confident falsehoods known as “hallucinations.” What they lacked was not merely accuracy, but texture — the contextual judgment and cultural fluency that keep a story from reading like an annotated spreadsheet.

The reason is structural. These systems are trained on the past, predicting the next word based on statistical patterns in existing text. They are, by design, conservative. They do not discover so much as recombine. They cannot knock on a door, cultivate a source or recognize the significance of a silence in an interview.

If journalism is to endure as something more than automated summary, it may need to cede certain ground. The routine — earnings reports, weather updates, sports recaps — can be handled more quickly and cheaply by machines. The human advantage lies elsewhere, in the more elusive domains of “why” and “how.”

The survival of the craft rests upon three uniquely human pillars: skepticism, empathy and physical presence. While an algorithm can efficiently mine a public record, it remains incapable of the delicate persuasion required to interview a whistleblower immobilized by fear.

Source: Korea Times News