By Sunil Saxena
Over the past 45 years, I have watched the media industry reinvent itself.
First came the death of Linotype. Then cold type and bromides arrived, only to be quickly displaced by desktop publishing. The birth of the internet and its rapid adoption, followed by the rise of social media in the 21st century, changed the way journalists and journalism worked.
Each wave reshaped workflows. Some cost jobs. But none of them fundamentally threatened the newsroom’s editorial authority.
The Key Difference: AI is different. It is not simply a better tool. It is a force that can quietly erode the human judgment sitting at the very heart of journalism.
The most uncomfortable risk is not that AI will publish errors at scale. The bigger risk is intellectual laziness. When journalists begin to rely on AI to locate stories, research topics, and polish copy, they may become more productive on paper while quietly losing the instincts that make them good at their craft.
We must understand that no matter how capable AI becomes at summarising, translating, writing headlines, or optimising content, the decision about what matters and why it matters cannot be delegated to an algorithm.
Values that Matter: When it comes to news, human judgment, integrity, and ethics must remain supreme. These three values must take precedence over all other considerations in any AI adoption framework.
Yet, as I understand it, newsrooms are rushing towards AI without putting the necessary guardrails in place. The danger is not only at the enterprise level, where common tools and workflows are adopted across teams.
The greater risk lies at the individual level, where employees use web-based AI tools outside the editors’ oversight, often sharing sensitive information in the process.
The Revenue Threat: We must also remember that the threat to media bottom lines is already arriving through a different door. AI chatbots are replacing search engines as the first stop for readers seeking information, and that shift is hitting website traffic directly.
Unlike social media, which eventually became a partner by sending organic traffic back to publishers, AI overviews offer neither traffic nor revenue.
Media houses must come together to license their content, because this is not a disruption they can individually absorb. It is an existential question for the industry.
I firmly believe journalism is fundamentally a value-driven profession. Tools may change, but values must remain the same.
Real Risk: At times, I feel the real danger is not that AI will degrade journalism from the outside. It is that journalists may surrender their unique voice by using AI blindly, without the discipline to know when to trust their own judgment instead.
What do you think? Can newsrooms protect their editorial integrity while embracing AI, or are we heading towards a future in which AI quietly replaces human judgment?

