
I believe communication is not about noise but about unlocking business outcomes. Looking at the shiny new toy that has become so deeply integrated into our day-to-day routines, it is imperative that we think carefully about using Artificial Intelligence (AI) to make our work more robust for business impact, while remaining thoughtful about what we must not delegate to AI if we are to stay true and relevant to our calling. The key is to remember that trust is a long game: moving up or down the trust spectrum has long-lasting impact, and trust lost takes a long time to rebuild. I view AI in PR with optimism, but also with a clear-eyed understanding of its limits and the responsibilities it places on us.
Research: Ask better questions & make your own decisions
AI has already become indispensable in research. What took hours or even days of scanning analyst notes, media trends, policy documents and the competitor landscape can now be synthesized within minutes with the right prompt. But the key to effectiveness lies in the questions, not the tool. AI can pull up the information; it cannot tell you what matters. Nor can it place the data in the right context to accurately read sentiment shifts. For instance, while researching for a startup client, I was tempted to ask the tool about positioning. It gave some good ideas, then suggested positioning to investors as a “Byju’s for x category”! Sometimes the gaffe is not even this obvious. So the key is to keep asking for data and more specific facts to form the insights, because while AI helps us see faster, we must still decide what drives outcomes.
Monitoring & Measurement: The power of early patterns and the discipline of interpretation
This is clearly where we are seeing the biggest shift: AI tools today allow us to read millions of conversations across languages, platforms and formats and find patterns quickly. But the communicator’s role is not just to observe patterns; it is to interpret what they mean for each stakeholder: investors, customers, employees, regulators and more. AI cannot sift through that complexity or decide which stakeholder concern must be prioritized, which spike matters and which is just raucous noise. We have all been burned by social media enough to learn the importance of knowing when to stay silent even in the face of massive trolling. Sometimes an issue may not need a social media response at all, just a quiet word with the regulators. This judgement about when to intervene and which stakeholder needs reassurance is what differentiates human communicators from AI tools. Even when we use AI to sharpen our pitches or tighten the language, it is the human mind that understands people, power dynamics and geopolitical complexity.
Content & Leadership Voice: AI can support clarity, not conviction
There is also a growing use case for AI in leadership communication and content creation. Many CXOs (or their comms teams) use it to draft talking points for a media announcement or an investor meeting, or even to simulate responses for a tough internal conversation. But personal voice and lived experience cannot be algorithmized.
Moreover, AI-generated content runs the risk of sameness: interchangeable narratives and summarized scripts. Without personal stories to make it real, these messages remain shallow, and it is hard to be authentic through algorithmic fluency. I am already bored with how many streaming thrillers default to the cold-open template (often now morphed into a cold-open murder scene), making it an increasingly predictable hook.
Guarding against hallucinations, fact-checking and the discipline of corroboration
Yes, AI will hallucinate with full confidence. It creates quotes, dates, numbers and sources that sound plausible but are not real. In PR, where credibility and reputation are at the heart of what we do, accuracy is not optional. When using AI for research, I typically use more than one tool to cross-corroborate, and I insist on citations for all data and facts.
Moreover, AI carries forward many existing biases around gender, region, class, language and representation. Communicators must therefore scrutinize all output for implicit (or sometimes explicit) bias and audit the tonality.
I am also convinced that, as reputation custodians, we must work with stakeholders on stringent disclosure norms. There are hard questions we must face: when must brands reveal that content is AI-assisted? Should agencies disclose to clients when AI shaped a narrative? There are no universal rules yet, but I would strongly recommend over-indexing on transparency.
And finally, deepfakes and misinformation are a reputational threat every organization must plan for. A fake video, a fabricated quote or a manipulated image can trend before you even issue a statement. Response-playbook readiness has never been more important. In India, this balance becomes even more critical: given our multilingual reality, vast digital diversity and the speed at which misinformation spreads, responsible AI use, rather than just enthusiastic adoption, is mandatory.
The Limits of AI: What it will not replace
In a way, AI may increasingly become the touchstone that separates true practitioners from those treading water. At the heart of PR is understanding the business, recognizing the key audiences and the most relevant messages, and using these to build the relationships and trust that influence business outcomes. AI, at least as we know it now, cannot replace this real craft of PR, from building relationships to decoding cultural nuances or reading boardroom power dynamics. Screwball humour, a new and unexpected way of telling a story, a sharp reading of the room: these keep us anchored in what is human, and they are what AI cannot replace.
The future may be AI-enabled to scale for complexity, but organizations must build AI-readiness, for agencies as much as for clients, with clear governance policies and team training to guard against security and privacy concerns. AI will get faster and smarter. But the heart of communication remains deeply, stubbornly human: trust, credibility, empathy, context.

