Artificial intelligence tools are increasingly being used to generate planning objection letters. From a Chartered Town Planner’s perspective, the results are concerning — for objectors, for councils, and for the integrity of the planning system itself.
Over the past two years, a new category of planning representation has emerged on council portals across the UK: letters that are lengthy, grammatically polished, and dense with apparent policy references, but that on closer inspection reveal a troubling pattern. The case law cited does not exist. The policies quoted are from the wrong local authority. The site-specific analysis bears no relationship to the actual proposal.
These are AI-generated planning objections, and they are now appearing in significant volumes. Research published in early 2026 found that nearly nine in ten local planning authorities reported receiving representations they believed to be AI-authored, with the majority saying the problem is growing. In one widely reported case, over five thousand objections were submitted against a single scheme — many near-identical, many raising arguments that had no connection to the site or the application before the council.
Several commercial services have emerged that offer to generate objection letters using large language models for fees as low as £45. The pitch is appealing: a professional-sounding letter in minutes, at a fraction of the cost of instructing a qualified planner. But the product these services deliver is fundamentally different from a professionally researched objection — and the consequences of relying on one can be serious.
A planning objection is not a piece of persuasive writing. It is a technical document that must do something very specific: demonstrate that a proposed development conflicts with the policies of the adopted development plan and would cause material planning harm. Doing this requires research that AI tools are simply not equipped to carry out.
The result is a letter that reads well on the surface but collapses under any professional scrutiny. Planning officers recognise these submissions immediately. They add to the officer’s workload — because every representation must be read and assessed — but they contribute nothing to the material planning analysis of the case.
AI language models do not understand truth. They predict the next plausible word in a sequence based on patterns in their training data. When asked to cite planning policy or case law, they produce text that looks like a real reference but may be entirely fabricated.
We have reviewed AI-generated objection letters submitted to councils where we were also acting on the same case. The pattern is consistent: fabricated case law, policies quoted from the wrong local authority, and site analysis that bears no relationship to the application actually before the council.
A planning officer who identifies fabricated references will not simply ignore them. The letter’s credibility is destroyed. Every argument in it — including any that might have been valid — is tainted by the false material. The objector has spent money and time on a document that actively undermines their case.
The Planning Inspectorate (PINS) has taken a clear position on the use of AI in planning casework. In guidance published on GOV.UK, PINS now requires that any party using AI to create or substantially change any part of a submission must disclose this. The disclosure must identify the AI tool used, what it was used for, and what verification steps were taken to ensure accuracy.
The guidance contains a direct warning: the Inspectorate states that improper use of AI could be treated as unreasonable behaviour — and that parties who submit AI-generated material that adds unnecessary burden to a case are at risk of a costs award.
The Inspectorate’s position is clear: AI can be used as a tool to assist with formatting, translation, or accessibility. But the person submitting the evidence bears full responsibility for its accuracy. The “golden rule,” as PINS describes it, is that you must use AI responsibly and ensure that everything it generates is accurate and appropriate.
In one of the most striking examples to date, Orkney Islands Council launched an investigation after fake AI-generated objection letters were submitted against a proposed hotel development in Kirkwall. The letters used the names and addresses of real local residents and businesses — without their knowledge or consent. The developer recognised the impersonated individuals personally and alerted the council.
The council removed the fraudulent submissions from its planning portal and opened an investigation. The incident is believed to be among the first of its kind since the introduction of digital planning portals, and it raised serious questions about identity verification, the integrity of online consultation, and the potential for AI tools to be weaponised against legitimate development.
While this is an extreme case, it illustrates a broader point: AI-generated objections do not just risk being ineffective. They risk actively harming the planning process and the people who depend on it.
Planning officers read objection letters for a living. They are trained to assess representations against the development plan and to identify which arguments raise genuine material planning considerations. AI-generated letters share characteristics an experienced officer recognises immediately: unusual length, polished but generic prose, dense references to policies that turn out to be misquoted or misattributed, and arguments that could apply to any development anywhere.
The irony is that a shorter, rougher, genuinely personal letter from a neighbour who describes the actual impact on their actual home carries more weight than a five-page AI essay that could have been written about any development anywhere.
An effective planning objection is the product of research, professional knowledge, and site-specific analysis. It cannot be generated in minutes because the work that underpins it — reading the application, identifying the policy conflicts, assessing the site context, researching comparable decisions — takes time and expertise.
When Planning Voice prepares an objection letter, the work is done by hand: reading the application documents in full, identifying specific conflicts with the adopted development plan, assessing the site and its context, and researching comparable decisions that support the arguments made.
The result is a letter that the planning officer must engage with substantively — because it speaks their professional language, references the correct policies, and demonstrates specific, evidence-based harm.
None of this means AI has no role in planning. AI tools can be genuinely useful for tasks like translating documents, improving the accessibility of written material, summarising lengthy technical reports, or helping people articulate concerns they struggle to put into words. The Planning Inspectorate recognises these legitimate uses.
The problem arises when AI is asked to do the planner’s job — to research policy, assess harm, cite case law, and produce a technical document that will be relied upon in a quasi-judicial decision-making process. That is professional work that requires professional judgement, accountability, and accuracy. An AI model has none of these things.
If you use AI to help draft your personal comments on an application — to organise your thoughts or improve your phrasing — that is a reasonable use, provided the underlying concerns are your own and the facts are accurate. But if you are relying on an AI service to produce a standalone objection letter that you submit as your representation, you should understand what you are actually getting: a document that looks professional but has not been professionally prepared.
Planning objections succeed or fail on the strength of their planning arguments — not on how polished the prose sounds. A well-researched letter grounded in the correct Local Plan policies, supported by real evidence and genuine case law, written by someone who has actually read the application and understands the site, will always outperform an AI-generated letter that has done none of these things.
If your home, your neighbourhood, or your quality of life is at stake, the question is not whether you can get a cheaper letter from an AI tool. The question is whether you can afford to submit one that does not work.
Planning Voice prepares every objection letter through individual research by a Chartered Town Planner (MRTPI). No AI-generated text. No fabricated references. No risk of costs awards. Just thorough, policy-based arguments that councils take seriously.
Get Free Assessment → or call 01157 365085