The Chancellor doesn’t use AI
Betting Britain's future on a technology they've never tried
It was the final question in Rachel Reeves’ conversation with Mumsnet’s Justine Roberts:
Roberts: “Which is your favourite AI chatbot? What do you use of choice?”
Reeves: “I don’t use anything.”
Roberts: “None of the LLMs?”
Reeves: “No.”
The interview comes a month after Reeves’ Mais Lecture, where she called AI “the defining technology of our era” and pledged that Britain would achieve the fastest rate of AI adoption in the G7, backed by a £2.5bn support package.
Reeves isn’t using the same technology she’s staking Britain’s economic future on. It’s like a Health Secretary lauding the NHS while never having been 56th in the GP phone queue.
Liz Kendall, the Science, Innovation and Technology Secretary, was no more reassuring. In the run-up to last week’s launch of the government’s new £500m Sovereign AI Fund, she was interviewed by the BBC’s Matt Chorley.
Riding in the back of an autonomous taxi, she said:
“Uh well, I use AI personally rather than at work. I’ve got to be honest. I’m much more likely to to use it in my my personal life.”
Liz Kendall, Science, Innovation and Technology Secretary
Kendall’s best example was turning to the technology to decipher the ingredients list on the back of a cosmetics bottle while treating an allergic reaction. A good application, in a pinch. Hardly a blueprint for the Fourth Industrial Revolution. An irony Chorley himself couldn’t help picking up on:
“You’re the government advocate for AI and you’re saying that you don’t use it at work.”
Matt Chorley, BBC
There is a counterargument: that AI is cognitively corrosive and by avoiding it, Reeves, Kendall and other government ministers are preserving their independence of thought.
As a personal stance, it is defensible. For a Chancellor who believes AI has the “potential to transform productivity across the entire economy”, I’m not sure it passes muster. The same goes for Kendall, the historian turned innovation chief.
Freedom of information (FOI) exposure is likely the main fear holding politicians back.
This follows the precedent set in the case of Peter Kyle, Kendall’s predecessor, who was the subject of a request filed by the New Scientist. The government initially resisted, but a transcript was eventually released, establishing that these interactions are subject to the same transparency laws as emails or WhatsApp messages when used for government business.
Kyle remains the only politician to have had their AI interaction logs released through FOI. The government has since successfully blocked similar requests by arguing that they would place a “grossly oppressive burden” on staff to find and redact the data.
As I’ve said previously, conversations with AI systems should be afforded the same privilege as speaking with a doctor or lawyer. It is on these grounds that I understand the trepidation those in government feel; they should have the freedom to privately use these systems.
This fear should not outweigh the case for our politicians adopting the technology, especially when it is increasingly the cornerstone of government policy. They cannot escape the fact that AI is already altering how we live and work and, in time, how we govern.
Government advisers will have been tasked with preparing reports, bringing in speakers, and modelling its impact to educate ministers. But no number of briefings, looking from the outside in, can substitute for spending time using the technology.
They are missing what these systems are now becoming: not just models that answer questions, but tools that can search, analyse, write, code and, within bounds, act on a user’s behalf across the web, in software, and through the computer interface itself.
You must use it to understand it.
Working with it, they would also find its edges: where it still hallucinates, where it trips up, and where it simply stops. A Chancellor who has never been cut off by a Claude usage cap has no instinct for what compute scarcity means - and she is the one allocating public funding for how much of it Britain should build.
I find it odd that our political leaders, like Reeves, aren’t the least bit inquisitive. Not only to be personally behind the technology curve, but to have barely a sliver of experience of what it feels like to be on it. It’s even more jarring given their policy statements, let alone how exposed the economy they steward is to its effects.
By insulating themselves from AI, they are avoiding an experience that would make their advocacy more honest. Their words instead ring hollow, doing a disservice to what is being built in the UK. The government’s £500m Sovereign AI Fund. Commitments from Anthropic and OpenAI to expand their UK presence. DeepMind - one of the world’s frontier labs - headquartered twenty minutes from Westminster, choosing London as their base rather than relocating to the Valley.
“We’re a little bit out of it here, but I think it’s very conducive to thinking deeply about things, being more original about how you think.”
Demis Hassabis, CEO of Google’s DeepMind
Hassabis gets to the crux of what could define Britain’s role in the AI transition.
We do not have the financial or operational might of the US, the current epicentre of AI. Our strength must be in our intellectual independence: the confidence to be a little off-centre, to be curious, to have the licence to be contrarian, and to follow a thought without knowing immediately where it leads.
Our political leaders must channel this same disposition, applying it not only to how they govern but also to how they engage with the tools reshaping the country they lead.
Try it. Then speak openly about the experience - the surprises, the limitations, and the moments it changed how you thought about something. Because everyone else is already doing exactly that: finding their way with an intelligence none of us fully understands yet, whether you’re a receptionist in the back office of a Stoke law firm or a researcher on the DeepMind team building it at source.
We are all, in our different ways, figuring this out together. Our politicians should be too.
“Maybe that’s where I’m going wrong.”
Rachel Reeves, Chancellor of the Exchequer