Lancaster professor delivers warning at UK Supreme Court panel


Professor Claire Hardaker (second from right) addresses the panel discussion at the UK Supreme Court. © UK Supreme Court

A Lancaster University professor, renowned for her language detection work, has warned that large language models and deepfake technologies create new routes for misinformation, bias, and strategic manipulation.

Professor of Forensic Linguistics Claire Hardaker was speaking at a panel discussion at the UK Supreme Court on the importance of trust.

She joined the panel to debate ‘Why Trust Matters: Public confidence in the rule of law in an age of AI and populism’.

Chaired by Dr Hannah White, the Director and CEO of the Institute for Government, a leading independent UK think tank, the panel also included the President of the Supreme Court Lord Reed and Ipsos Research Director Daniel Cameron.

From left to right: President of the Supreme Court Lord Reed; Dr Hannah White, Director and CEO of the Institute for Government; Professor Claire Hardaker; and Ipsos Research Director Daniel Cameron.

The discussion examined how artificial intelligence could both strengthen and destabilise democratic institutions.

Drawing on her expertise in deceptive and manipulative language, Professor Hardaker said: “While AI offers efficiency and analytical power, uncritical deployment, particularly in legal and institutional contexts, risks eroding the nuanced human judgement and empathy that underpin justice.”

Her contribution speaks directly to the mission of Lancaster’s Data Science and AI Institute (DSAIL), where she leads the new ‘Integrity’ theme.

This theme addresses how trust, transparency, fairness, and accountability can be sustained in rapidly evolving sociotechnical systems.

By interrogating the gap between technological capability and ethical oversight, said Professor Hardaker, ‘Integrity’ provides a framework for ensuring that AI enhances, rather than corrodes, the fragile trust on which the rule of law depends.

The audience included law students from various universities attending in person, along with 250 people who watched the event live on the Court’s website.
