Fort Vox: the little “voice box” with a big message
© Dr Lorraine Underwood and element14
What if you could break into the contents of a locked box with only the power of your voice? If that seems a little far-fetched, the reality is that many security systems use “voice biometrics” – the sound of a person’s voice – as a key to unlock valuable content, including our money, healthcare records, and billing accounts. Some banks, for instance, authenticate a caller’s identity by comparing a recording of them saying, “My voice is my password” with a version they hold on file. If the two samples match well enough, the individual passes that verification task, and may even be granted full access to the account. But in this age of deepfakes and voice clones, should we be relying on these kinds of systems? How robust are they? And is it ever too soon to start thinking about AI, security, and privacy?
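The verification step described above – compare a fresh voice sample against the one on file, and accept the caller only if the two match well enough – can be sketched in miniature. Everything in this sketch is illustrative: real systems extract hundreds of acoustic features from the audio itself and use far more sophisticated matching, so the toy feature vectors, the function names, and the 0.85 threshold below are all assumptions, not any bank’s actual method.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def verify_speaker(enrolled, attempt, threshold=0.85):
    """Accept the caller if their sample is similar enough to the enrolled one."""
    return cosine_similarity(enrolled, attempt) >= threshold

# Hypothetical feature vectors standing in for processed voice recordings.
enrolled_voice   = [0.90, 0.10, 0.40, 0.70]
genuine_attempt  = [0.88, 0.12, 0.41, 0.69]  # close to the enrolled voice
impostor_attempt = [0.10, 0.90, 0.80, 0.20]  # very different voice

print(verify_speaker(enrolled_voice, genuine_attempt))   # True  (above threshold)
print(verify_speaker(enrolled_voice, impostor_attempt))  # False (below threshold)
```

The threshold is the interesting part, and it is exactly what Fort Vox players probe: set it too strictly and genuine speakers get locked out; set it too loosely and a good mimic slips through.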
Those questions are the inspiration behind Fort Vox, a playful new prototype developed through a collaboration between Dr Lorraine Underwood and element14 on the one hand, and FACTOR’s forensic linguist Professor Claire Hardaker and forensic speech scientist Dr George Brown on the other.
Fort Vox is small, clever, and deliberately designed to get children and young people excited about how voices, technology, and security really work. Players are presented with a physical box that can only be unlocked by mimicking the voice password well enough. Like an inverse escape room, the prize is quite literally inside, and to get to it, players have to listen carefully, think critically, and experiment with different approaches until they’re able to bypass the voice recognition system. This encourages children to think about ideas that usually sit firmly in the grown-up world: how AI systems make decisions, how data like speech are analysed, and why privacy and security matter when our voices are used as biometrics.
Children are growing up in a swiftly evolving context of voice assistants and automated decision-making, and Fort Vox makes increasingly invisible technologies tangible. For teachers and parents, this aligns naturally with classroom discussions around computing, STEM, citizenship, and digital literacy. Playing doesn’t require specialist knowledge, but for the more adventurous and technically minded, the video and links provide a full set of instructions on how to build your own.
While Fort Vox is intentionally accessible, it also sits within a much bigger picture. The UK faces a well-documented skills gap in cyber security, AI, and digital forensics. The North West, with its growing concentration of universities, tech companies, and public-sector security partners, is increasingly recognised as a vital part of the national cyber landscape. Projects like Fort Vox help inspire and recruit to that future workforce long before university applications are on the table.
By introducing children to speech technology, AI, and security concepts early – and in a way that’s engaging rather than intimidating – Fort Vox contributes to a talent pipeline that starts with curiosity and leads, eventually, to skills that matter nationally. It reflects a broader commitment to developing expertise ethically and responsibly, with privacy and security baked in from the very beginning rather than bolted on later. At Lancaster University, FACTOR researchers working in forensic speech science, forensic linguistics, AI, and cyber security are deeply invested in that long game. Fort Vox speaks the same language: playful on the surface, serious underneath. However, none of this works without thoughtful design, and Fort Vox is a testament to what happens when creative technologists take education seriously.
Dr Lorraine Underwood and the team at element14 have created something that celebrates young people’s intelligence, trusts them with complex ideas, and invites them to explore without fear of getting things “wrong”. That combination of technical rigour, creativity, and openness is exactly what the next generation of security and AI professionals will need.