These aren’t the systems you’re looking for: SCC academics developing new way to protect industrial control systems


A programmable logic controller

Dr Sam Maesschalck, Professor Nick Race, and Dr Vasileios Giotsas of the School of Computing and Communications, alongside PhD student Will Fantom, have recently published a paper showcasing how components of an industrial control system (ICS) can be reprogrammed to mimic a poorly made decoy, known as a “honeypot”, in order to protect the systems used to control industrial processes in industries such as nuclear, water, and electricity.

Honeypots are widely used within the security world, and are traditionally deployed to attract attackers. They collect valuable threat intelligence for security services whilst distracting attackers from the system they initially targeted. To lure potential attackers towards them, honeypots need to look and behave as realistically as possible, such that they are indistinguishable from real systems. This can be a significant challenge, as in recent years skilled attackers have become adept at fingerprinting and avoiding honeypots.

But what if we didn’t have to do this? The team at Lancaster have taken a fundamentally different approach to this challenge. Rather than investing a great deal of time and resources into creating a believable honeypot that mimics a real system, they instead use the honeypot as a deterrent. By making the real industrial control system appear to be a crude honeypot, it becomes a less attractive target, steering potential attackers away from interacting with it.

To test the effectiveness of the disguise, Dr Maesschalck and the team ran a range of penetration tests, as well as inviting cyber security experts to attempt to infiltrate their test system. They found that the disguised system appeared to the penetration tests to be a single system (as opposed to one system cloaked by another), and that the honeypot disguise didn’t impede the ICS from carrying out its usual tasks. When the experts attempted to infiltrate the system, they were unable to identify that two systems were running, and were reluctant to attempt an attack, since the system appeared to them to be a honeypot.

Speaking on the findings of this study, Dr Maesschalck remarked: “Cyber security must constantly evolve, especially with honeypots, which need continuous improvement to effectively lure attackers without deterring them. The challenge lies in deploying honeypots convincingly, a task that usually demands significant resources and expertise, since obvious honeypots can deter attackers, diminishing their utility. Our approach innovates by making real systems appear as honeypots, thus deterring attackers without the extensive resources typically required for realistic honeypots. This strategy redefines honeypot use and highlights the importance of psychological tactics in cyber security, aiming to discourage attackers by manipulating their perceptions.”

Professor Race added: “Our research has shown how, through obfuscation, we can reduce an attacker’s appetite to interact with real systems. In transforming honeypots from threat intelligence collectors into active participants in the defence of operational systems, the research highlights the potential role that honeypots and other deception techniques can play in helping to keep critical infrastructure safe.”
