Social and Technical Objects for Resilience and Cyber-security (STORC)
GCHQ / Lancaster University ACE-CSR Studentship Grant
This project is concerned less with the strategies of attackers and more with the actions of benign agents that undermine the risk controls intended to forestall those attackers. This is (perhaps) a neglected area. At some point, all technical systems depend for their security on a social system. They rely on acceptable use policies, authorisation levels, user roles, passwords, rules about choosing, protecting and (not) sharing passwords, rules about non-disclosure, rules against creating fraudulent data, rules against leaving systems logged on, rules for reporting suspicious activity, and so on. These are all social objects, and they function - as rules, for instance - only because there is some collective intention that they function as such. This collective intention sometimes fails, typically for reasons that are socially adaptive. A group of users may agree to leave a terminal logged on, for example, when their work requires urgent responses and logging back on is time-consuming. A group of organisations may have to suspend rules about disclosure when they need to act in a crisis.
This PhD project is designed to explore how we can reason about joint technical and social controls, and thereby 1) make an integrated assessment of risk to the socio-technical system as a whole, and 2) design the system as a whole in a self-consistent, plausible way.
Since the 1990s, ideas about distributed cognition have helped us understand how information processing in distributed systems is a joint accomplishment of technical and social components. We have also come to understand how those developing the technical components often resort to social objects to protect the system from social activity that cannot otherwise be precluded, and how such social objects commonly fail in this role. The aim of the project would be to build on this idea to produce a complete formalism, together with a series of mechanisms for using it in the design and configuration of a secure system. In particular, we might want to develop mechanisms that can turn changes to general security strategies into changes to specific policies and social objects, or that can learn from collective behaviour about the effectiveness of social objects and make corresponding adaptations to technical controls.
The target case study or area of application will be SCADA (Supervisory Control and Data Acquisition) systems, or critical infrastructures such as interdependent systems of SCADA and telecommunications (or Internet) networks in which cascaded failures occur. Such systems contain a mix of technical and social artefacts and actors that makes them a highly suitable vehicle for the proposed research. One of the key elements of the work will be to determine what balance of technical and social remediation steps to apply when particular challenges arise in such systems.
Formal applications should be made via the Postgraduate Applications Portal, MyLancaster, where further details can be found.
Informal enquiries regarding the application process may be made by email to the School of Computing and Communications (SCC) PhD applications team.