Funding Calls
Centre for Research and Evidence on Security Threats: 2021 Commissioning Call - Closing date Wednesday 5th May 2021, 16:00
ISCF Digital Security by Design - Business-led demonstrators phase 1 EOI - Closing date Wednesday 26th May 2021, 11:00
Security Lancaster is pleased to announce the first annual Outstanding Security Research Award. The award is for students undertaking a security and protection science-focused individual project as part of their studies. This work may be multidisciplinary or within a single discipline, but should demonstrate excellence and originality in the identified field. Deadlines: Undergraduate: 28th June; Postgraduate: 1st November; PhD: 1st November.
Welcome to Security Lancaster
The SSS-H (Socio-Technical Systems Security Hub), supported by CREST and ACE-CSR, targets "useful and usable" cyber security with a unique data-centric, socio-technical "systems" perspective.
We research societal, behavioural and sociological threats, legal aspects, software, distributed systems, networks and digital forensics across varied domains. Our emergent research targets secure machine learning, technology-invariant data integrity, and quantifiable/composable security, alongside TIDE-H (Threat Intelligence Data Exchange Hub), a community-accessible threat data repository.
Lancaster University recognised for excellence in cyber security education
Lancaster University is one of the first eight universities in the UK to gain recognition for their commitment to cyber security education under a new initiative from the National Cyber Security Centre (NCSC).
UK hub for research into security threats awarded £5.3m funding
The Centre for Research and Evidence on Security Threats (CREST) has been awarded £5.3m by the Economic and Social Research Council (ESRC), part of UK Research and Innovation, to produce new behavioural and social science research into security threats to the UK.
£3M awarded to interdisciplinary project to improve socio-technical resilience and trustworthiness of autonomous systems
A project that will improve the ability of autonomous systems to reason about the impact of their decisions and actions on technical and social requirements and rules has been awarded almost £3M from the UKRI Trustworthy Autonomous Systems (TAS) programme.