Home Office Grant (2018-2020) “Connecting Delayed pre-commitment with cyber awareness in order to address the perception gap and present bias”. Acting as PI. Amount awarded: £43,000.
This project, funded under the UK Home Office's ‘Understanding, preventing and responding to cyber-crime’ grant stream, investigated measures to improve cyber-security behaviour in small organisations. It is widely acknowledged that many small organisations do not behave consistently with current National Cyber Security Centre (NCSC) guidance on best cyber practice. Indeed, in research conducted as part of this project we estimated that fewer than a quarter of micro and small businesses implement the five technical controls required by the flagship Cyber Essentials scheme.
The primary aim of our project was to explore the potential for cyber-security health-checks, coupled with a simple behavioural intervention (or ‘nudge’), to address this challenge by improving cyber-security behaviour in small organisations. During the course of the project we offered cyber-security health-checks to small charities and businesses, in collaboration with Cyber Protect officers. The health-checks were delivered by KITC Solutions, a student-led IT consultancy at the University of Kent, and typically lasted one hour, with real-time interaction (in person or remotely) between two consultants and a senior manager, owner or trustee of the organisation. The main objective of the health-check was to provide personalised feedback on current practice and to propose specific, achievable actions the organisation could take to improve its cyber-security. The conversation was framed around the NCSC Small Business/Charity Guide but reflected the organisation's specific needs.
The overriding aim was to empower organisations to develop the awareness and confidence to take control of their cyber-security so that they can make appropriate decisions now and in the future. Numerous cyber-security initiatives have been tried in the past with limited success. It was vital, therefore, to question whether cyber-security health-checks result in behaviour change, and not just increased awareness. It was also vital to ask how many organisations, and what type of organisations, would sign up for a health-check in the first place. During the course of the project we identified a range of barriers that prevent organisations from acting. Foremost among these is avoidance procrastination, magnified by information overload as well as stress and anxiety around ‘what to do’. Consequently, organisations need very simple, tailored advice that can be directly translated into action. This basic principle was encapsulated in the health-checks and also in the behavioural intervention that we tested. In the intervention, organisations were asked to write down, at the end of the health-check, their own plan of what they were going to do over the next three months. This plan for implementing change was to be specific and actionable, and represents a pre-commitment agreement with their future self. This intervention can be delivered independently of any cyber-security health-check.
EPSRC Standard Grant (2017-2020) “EconoMical, PsycHologicAl and Societal Impact of ranSomware (EMPHASIS)” (interdisciplinary, cross-university collaboration team led by E. Boiten, De Montfort University). Amount awarded: £916,000.
This project, funded by the Engineering and Physical Sciences Research Council, focuses on the threat of ransomware. This specific strand of malware has become more prevalent in the last three years, as cybercriminals realised they could easily and quickly cash in by holding citizens, SMEs, banks and critical infrastructure organisations (such as utility companies, police and hospitals) to ransom, often with the threat of data loss or data release (blackmail). At the same time ransomware has undergone a significant evolution, with the threat becoming increasingly complex and powerful while incorporating psychological and sociological tricks to increase the likelihood of victims complying. Such tricks include a countdown timer leading to complete deletion of the victims’ data if they fail to comply, or threats to embarrass the victim by releasing embarrassing information (real or not).
We advanced knowledge and understanding of ransomware along a number of different but complementary dimensions. From an economic point of view, we studied how ransomware works as a business operation, what the critical parameters for its success are, where its weak points lie, and how these can be used to evaluate the associated risks and threat levels, with the eventual aim of fighting these operations or at least limiting their profitability. On profiling, we explored whether the cybercriminals behind the development and exploitation of ransomware, and also their victims, can be profiled. Profiling cybercriminals helps the police and other law enforcement agencies act against them more effectively. Victim profiling helps us better understand which personality, psychological, economic and societal traits predict a greater victimisation risk, and lets us recommend strategies and targeted actions to reduce victims' exposure. We also aimed to influence the perception of citizens, SMEs and other end-users regarding the increasing risks of falling prey to ransomware, and how to provide an adequate response to those risks. To this end, we developed and widely disseminated advice on how to act in the case of a ransomware infection, to fully inform victims about the best and most responsible course of action.
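A toy numerical sketch can make the business-operation framing concrete. The function and every parameter value below are hypothetical illustrations for exposition only, not the project's actual economic model:

```python
# Illustrative sketch of ransomware as a business operation.
# All parameter names and values are hypothetical, chosen only to show
# which levers drive profitability (infection volume, payment rate,
# ransom level, and operating costs).

def expected_profit(infections, pay_rate, ransom, cost_per_infection, fixed_costs):
    """Expected profit = revenue from paying victims minus operating costs."""
    revenue = infections * pay_rate * ransom
    costs = infections * cost_per_infection + fixed_costs
    return revenue - costs

# Hypothetical campaign: 10,000 infections, 5% of victims pay a ransom of 500,
# 0.10 per infection in distribution costs, 5,000 in fixed costs.
print(expected_profit(10_000, 0.05, 500, 0.10, 5_000))  # 244000.0
```

Interventions that lower the payment rate or raise the per-infection cost attack exactly the terms in this expression, which is one way to see why those parameters are natural ‘weak points’ of the operation.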
Defence Human Capability Science & Technology Centre BAE Systems project (2016). Programme “Understanding the Moral Component of Conflict” (with E. Cartwright). Amount awarded: £13,000.
In this project we evaluated the morality of a government's decision to pay a hostage ransom in a conflict environment.
Debate on issues of general foreign and military policy is all too often framed as a choice between morality and strategy. Consider, for instance, debate on the no-concessions policy regarding terrorist ransom demands or that on drone strikes in Syria. Those critical of such policies almost always claim the moral high ground. Proponents, by contrast, focus on strategic arguments. It can come as no surprise that debates played out in this way suggest a clash between morality and strategy. This can be damaging in terms of legitimizing actions within society, the morale of military personnel, and our ability to deter those vulnerable to extremism. But, do we really have to choose between strategy and morality?
In the project we proposed a novel framework with which to evaluate military policy. Two dimensions along which a policy can be judged are distinguished – strategic and moral. On the strategic dimension the focus is on whether a policy will meet the desired objectives. For instance, will a threat not to pay ransom demands be seen as credible by terrorists? This is a judgement that can be informed by game theory. On the moral dimension the focus is on whether a utilitarian perspective of ‘save lives’ or a deontological perspective of ‘do no harm’ is adopted. This is a judgement that can be informed by evidence of choice in moral dilemmas.
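The credibility judgement can be illustrated with a minimal backward-induction sketch. The payoff numbers below are purely hypothetical, chosen only to show the logic of the argument, not figures from the project:

```python
# Backward-induction sketch of the ransom credibility question.
# All payoff values are hypothetical illustrations.

# (terrorist payoff, government payoff) for each outcome.
payoffs = {
    ("take", "pay"): (5, -10),     # ransom paid: terrorists profit; paying is costly and invites repetition
    ("take", "refuse"): (-2, -8),  # hostage crisis with no payout
    ("no_take", None): (0, 0),     # no hostage taken
}

def solve():
    # The government's best response once a hostage has been taken.
    gov = max(("pay", "refuse"), key=lambda a: payoffs[("take", a)][1])
    # Terrorists anticipate that response when deciding whether to act.
    terror = "take" if payoffs[("take", gov)][0] > payoffs[("no_take", None)][0] else "no_take"
    return terror, gov

print(solve())  # ('no_take', 'refuse')
```

With these payoffs, refusing is the government's best response even after a hostage has been taken, so the no-concessions threat is credible and deters hostage-taking; were paying less costly ex post than refusing, the same logic would show the threat to be empty.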
A utilitarian perspective, by its very nature, coincides with the optimal strategy. If, therefore, we adopt a utilitarian perspective there is no trade-off between strategy and morality. This means that any apparent conflict between strategy and morality must result from the adoption of a deontological perspective. And it is clear that those criticizing policy on moral grounds typically adopt a very strong deontological perspective, considering immoral any action that will cause harm. The conflict is not, therefore, between strategy and morality but between differing moral perspectives.
Extensive evidence from studies on moral dilemmas has shown that a utilitarian perspective is favoured by most people. Some, who take a more abstract view of morality, would argue that this does not justify a claim that the utilitarian perspective is the ‘right one’. In terms of justifying military policy, however, it is surely telling that the majority of people take the utilitarian perspective. This suggests that the morality of a policy can and should be judged on whether the policy achieves strategic objectives.
The common framing of debate on military policy as a choice between morality and strategy is misleading and deeply flawed. In this project we demonstrate that strategy and morality can be viewed as complements rather than substitutes; one does not have to choose between them.