When I introduced the topic of Developer Centred Security to our Research Institute for the Science of Cyber Security (RISCS) community at the end of last year, I began by recounting a fascinating talk that I had recently heard at the O’Reilly Security Conference in Amsterdam: “My Heart Depends on Your Code” by Marie Moe (@MarieGMoe).
Marie, a security researcher based at SINTEF in Norway, had talked about her personal experiences of having a pacemaker fitted, and her research to fully understand its cyber security vulnerabilities. It gave a unique insight into the responsibility that software developers have for making things like pacemakers secure. It was also a reminder of the crucial sociotechnical aspects, such as understanding all the users involved in the system (the patient, doctors, developers, pacemaker vendors etc.), and how they interact with both the technology and each other.
As more and more people's lives and well-being depend on code, and by extension on the people creating that code, we need to have confidence that it's secure enough. As Professors Matthew Green and Matthew Smith point out in their paper 'Developers Are Not The Enemy! The need for usable security APIs', current research almost entirely discounts the fact that administrators and software developers also make mistakes, and need help just as much as end-users (if not more).
Critically, mistakes made by administrators or developers endanger all those who rely on their work. On the flip side, a developer who is good at what they do and understands the potential risks and benefits the implementation of their code brings to the business, can often provide the best form of assurance an organisation can have - assurance that's built-in from the outset (or secure by default).
Despite the stereotype, there is growing evidence that developers want to consider security in their day-to-day work, but there are still barriers that hinder them. The NCSC wants to discover what these barriers are, so that we (and industry) can take the right steps to make cyber security decision making easier for developers.
Funding for research - a chance to collaborate
The NCSC is providing £0.5m of funding for academic research in the RISCS community to find out what these hindrances are, what currently works well and what can be done to support developers better. The EPSRC is also providing £1m of research funding to a project led by Professor Awais Rashid, involving an interdisciplinary team at Lancaster University, the Open University and the University of Exeter, who are part of the RISCS community. The project aims to understand the security implications of developers' behaviours and practices, and to develop effective support for secure software development.
These two pieces of joined-up research will have much more effect if they can draw on support from industry partners. CyberInvest enables academia and industry to collaborate to develop pragmatic, real-world approaches.
Understanding the problem
At a workshop we held in November with academia, government and industry colleagues, we discussed the following topics to better understand the developer community and profession:
- Awareness, skills and knowledge
- Improving tools to help developers
- Providing better support for developers
- Motivating developers
I’ll explore some of the key points that arose from these discussions below.
Awareness, skills and knowledge
The number of developers has increased significantly over the last 5-10 years. The barrier to entry has lowered, and we now see a significant number of enthusiastic amateur developers building apps and websites in their spare time. This diversity is fantastic, but there is a shortage of developers entering the profession with cyber security expertise. We’d like to understand what the current developer demographic looks like, the variety and nature of competing demands on their time, and the current levels of security awareness.
We'd also like to discover how they are developing their skills. Not just the training and professional development initiatives available, but day-to-day things like the conversations they have in the workplace, how they learn most effectively, and where they turn for information. How does security become routine? Knowing this will help us identify the optimum intervention points for awareness initiatives, education and knowledge sharing. It will also tell us the format to adopt in order for messages to sink in and stick.
Improving tools to help developers
We all choose tools to help us based on a small set of criteria: functionality, usability and cost. Not necessarily 'how secure they are', or 'how they help to make things secure'. We need tools that developers want to use and that make good security choices easier. Security and usability don't have to be a trade-off. By exploring the characteristics of tools that developers enjoy engaging with and respond to best, we can identify how existing tools might be improved to help developers avoid security pitfalls, or how more trusted tools can be made more usable.
Providing better support for developers
Developers want to write more secure code, but this might not be a priority for their organisation or customers. Getting code out quickly, albeit with vulnerabilities that they (might) discover and fix later, may be a better fit with the company’s business model. So how do we help developers convince senior management that security is important? How can they make their managers acknowledge that technology and tool choices, culture and processes can have a significant impact on security? How do we show that outsourcing services to third parties (that they know little about) might save money in the short term, but could incur greater costs later through its impact on the system’s cyber resilience?
By exploring the current security culture in the software development ecosystem, we can start to understand its influence on a developer’s security behaviours and decision making. Then we can evaluate what changes would impact security in a positive way so that we can make realistic and practical suggestions.
Cyber security is very rarely the key driver behind development; ‘just enough’ security is always deemed sufficient. Judging what constitutes ‘enough’ is difficult, and developers often have several competing pressures to manage, meaning that security often isn’t (or can’t be) given the time or consideration that it ideally needs.
Exploring the different reasons why developers are motivated (or demotivated) to care about various aspects of code development (things like pride and peer approval, feedback, reward, barriers to entry and gamification) will help us understand what practical changes could be introduced to motivate developers to make better cyber security decisions.
Call for research
The call for this research has just been issued to our RISCS academic partners – we’ll be choosing the successful projects by the end of March. We’d really like to see both the pot of money and support for this research topic increased by drawing on the experience of our industry partners, so that more work on this important problem can be carried out, sooner. CyberInvest provides the best way of doing this for companies of all sizes; more details about this collaborative scheme can be found here.