Data-driven Insights from Vulnerability Discovery Metrics
International Workshop on Data-Driven Decisions, Experimentation and Evolution (DDrEE)
May 27, 2019

Nuthan Munaiah, PhD Candidate
Andrew Meneely, Associate Professor
[email protected]
[email protected]
Department of Software Engineering, Rochester Institute of Technology, Rochester, NY

Motivation
| Security matters.
| “… payouts for … exploits range from $5,000 to $1,500,000 …” [1]
| Exploits as (cyber)weapons [2]
| Developers must defend against innovative attacks
| Security as an integral part of the development lifecycle
| Leverage processes, tools, and techniques
| Inculcate an attacker mindset
[1] ZERODIUM – How to Sell Your 0Day Exploit to ZERODIUM
[2] S. Collins and S. McCombie. 2012. Stuxnet: The Emergence of a New Cyber Weapon and its Implications. Journal of Policing, Intelligence and Counter Terrorism

Motivation
| Metrics empirically validated using historical vulnerabilities
| Metrics can ...
| ... help discover vulnerabilities
| ... reveal engineering failures that may have led to vulnerabilities
| Numerous metrics exist [1] but their use has been limited [2]
| Challenges: granularity, effectiveness, actionability, and usability
[1] Morrison, P., Moye, D., Pandita, R., & Williams, L. 2018. Mapping the Field of Software Life Cycle Security Metrics. Information and Software Technology
[2] Morrison, P., Herzig, K., Murphy, B., & Williams, L. 2015. Challenges with Applying Vulnerability Prediction Models. Symposium and Bootcamp on the Science of Security

Motivation
[Diagram: Project → Metrics → Measurements → Model → Vulnerable / Neutral]

Motivation
| Interpretation
| What are the metrics telling us?
| What should we ask developers to do?
| Example
| Dependency [1]: Why does #include <foo.h> make foo.c vulnerable?
| Churn [2]: Why does high churn make foo.c vulnerable?
| Metrics as more than mere explanatory variables in a model
| Metrics as agents of feedback
[1] Neuhaus et al.
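To make the diagrammed pipeline (Project → Metrics → Measurements → Model → Vulnerable / Neutral) concrete, the sketch below shows one hypothetical way file-level measurements such as churn and dependency counts could be fed to a classifier that labels files as vulnerable or neutral. The metric values, labels, and model choice here are illustrative assumptions, not the authors' implementation.

# Illustrative sketch only: hypothetical measurements and a generic classifier,
# not the tooling described in the talk.
from sklearn.ensemble import RandomForestClassifier

# Hypothetical per-file measurements: [churned lines, number of #include directives]
measurements = [
    [520, 14],  # e.g., foo.c: high churn, many dependencies
    [12, 2],    # e.g., bar.c: low churn, few dependencies
    [340, 9],
    [25, 1],
]
# Labels derived from historical vulnerability data: 1 = vulnerable, 0 = neutral
labels = [1, 0, 1, 0]

# Train a model on the historical measurements ...
model = RandomForestClassifier(random_state=0).fit(measurements, labels)

# ... and use it to flag a new file for review.
print(model.predict([[410, 11]]))  # e.g., [1] -> prioritize this file for inspection

The slides' point, however, is that such a prediction is only the start: by itself it does not tell a developer why high churn or a particular #include makes foo.c risky, which is exactly the interpretation and feedback gap the talk motivates.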