Learning from Authoritative Security Experiment Results

The 2017 LASER Workshop

Measuring the Success of Context-Aware Security Behaviour Surveys

Ingolf Becker, Simon Parkin and M. Angela Sasse

University College London

Abstract

Background: We reflect on a scenario-based survey methodology that evolved through deployment in two large partner organisations (A & B). In each organisation, scenarios are grounded in real, relatable workplace tensions between security and employees' productive tasks (such as during the use of specific security controls), rather than in established but generic questionnaires. Survey responses allow clustering of participants into pre-defined groups.

Aim: We aim to establish the usefulness of questions about actual controls and problems experienced by employees by assessing the validity of the clustering. We introduce measures for the appropriateness of the survey scenarios for each organisation and the quality of candidate answer options. We use these scores to articulate the methodological improvements between the two surveys.

Method: We develop a methodology to verify the clustering of participants: two annotators code 516 (A) and 195 (B) free-text responses, and inter-annotator agreement metrics are used to assess the reliability of the coding. Further, we analyse 5196 (A) and 1824 (B) appropriateness and severity scores to measure the appropriateness of the scenarios and the quality of the answer options.
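The paper does not reproduce its analysis code, but the agreement measure it reports is standard. Below is a minimal Python sketch of Krippendorff's alpha for nominal codes, assuming each free-text response receives one code from each of the two annotators; the function name and example data are illustrative, not taken from the study.

```python
from collections import Counter
from itertools import combinations

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal codes.

    `units` is a list of coding units (e.g. free-text responses); each
    unit is the list of codes the annotators assigned to it. Units with
    fewer than two codes carry no agreement information and are skipped.
    """
    coincidences = Counter()  # ordered (code, code) coincidence counts
    totals = Counter()        # pairable occurrences of each code
    n = 0                     # total number of pairable values
    for unit in units:
        m = len(unit)
        if m < 2:
            continue
        for a, b in combinations(unit, 2):
            coincidences[(a, b)] += 1 / (m - 1)
            coincidences[(b, a)] += 1 / (m - 1)
        totals.update(unit)
        n += m
    # Observed disagreement: coincidences of two different codes.
    d_o = sum(v for (a, b), v in coincidences.items() if a != b) / n
    # Expected disagreement if codes were paired at random.
    d_e = sum(totals[a] * totals[b]
              for a in totals for b in totals if a != b) / (n * (n - 1))
    return 1.0 - d_o / d_e

# Illustrative data: two annotators coding four responses.
codes = [["barrier", "barrier"],
         ["workaround", "barrier"],
         ["no-problem", "no-problem"],
         ["barrier", "barrier"]]
print(krippendorff_alpha_nominal(codes))
```

Values above the conventional 0.7 threshold, as reported in the Results below, are usually read as strong agreement between annotators.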

Results: Participants rank questions in B as more appropriate than in A, although the variation in the seriousness of the answer options available to participants is higher in B than in A. We find that respondents in B are more likely to commit to their answers than those in A, suggesting that the survey design has indeed improved. The annotators mostly agree strongly in their coding, with Krippendorff's α > 0.7. A number of clusterings should be questioned, although α improves for reliable questions by 0.15 from A to B.
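The paper does not specify how the appropriateness and severity scores are aggregated. The following sketch illustrates one plausible reading, in which a scenario's appropriateness is summarised by its mean rating and the quality of its answer options by the spread of their mean severity ratings; all names and data here are hypothetical.

```python
from statistics import mean, pstdev

# Hypothetical ratings for one scenario: participants rate the scenario's
# appropriateness (1-5) and the severity (1-5) of each candidate answer.
ratings = {
    "appropriateness": [4, 5, 3, 4],
    "option_severity": {
        "report to IT": [2, 1, 2],
        "ignore policy": [5, 4, 5],
        "find workaround": [3, 3, 4],
    },
}

appropriateness = mean(ratings["appropriateness"])
# Spread of mean severities across options: a larger spread suggests the
# options are clearly separated in how serious participants judge them.
option_means = [mean(v) for v in ratings["option_severity"].values()]
severity_spread = pstdev(option_means)
print(f"appropriateness={appropriateness:.2f}, "
      f"severity spread={severity_spread:.2f}")
```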

Conclusions: To draw valid conclusions from survey responses, the chain of analysis needs to be verifiable. Our approach allows us to further validate the clustering of responses by utilising free-text responses. Further, we establish the relevance and appropriateness of the scenarios for individual organisations. While much research draws on survey instruments from prior work, these are often applied in a different context; in such cases, adding metrics of appropriateness and severity to the survey design helps ensure that results represent security on the ground.

Important Dates

04/18 Call for Papers
07/15 Submissions Due
09/01 Authors Notified
09/11 Registration Open; Accepting Student Grant Applications
09/15 Program Announced
09/22 Hotel Reservation Deadline
09/29 Student Grant Application Deadline
09/29 Pre-Workshop Papers Due *** EXTENDED ONE WEEK
10/07 Early Bird Registration Closes *** EXTENDED ONE WEEK
10/18-10/19 Workshop
11/22 Final Papers Due

Important Links

2017 Proceedings

LASER Workshop Home

Past Workshops

LASER Mailing List

Further Information

If you have questions or comments about LASER, or if you would like additional information about the workshop, contact us at: info@laser-workshop.org.

Join the LASER mailing list to stay informed of LASER news.