Tr us tw or th in es s Fa irn es s Su pe rv is io n Tr an sp ar en cy Sc al ab ili ty U sa bi lit y Securing Electronic Exam Environments Securing Electronic Exam Environments Threats, mitigations, and design principles Master’s thesis in Computer Science and Engineering DANIEL CRONQVIST SAGA KORTESAARI Department of Computer Science and Engineering CHALMERS UNIVERSITY OF TECHNOLOGY UNIVERSITY OF GOTHENBURG Gothenburg, Sweden 2023 Master’s thesis 2023 Securing Electronic Exam Environments Threats, mitigations, and design principles DANIEL CRONQVIST SAGA KORTESAARI Department of Computer Science and Engineering Chalmers University of Technology University of Gothenburg Gothenburg, Sweden 2023 Securing Electronic Exam Environments Threats, mitigations, and design principles DANIEL CRONQVIST SAGA KORTESAARI © DANIEL CRONQVIST & SAGA KORTESAARI, 2023. Supervisor: Mohammad M. Ahmadpanah, Department of Computer Science and Engineering Examiner: Andreas Abel, Department of Computer Science and Engineering Master’s Thesis 2023 Department of Computer Science and Engineering Chalmers University of Technology and University of Gothenburg SE-412 96 Gothenburg Telephone +46 31 772 1000 Cover: Illustration of the six design principle pillars proposed in this thesis. Typeset in LATEX Gothenburg, Sweden 2023 iv Securing Electronic Exam Environments Threats, mitigations, and design principles DANIEL CRONQVIST SAGA KORTESAARI Department of Computer Science and Engineering Chalmers University of Technology and University of Gothenburg Abstract Electronic exams have gained widespread popularity due to their convenience and advantages, particularly in courses involving writing or programming assessments. However, along with their benefits, electronic exams also pose the risk of facilitating cheating, especially when examinees are allowed to use their own devices. To ensure that in-hall bring-your-own-device (BYOD) electronic exams are as secure as their traditional paper-based counterparts, significant measures must be taken to secure the exam environment. This study focuses on two types of e-exam environments: software-based and OS-based. The thesis presents a comprehensive threat modeling process using the Quantita- tive Threat Modeling Method (QTMM) to identify various cheating-related threats. Based on these findings, the research proposes eight new design principles to guide developers in creating robust and secure e-exam environments as part of their design strategy. These principles are then evaluated through a case study conducted on a popular e-exam environment, Safe Exam Browser (SEB). The case study reveals several vulnerabilities and successful attacks, highlighting that six out of the eight proposed design principles were not adhered to. To address this problem, the thesis presents a novel design proposal for Safe Exam Browser that aligns with the sug- gested design principles. Implementation of this proposal would effectively address many of the preventable threats, including a significant design flaw. Lastly, the thesis explores how well both software-based and OS-based e-exam envi- ronments can mitigate threats by following these design principles. By emphasizing the importance of robust security measures in e-exam environments and providing practical recommendations, this research contributes to the ongoing efforts to en- hance the integrity of electronic examinations. Keywords: Security, threat modeling, electronic exams, vulnerabilities, design prin- ciples. 
v Acknowledgements We would like to thank our supervisor Mohammad M. Ahmadpanah for providing valuable feedback throughout the whole project. It has truly been great working with you. Thanks to Arne Linde and Emilio Suarez Ardiles, who trusted us with this project and allowed us to investigate the e-exam solution used at Chalmers. Finally, we would like to thank Jonathan Carbol and Hugo Stegrell for providing feedback on the writing and content of this thesis. Daniel Cronqvist & Saga Kortesaari, Gothenburg, 2023-06-02 vii Contents List of Figures xiv List of Tables xv 1 Introduction 1 1.1 Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2 1.2 Aim . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3 1.3 Methodology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4 1.4 Limitations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4 1.5 Ethical considerations . . . . . . . . . . . . . . . . . . . . . . . . . . 4 1.6 Thesis outline . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5 2 Related work 7 2.1 Electronic exams . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7 2.2 Security assessment of older Safe Exam Browser versions . . . . . . . 8 2.3 Security by design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8 3 E-exam environments 11 3.1 Paper-based exams versus e-exams . . . . . . . . . . . . . . . . . . . 11 3.2 Software-based environments . . . . . . . . . . . . . . . . . . . . . . . 12 3.2.1 Safe Exam Browser . . . . . . . . . . . . . . . . . . . . . . . . 12 3.2.2 Digiexam . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13 3.2.3 The FLEX framework . . . . . . . . . . . . . . . . . . . . . . 13 3.3 OS-based environments . . . . . . . . . . . . . . . . . . . . . . . . . . 14 3.3.1 ExamOS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14 3.3.2 Australian national e-exam project . . . . . . . . . . . . . . . 15 3.4 Logging . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16 3.5 Environment comparison . . . . . . . . . . . . . . . . . . . . . . . . . 17 4 Safe Exam Browser (SEB) 19 4.1 Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19 4.2 SEB configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19 4.2.1 Third-party applications . . . . . . . . . . . . . . . . . . . . . 22 4.3 The SEB URL schemes . . . . . . . . . . . . . . . . . . . . . . . . . . 23 4.4 Learning Management System (LMS) . . . . . . . . . . . . . . . . . . 23 4.5 Network traffic . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25 ix Contents 4.6 Logging . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26 4.7 Safe Exam Browser Server (SEB Server) . . . . . . . . . . . . . . . . 27 5 Threat modeling 29 5.1 Quantitative threat modeling method . . . . . . . . . . . . . . . . . . 29 5.1.1 Components . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30 5.1.2 STRIDE . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30 5.1.3 Component attack trees . . . . . . . . . . . . . . . . . . . . . 30 5.2 Threat severity scale . . . . . . . . . . . . . . . . . . . . . . . . . . . 31 5.3 Attack tree completeness . . . . . . . . . . . . . . . . . . . . . . . . . 32 5.4 E-exam cheating threats . . . . . . . . . . . . . . . . . . . . . . . . . 33 5.4.1 E-exam environment . . . . . . . . . . . . . . . . . . . . . . . 33 5.4.2 Learning Management System . . . . . . . . . . . . . . . . . . 
38 5.4.3 Examinee . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38 5.4.4 Threat severity . . . . . . . . . . . . . . . . . . . . . . . . . . 40 5.4.5 Previous findings . . . . . . . . . . . . . . . . . . . . . . . . . 46 5.5 Cheating threat relationships . . . . . . . . . . . . . . . . . . . . . . 47 6 Secure design principles 49 6.1 Trustworthiness . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50 6.2 Fairness . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52 6.3 Supervision . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53 6.4 Transparency . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54 6.5 Scalability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55 6.6 Usability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56 6.7 Threat mitigation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56 7 Security analysis of Safe Exam Browser: A case study 61 7.1 Setup . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61 7.1.1 Safe Exam Browser . . . . . . . . . . . . . . . . . . . . . . . . 61 7.1.2 Inspera . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62 7.1.3 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64 7.2 Proxy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65 7.2.1 Forging SSL certificate for Inspera . . . . . . . . . . . . . . . . 66 7.2.2 Making SEB trust the Certificate Authority . . . . . . . . . . 67 7.3 Proxy injection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68 7.3.1 Ignoring local certificates and using exam networks . . . . . . 70 7.4 Brute-forcing the config file . . . . . . . . . . . . . . . . . . . . . . . 71 7.4.1 Inspera passwords . . . . . . . . . . . . . . . . . . . . . . . . . 72 7.4.2 Brute-force attack . . . . . . . . . . . . . . . . . . . . . . . . . 72 7.4.3 Improving password structure . . . . . . . . . . . . . . . . . . 73 7.4.4 Revealing the configuration file . . . . . . . . . . . . . . . . . 74 7.5 Accessing Inspera exam outside of SEB . . . . . . . . . . . . . . . . . 74 7.5.1 StartUrl vulnerability . . . . . . . . . . . . . . . . . . . . . . . 74 7.5.2 Re-constructing the SEB security headers . . . . . . . . . . . 77 7.5.3 The flaws of the SEB design . . . . . . . . . . . . . . . . . . . 77 7.6 Modifying Windows registry for process lookup . . . . . . . . . . . . 78 x Contents 7.6.1 SEB process lookup . . . . . . . . . . . . . . . . . . . . . . . . 78 7.6.2 Allowing arbitrary process to be run in SEB . . . . . . . . . . 80 7.6.3 Launching multiple processes in SEB . . . . . . . . . . . . . . 81 7.6.4 Adding executable signatures . . . . . . . . . . . . . . . . . . 82 7.7 Injecting text via USB device . . . . . . . . . . . . . . . . . . . . . . 83 7.7.1 Adding keyboard count check . . . . . . . . . . . . . . . . . . 83 7.8 SEB version enforced by Inspera . . . . . . . . . . . . . . . . . . . . . 83 7.8.1 Dictionary lookup on MacOS . . . . . . . . . . . . . . . . . . 84 7.9 Summary of attack results . . . . . . . . . . . . . . . . . . . . . . . . 84 7.10 Design principles conformity . . . . . . . . . . . . . . . . . . . . . . . 84 8 A secure design for Safe Exam Browser 87 8.1 The fundamental design flaw in SEB . . . . . . . . . . . . . . . . . . 87 8.2 SEB and examinee separation . . . . . . . . . . . . . . . . . . . . . . 88 8.3 Trusted computing . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88 8.3.1 Remote attestation . . . . . . 
. . . . . . . . . . . . . . . . . . 88 8.3.2 Sealed storage . . . . . . . . . . . . . . . . . . . . . . . . . . . 89 8.4 Design proposal . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89 8.4.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 90 8.4.2 Extending SEB Server . . . . . . . . . . . . . . . . . . . . . . 91 8.4.3 Extending Safe Exam Browser . . . . . . . . . . . . . . . . . . 92 8.4.4 SEB and LMS communication . . . . . . . . . . . . . . . . . . 93 8.4.5 Addressing Hardware Security Module availability and com- patibility . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93 8.4.6 Design principle conformity . . . . . . . . . . . . . . . . . . . 94 8.4.7 Threat mitigation . . . . . . . . . . . . . . . . . . . . . . . . . 96 9 Assessing the optimal environment 99 9.1 Trustworthiness comparison . . . . . . . . . . . . . . . . . . . . . . . 99 9.2 Fairness comparison . . . . . . . . . . . . . . . . . . . . . . . . . . . . 99 9.3 Supervision comparison . . . . . . . . . . . . . . . . . . . . . . . . . . 100 9.4 Transparency comparison . . . . . . . . . . . . . . . . . . . . . . . . . 101 9.5 Scalability comparison . . . . . . . . . . . . . . . . . . . . . . . . . . 101 9.6 Usability comparison . . . . . . . . . . . . . . . . . . . . . . . . . . . 102 10 Conclusion 103 10.1 Future work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 104 Bibliography 107 xi Contents xii List of Figures 3.1 Overview of the Exam-tool package [3]. . . . . . . . . . . . . . . . . . 15 3.2 Overview of the OS-based e-exam environment used at Edith Cowan University in 2016. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16 4.1 Safe Exam Browser (SEB) Architecture overview. . . . . . . . . . . . 20 4.2 Screenshot of the SebConfigTool for Windows. . . . . . . . . . . . . . 20 4.3 Screenshot of SebConfigTool for MacOS. . . . . . . . . . . . . . . . . 21 4.4 Code from SEB that computes the Browser Exam Key. . . . . . . . . 26 4.5 SEB Server Architecture overview. . . . . . . . . . . . . . . . . . . . 27 5.1 Example attack tree for a generic component. . . . . . . . . . . . . . 31 5.2 CAT for the spoofing STRIDE threat category of an e-exam environ- ment. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35 5.3 CAT for the tampering STRIDE threat category of an e-exam envi- ronment. Note that all child nodes that appear as columns are direct children of the first appearing intermediary group node. . . . . . . . . 36 5.4 CAT for the repudiation STRIDE threat category of an e-exam envi- ronment. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36 5.5 CAT for the information disclosure STRIDE threat category of an e-exam environment. . . . . . . . . . . . . . . . . . . . . . . . . . . . 37 5.6 CAT for the elevation of privilege STRIDE threat category of an e- exam environment. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37 5.7 CAT for the spoofing STRIDE threat category of an LMS. . . . . . . 38 5.8 CAT for the spoofing STRIDE threat category of an examinee. . . . . 39 5.9 CAT for the information disclosure STRIDE threat category of an examinee. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40 5.10 State diagram illustrating the connections between the different CATs for LMS, Examinee, and EEE. . . . . . . . . . . . . . . . . . . . . . . 47 6.1 Visual representation of the design principle pillars. . . . . . . . . . . 50 7.1 Configurable SEB options for an exam in Inspera. . . . . 
. . . . . . . 63 7.2 Errors generated by an examinee due to suspicious behavior, shown in the monitor tab in the Inspera administrator panel. . . . . . . . . . 63 7.3 Flow between an examinee (left) and Inspera (right) when starting an exam. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64 xiii List of Figures 7.4 Flow between Safe Exam Browser and Inspera, where the Proxy Server works in a MITM manner, having access to all traffic sent between the two parties. . . . . . . . . . . . . . . . . . . . . . . . . . 65 7.5 A simplified diagram of messages between a web browser and web server during the TLS 1.3 handshake. . . . . . . . . . . . . . . . . . . 66 7.6 Flow of the proxy injection attack. . . . . . . . . . . . . . . . . . . . 68 7.7 Example of injecting the string “Tiger team was here” into an Inspera exam. The original exam is shown at the top, whereas the modified exam is shown at the bottom. . . . . . . . . . . . . . . . . . . . . . . 69 7.8 Example of injecting Bing into a new exam tab that freely allows an examinee to search the web. . . . . . . . . . . . . . . . . . . . . . . . 70 7.9 Visiting the startUrl for the demo exam in Google Chrome. . . . . . 75 7.10 Visiting the startUrl for the demo exam in Google Chrome, after appending the User-Agent header to our requests. . . . . . . . . . . . 76 7.11 Example of modified registry entry for excel.exe to launch PowerShell instead. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81 7.12 PowerShell launched inside of SEB after modifying registry. . . . . . 81 7.13 Force Touch Dictionary feature on MacOS inside of SEB version 2.3.2. 84 8.1 Overview of the installation procedure for the design proposal. . . . . 90 8.2 Overview of the procedure of providing configuration files using sealed storage. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91 xiv List of Tables 3.1 Summary of the different e-exam environments presented in Chapter 3. 17 4.1 Safe Exam Browser configuration options. . . . . . . . . . . . . . . . 22 4.2 HTTP request headers present in requests generated by SEB. . . . . 25 4.3 Possible states assigned to each SEB client via SEB server [18]. . . . . 27 5.1 The STRIDE threat modeling method [42]. . . . . . . . . . . . . . . . 30 5.2 Relevant STRIDE threat categories for each component. . . . . . . . 33 5.3 The STRIDE threat categories for the EEE component, and their corresponding contextual properties. . . . . . . . . . . . . . . . . . . 34 5.4 The STRIDE threat categories for the examinee component, and their corresponding contextual properties. . . . . . . . . . . . . . . . . . . 39 5.5 Table summarizing every threat in the attack trees presented in Chap- ter 5 along with their severity levels. . . . . . . . . . . . . . . . . . . 46 6.1 Summary of which principles help mitigate the found threats in Chap- ter 5. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59 7.1 List of passwords generated by Inspera which are used to encrypt SEB configuration files. . . . . . . . . . . . . . . . . . . . . . . . . . . 72 7.2 Summary of all attempted attacks. . . . . . . . . . . . . . . . . . . . 85 8.1 A summary of all threats and whether they are mitigated or not, with a short note describing the mitigation, or why no mitigation is shown. 
98 xv xvi Glossary BYOD Bring-Your-Own-Device CA Certificate Authority CAT Component Attack Tree CVSS Common Vulnerability Scoring System EEE Electronic Exam Environment HSM Hardware Security Module LMS Learning Management System MITM Man-In-The-Middle QTMM Quantitative Threat Modeling Method RA Remote Attestation SEB Safe Exam Browser STRIDE Spoofing, Tampering, Repudiation, Information Disclosure, Denial of Service and Elevation of Privilege TC Trusted Computing TPM Trusted Platform Module VM Virtual Machine xvii 1 Introduction Electronic exams, also referred to as e-exams, are getting more and more popular. They have a wide range of benefits compared to traditional paper exams, such as automating exam correction and allowing examinees to write the exam faster. Unfortunately, these advantages also come with disadvantages; e-exams introduce many new cheating-related threats [1]. These threats are mainly dependant on three factors: where the e-exam is taken (what location), which e-exam solution is used, and whether the exam is proctored or not. As for where, there are a few possibilities: a BYOD (Bring-Your-Own-Device) in-hall e-exam, remote e-exam, or taking the exam in a dedicated computer room. The e-exam solution that is used could either be completely unrestricted and only require that examinees take the exam in a standard LMS (Learning Management System), which is a web page for hosting and writing exams. It could also be more restrictive, and force examinees to use an e-exam environment in addition to the LMS while taking the exam. Thus, when referring to an e-exam solution, it refers to the LMS and an optional e-exam environment. Lastly, whether the exam is proctored or not may differ depending on where it is taken, e.g. in-hall exams are usually monitored by an invigilator and remote exams may be recorded via a camera. In order to combat these cheating threats, the e-exam solution must implement suitable security measures. In the ideal world, an e-exam should not make it easier for an examinee to cheat in comparison to a paper-based exam. When replacing an in-hall closed book paper-based exam with a BYOD in-hall e-exam, there must be appropriate security measures taken in order to prevent the examinee from cheating. This type of exam typically requires a more restrictive e-exam environment to be more like its paper-based equivalent. The e-exam environment used can be one of two common types: a software-based e-exam environment, such as Safe Exam Browser [2] or an OS-based e-exam environment, such as ExamOS [3]. The most important point here is that both of these types of e-exam environments must be secure. Suppose such an environment would contain security vulnerabilities of any kind. In that case, this might allow a student to extend their privileges to something that would not otherwise be possible during a traditional paper exam, allowing them to cheat. The question is whether these e-exam environments could be made secure as part of their design, to avoid common vulnerabilities. Ideally, the developers creating these 1 1. Introduction kinds of environments must be sure that they have patched all of the potential vul- nerabilities, meanwhile, an attacker (an examinee) only has to find one vulnerability in order to cheat in the exam. In reality, 100% secure software is hard to accomplish. Thus, the goal of a developer is to minimize the number of vulnerabilities, rather than patch them all. 
1.1 Background This thesis addresses the problem of securing e-exam environments by design. This is usually referred to as security by design [4]. Software is often analyzed both during and after implementation, resulting in vulnerabilities being found and patched. It is very common for people to find vulnerabilities in software and then report this to the company building the software, leading to the vulnerabilities being patched. However, if we consider this retrofitting approach in the case of e-exam environments, it might lead to vulnerabilities potentially never being reported in the first place. If a student knows about a vulnerability that allows them to cheat in their exams, they will most likely not report it, which means that the vulnerability will potentially never be fixed. This is why it is very important that we can ensure that these types of environments are secure, ideally in the design phase. A great example of such an approach shift is the proposal of a set of principles that should be followed when designing cryptographic protocols by Abadi and Need- ham [5]. The paper has helped cryptographers design more secure protocols, and avoid vulnerabilities that are difficult to spot during the design of protocols. An- other example is the paper by Bastys et al. which proposes a set of design principles for information flow control [6]. In a similar fashion to the papers listed here, we propose a set of design principles that help secure e-exam environments as part of their design, in order to prevent many common cheating threats. The principles are then used when evaluating the security of the e-exam environment used at Chalmers University of Technology, Safe Exam Browser [2]. Minimizing the number of vulnerabilities in e-exam environments is more compli- cated than it initially seems since one has to make tradeoffs between security and usability. It has been previously shown that both security and usability are amongst the most important aspects of an e-exam solution [7]. Focusing too much on the security aspect might make the solution harder to use, which will require e.g. more technical assistance throughout the exam. Yee [8] explains this conflict as very prob- lematic, as developers often see security and usability as each other’s complements which causes an immediate conflict during development. It is therefore important that usability and security are part of the design process, and not used as “magic pixie dust” on a finished product, as Yee states. In addition to usability and security, scalability will also be of primary concern, since it has been shown to be an important factor [1]. Scalability refers to how well the e-exam environment scales to large classes in simultaneous examination. For e-exam environments to scale well, it is important for the environment to work in a BYOD setting as the cost of providing workstations for an e-exam in large 2 1. Introduction classes might grow unreasonably large. Hietanen [3] proposes an OS-based e-exam environment booted from a USB stick, which is much cheaper than providing entire workstations. However, the results show that many examinees had trouble getting it to work because of unfamiliarity with booting an operating system from a USB stick. The solution also introduces several issues scalability-wise, mainly due to three factors: preparation, time, and cost. 
Each examinee needs to have their own USB stick, which means that someone has to prepare all of these (put the actual OS on it), which in turn requires time and money. The process of updating the OS on each of the USB sticks is also tedious since one most likely has to do it manually. Lastly, the amount of USB sticks that are required to conduct such exams could be hundreds, or even thousands. A software-based e-exam environment will not require any additional hardware or equipment except for the brought device, eliminating those scalability issues. Fur- thermore, students unfamiliar with booting an OS from a USB stick will not have to do so, instead, they will only be required to install an application before the exam, making the environment more usable. However, the question remains to see whether a software-based e-exam environment can have a similar level of security as an OS environment. This thesis therefore partially addresses if it is possible to achieve a similar level of security between an OS-based e-exam environment and a software- based e-exam environment while addressing usability, security, and scalability as primary concerns. 1.2 Aim This thesis focuses on proctored in-hall BYOD (Bring-Your-Own-Device) e-exams (electronic exams), and the security risks associated with those types of examinations. Therefore, when we refer to an e-exam, this means a BYOD, proctored in-hall exam. In this thesis, we answer the following research questions: RQ1: What cheating threats exist for e-exams, and how severe are these? RQ2: What is a set of common design principles for e-exam environments that, when followed, will ensure that the environment is secure and those cheating threats are mitigated? RQ3: Will a software-based e-exam environment and an OS-based e-exam envi- ronment be able to achieve a similar level of security when following the design principles? RQ4: Does the e-exam environment at Chalmers University of Technology, Safe Exam Browser, satisfy the design principles and therefore mitigate the found cheat- ing threats? 3 1. Introduction 1.3 Methodology The methodology employed in this thesis involves a comprehensive literature re- view to identify research gaps in electronic exam security. Utilizing the Quantita- tive Threat Modeling Method (QTMM) and the STRIDE framework, we analyze e-exams, by identifying cheating-related threats across its components. A severity scale is established to prioritize threats based on their impact. Design principles are proposed and evaluated through a case study on Safe Exam Browser (SEB), highlighting security flaws and suggesting a new design scheme. This methodology combines literature review, threat modeling, severity assessment, design principle formulation, and case study analysis to comprehensively investigate and secure elec- tronic exam environments, addressing vulnerabilities and enhancing security and integrity. 1.4 Limitations As mentioned in Section 1.2, the focus of this thesis is specifically BYOD (Bring- Your-Own-Device) in-hall e-exams. In-hall e-exams usually pose less of a threat than remote e-exams, since invigilators are able to continuously monitor the students throughout the exam. It has previously been shown that remote exams introduce new types of threats that can be difficult to deal with. Vegendla and Sindre [9] present some of the threats and conclude that “most cheating threats become difficult to handle when a written exam is done remotely”. An exam can be split into three specific phases: before, during and after. 
For electronic exams, these phases have very different characteristics, and thus very different security concerns. This thesis will be focusing on the during phase of an exam. This implies that the threats of interest are threats that an examinee can utilize during the exam. This also includes threats that are prepared before the exam but used during the actual exam, such as modifying the functionality of the e-exam environment. 1.5 Ethical considerations Since this thesis contains a security assessment done on Safe Exam Browser, it touches on the topic of ethical hacking [10]. Ethical hacking is done with non- malicious intent, where the goal is to find existing vulnerabilities and report them to the stakeholders. Nowadays, many companies welcome ethical hackers to test their systems by participating in so-called bug bounty programs, such as HackerOne1. SEB is open source, and freely available on GitHub, meaning that anyone can view the code. With open source, as the name indicates, one is welcome to contribute to the source code. SEB is licensed with the Mozilla Public License 2.02, which allows 1https://hackerone.com/ 2https://www.mozilla.org/en-US/MPL/2.0/ 4 https://hackerone.com/ https://www.mozilla.org/en-US/MPL/2.0/ 1. Introduction us to perform our penetration test without breaking any licensing rules. We have also made sure to report all of the found vulnerabilities to the developers of SEB, to make them aware of the found issues. Cheating is unethical and goes against academic integrity. The goal of the thesis is to be able to contribute to the knowledge of building secure e-exam environments, which in turn will make the process of cheating harder. Our hope is that our contri- bution will make sure that students maintain academic integrity, thus maintaining the corresponding university’s reputation. From the early stages of the thesis, we have had close interactions with the e-exam administration at Chalmers to ethically follow the steps of coordinated disclosure. We have reported our findings to them continuously throughout the work to make them aware of any security issues. Any details deemed as sensitive by the e-exam administration have intentionally been excluded from the thesis. Lastly, the information in this thesis was not shared with any other student prior to its publication. 1.6 Thesis outline The thesis is structured as follows. Chapter 2 introduces related work within the area of electronic exams and security by design to further motivate the work of this thesis. Chapter 3 examines the two types of e-exam environments that are of interest in this thesis: OS-based environments and software-based environments. Some differ- ences and similarities between the two types of environments are presented. The chapter serves as an introduction to RQ3, where we later on in the thesis use the contents of this chapter to deem whether the two types of environments can be made equally secure. Chapter 4 introduces Safe Exam Browser along with its corresponding function- ality and architecture. The chapter is meant as an introduction for the reader to understand the technicalities of SEB, which will, later on, be examined further in the case study conducted in Chapter 7. Chapter 5 examines and outlines what cheating threats exist for an e-exam envi- ronment today, related to RQ1. The threats are displayed using attack trees and classified into their corresponding severity levels. Chapter 6 introduces the design principles related to RQ2. 
The design principles are a central building block of this thesis and are later on used in the case study conducted in Chapter 7, to evaluate whether Safe Exam Browser can be determined secure. Chapter 7 conducts a case study on Safe Exam Browser, related to RQ4, which is the e-exam environment used at Chalmers University of Technology. The attacks tested in this chapter are initially presented in the attack trees in Chapter 5. The chapter reveals many security issues present today, along with their possible mit- igations. An underlying flaw in the design of SEB is presented, which is further 5 1. Introduction discussed along with a possible mitigation method in Chapter 8. In Chapter 9, the focus shifts towards the examination of OS-based e-exam envi- ronments and software-based e-exam environments, addressing RQ3. The chapter delves into the distinct capabilities of these two types of environments in mitigating threats, aiming to uncover their respective advantages and disadvantages. Further- more, it explores the possibility of achieving equal levels of security in both types of e-exam environments by adhering to the design principles proposed in Chapter 6. The thesis is concluded and future work is discussed in Chapter 10. 6 2 Related work This chapter outlines previous work within the area of electronic exams and security by design in order to further motivate our work in this thesis. 2.1 Electronic exams There have been a fair amount of papers examining different aspects of electronic exams, ranging from examining the security of certain solutions, to identifying how efficient they really are compared to paper-based exams. The doctoral thesis by Chirumamilla [7] studies security threats and requirements in e-exams. A comparative analysis between paper exams and e-exams is presented [1], which has been useful for us when identifying threats related to e-exams, relevant to RQ1 of the thesis. Recently, in 2021, Hietanen wrote his Master’s Thesis Security of electronic exams on students’ devices [3]. The thesis investigates how to secure BYOD e-exam en- vironments. Hietanen identifies potential threats in different e-exam environments along with possible mitigations. To counter the identified threats, he develops an OS-based e-exam environment, ExamOS, which consists of a hardened operating system with controlling software running on it. The end result shows that ExamOS is successfully able to mitigate most threats. However, the result showed that Ex- amOS was technically difficult to use for some examinees resulting in a “significant amount” of technical assistance required during the exams. This assistance, unfor- tunately, delayed the starting times of examinations as well. Our view is that the environment should be seamlessly easy to use for an examinee, and not cause delays due to examinees’ unfamiliarity with technicalities. Often, by introducing too many security measures, a system becomes difficult to operate [8]. Chirumamilla [7] com- pared multiple case studies on the topic of usage of e-exam solutions and showed that the most important aspects of an e-exam solution were usability, marking, security and reliability. Thus, usability is an important aspect, and it seems like Hietanen underestimated this fact in his thesis, or overestimated the examinees’ ability to boot an operating system from a USB stick without assistance. Our belief is that a BYOD e-exam environment becomes easier to use for an exam- inee if it is software-based, similar to Safe Exam Browser [2]. 
There are multiple advantages with this solution, such as it being easier to download a standalone 7 2. Related work program than booting a new OS from a USB stick, and it being more cost-efficient. Easier here means that it is likely easier for the average student to download a pro- gram and run it on their computer than to boot an operating system from a USB stick. The current downside of this environment is that it brings on new types of security-related threats since it runs on the examinee’s own laptop, with their own operating system. The point here is that both types of environments contain pros and cons. This point is further examined by us in Chapter 9, where we investigate whether both of the two environments can be made equally secure by design, which is related to RQ3 of the thesis. 2.2 Security assessment of older Safe Exam Browser versions The security of Safe Exam Browser [2] has been examined prior to this thesis through different perspectives. The doctoral thesis by Chirumamilla [7] is an example of such, where they perform a penetration test using the HARM method [11, 12] on Safe Exam Browser. Heintz [13] and Søgaard [14] also identify the security flaws of the application. The results of the three papers show that Safe Exam Browser does indeed contain security flaws. An important detail is that after all of these three penetration tests were conducted, on May 28th 2020, version 3.0.0 of SEB was released where it was completely rewrit- ten from scratch [15]. The update included several new features along with a com- pletely new embedded browser engine. We have not found similar case studies or testing on this version of SEB, and our work aims to fill that gap by performing a thorough security analysis on this new version of SEB, related to RQ4 of the thesis. The papers mentioned have served as a baseline for our case study on Safe Exam Browser presented in Chapter 7, where a few of the older established attacks have been attempted to see whether the developers have patched the flaws in newer versions. Since software is continuously developed, it is important to continuously assess its security, since newer updates may introduce new types of security flaws. It might even be that newer versions open up the possibility of exploiting old flaws that were previously patched, due to change of design in the application. 2.3 Security by design Security by design is the concept of taking security into account all the way from the design process, with the goal of designing foundationally secure applications. The concept is important due to security often being overlooked in the development pro- cess, resulting in it being de-prioritized which later on leads to vulnerabilities being found and exploited in the end. Security is first and foremost a design consideration, meaning that it isn’t something that you graft on at the end [4]. The authors of Secure by design [4] present some very good points in regard to 8 2. Related work why security is an important part of the design process. To start with, requiring developers to constantly think about security while working is not realistic since far from everyone is proficient in the area. Requiring everyone to write secure code would assume that every developer is a security expert, as well as assuming that they can think of every potential vulnerability that might occur now or in the future. 
Rather than always actively focusing on the security flaws when developing the software, a better approach is to shift focus onto the software design. This means that security should be incorporated into the design process, which actually introduces the benefit of non-security experts naturally writing secure code, due to the design of the system implicitly avoiding insecure constructs. Design principles When designing secure applications, it is common for devel- opers to adhere to and follow design principles that help them make more secure decisions. The paper Prudent Engineering Practice for Cryptographic Protocols by Abadi and Needham [5] is an example of such, where they present design principles that help secure cryptographic protocols. The paper made a breakthrough at the time of writing, where it helped discover many major security flaws present in cryp- tographic protocols at the time. For years, it has served as a guideline for developers designing these types of protocols, ultimately eliminating many common decisions that may lead to security flaws. According to our knowledge, no such design principles exist for e-exam environments prior to this work. Therefore, similar to the papers described above, our work presents design principles related to making e-exam environments secure by design. 9 2. Related work 10 3 E-exam environments It is very common for an e-exam solution to require an e-exam environment in order to secure an exam. The purpose of the e-exam environment is to prevent and/or detect an examinee from accessing prohibited material during an exam. In the case of BYOD in-hall exams, we believe that an e-exam environment is a must since otherwise, an examinee would have full access to their device which implies access to cheating materials. In this chapter, two different types of common e-exam environments are examined: software-based environments, and OS-based environments. There are multiple dif- ferences between these two environments, such as how they are implemented and what types of threats they can counter. Another key difference between environ- ments is whether they are open-source or not. Publishing an environment in an open-source manner opens up the possibility for other people to easier examine the environment and report any security vulnerabilities they find. Like Linus’s Law [16] states: “given enough eyeballs, all bugs are shallow”. However, the disadvantage is that malicious actors now have an easier time spotting vulnerabilities and utilizing them to cheat. This is a common challenge with open-source software, but it is generally believed that the benefits outweigh the disadvantages. The goal of the chapter is for the reader to successfully be able to identify the difference between a software-based environment versus an OS-based environment. The two types of environments are further examined in Chapter 9 of the thesis to identify whether both of them could be made equally secure, or if one of them contains flaws that cannot be solved in a BYOD e-exam setting, related to RQ3. 3.1 Paper-based exams versus e-exams While e-exams offer significant advantages over paper-based exams in certain aspects, they are limited in their ability to mitigate certain types of threats. Threats that are executed through physical means, such as bringing a paper cheat sheet [17], are impossible to prevent. Therefore, in-hall invigilators will still be required for e-exams, to help mitigate such threats. 
It is crucial to acknowledge these limitations of e-exam environments and incorpo- rate appropriate security measures, both as part of the environment itself and also in the physical world. While e-exams provide a more flexible and efficient alterna- 11 3. E-exam environments tive to paper-based exams, they are certainly not foolproof, and new threats and vulnerabilities may emerge over time. As a consequence of this, it is essential to consider these limitations when designing and implementing e-exam environments and to continually evaluate and improve their security measures to enhance their ability to mitigate cheating threats. 3.2 Software-based environments Software-based environments are installed as an application on an examinee’s de- vice. Some of these environments utilize so-called kiosk mode functionality, which locks down an examinee’s computer and prevents them from accessing prohibited materials during an exam. However, there exist other software-based environments that have taken a completely different approach. Instead of locking down the ex- aminee via kiosk-mode, these environments utilize extensive logging, meaning that an examinee would have full access to their device but the application instead logs everything they are doing. These two types of approaches can be regarded as active versus passive in terms of preventing cheating. The kiosk mode approach actively prevents an examinee from cheating, whereas the other approach does not prevent an examinee from cheating making it more passive. However, even if the second approach can be regarded as more passive, an examinee can still be caught cheating via auditing of the logs. In this section, we present a couple of different software-based e-exam environments, with the purpose of showing the reader what types of environments exist today. We also outline the differences between them and discuss the different challenges that need to be solved depending on how the environment is designed. 3.2.1 Safe Exam Browser Safe Exam Browser [2], SEB, is an open-source lock-down-based browser that is commonly used today. This section will present a very brief overview of SEB. SEB is further examined in Chapter 4 and Chapter 7, where we perform a case study on the environment in order to find current vulnerabilities in it. The application utilizes kiosk mode functionality that prevents a user from accessing prohibited materials during an exam. This means that SEB utilizes an active approach to prevent an examinee from cheating. SEB is configured via a configuration file, used to apply certain settings to the environment, such as what third-party software is allowed to run and what URL the browser should show the examinee during the exam. The software is examined in greater detail in Chapter 4, where the architectural overview can be found in Figure 4.1. In order to prove to an LMS that an examinee is using SEB during an exam, SEB uses three HTTP headers which are appended to all network requests. The headers are calculated using a combination of the configuration file and the executable, which is further discussed in Section 4.5. The LMS must then verify these headers to know whether an examinee is using an unmodified, legitimate version of SEB. 12 3. E-exam environments SEB logs activity to a pre-defined log file that contains information about what web pages the examinee has opened and what OS version the machine is running. 
The logs are not transmitted to any server, except if the exam requires an examinee to connect SEB to SEB Server [18]. The logging functionality of SEB as well as SEB Server are further described in Section 4.6 and Section 4.7. 3.2.2 Digiexam Similar to Safe Exam Browser in Subsection 3.2.1, Digiexam [19] is a software-based e-exam environment that can conduct exams in a lock-down mode. The major difference is that Digiexam is closed-source, which also implies that it has been hard for us to find material related to how Digiexam works. Digiexam is being marketed as Secure and Compliant [20]. Secure refers to the ap- plication having “advanced and secure device lockdown”, while Compliant refers to the application being “fully compliant with GDPR”. However, since the application itself is closed-source, it is hard to actually confirm that what they are claiming is true. In the case of digital exams where an examinee is required to download soft- ware to their computer, it is often beneficial for the software to be open-source so that the developers are transparent with what is running on the examinee’s machine. 3.2.3 The FLEX framework Küppers et. al. [21, 22] propose the FLEX framework intended for conducting secure e-exams. The framework includes an application that will be installed on an examinee’s system, in other words, a software-based e-exam environment. However, instead of utilizing lock-down techniques, the environment instead relies on logging for retrospective auditing of cheating behavior. This means that a student may exit the environment, but since the environment is monitored, the action will be logged to a server. This sort of approach shift is very interesting, as other solutions rely heavily on the lock-down mechanism instead. The reason why FLEX relies on logging rather than lock-down is described further in a paper by the authors [21]. According to them, it seems “nearly impossible” to implement a lock-down-based software that locks down all operating systems in the same way. The authors further argue that the logging mechanism is similar to the approach used throughout paper-based exams: you cannot prevent a student from bringing a cheat sheet into the exam hall, but an invigilator may notice this and remove the cheat sheet from them. The same goes for the approach they use for the e-exam environment, an examinee can access prohibited materials such as cheat sheets on their device, but the logging will actively spot them doing so. In order to prevent malicious modification of the application, its integrity has to be verified. The integrity verification of the FLEX application utilizes Remote Attesta- tion (RA). Remote Attestation is a part of trusted computing, which allows a verifier to verify the state of a remote client [23]. Remote attestation can be done in many ways, either purely through software or with the help of hardware security modules present in modern devices today. Not only does RA verify the integrity of the ap- 13 3. E-exam environments plication, but it also verifies that an examinee is indeed running the application in the first place, meaning that it is unfeasible to circumvent it completely. 3.3 OS-based environments Even though operating systems are classifiable as software as well, an OS-based environment is one that occupies the entirety of the operating system. This means that the operating system’s primary intention is to only facilitate enough function- ality to be able to do the exam. 
It is naive to assume that all examinees are able to install a second operating system on their machine by themselves, which is why environments like these are commonly distributed through bootable USB sticks. Similar to software-based environments, OS-based environments can also be viewed as lock-down environments. An OS-based environment is not locking down the device itself, but it is effectively isolating an examinee from accessing any other material on their device, by forcing them to take the exam in that specific OS. 3.3.1 ExamOS Hietanen presents the hardened operating system ExamOS (Exam Operating Sys- tem) with controlling software running on it to secure the system against cheating examinees [3]. The source code for ExamOS and all of its related parts can be found on GitHub [24], meaning that it is open-source. ExamOS was developed as part of Hietanen’s Master’s Thesis as an attempt to secure e-exams at Aalto University. ExamOS is a Linux-based operating system with a set of carefully selected hardware peripheral drivers and other OS-level functionalities. Hietanen argues that the level of control one has over a custom operating system allows him to be confident that no examinee will be able to cheat successfully. As part of ExamOS, Hietanen also developed a software called Exam-tool. Exam- tool is described as “a multi-application package that consists of background services and software that the examinee interacts with during an exam”. As part of exam- tool, a standalone Exam Browser was developed. The Exam Browser is a restrictive browser that only allows an examinee to access specific sites allowed in the exam con- figuration. The exam configuration is provided to the Exam Browser by the Exam Service, which configures the ExamOS system according to an exam’s configuration. The Exam Service fetches this configuration from a specific Exam server. Along with fetching the configuration, the Exam Service is also responsible for communi- cating all logged data to the Exam server. Such data is for example the hardware identification data to prevent multiple different examinees from authenticating to the system with the same credentials. An overview of the Exam-tool package is shown in Figure 3.1. 14 3. E-exam environments Figure 3.1: Overview of the Exam-tool package [3]. ExamOS can essentially be seen as an OS-based e-exam environment with a software- based e-exam environment running inside of it. When conducting exams with Ex- amOS, the examinees are given USB sticks that need to be plugged into their devices in order to boot up the operating system. As previously mentioned in Section 2.1, ExamOS was technically difficult to use for some examinees, who required technical assistance from invigilators, which resulted in significant delays throughout the exam process. This was most likely due to the examinees’ unfamiliarity with ExamOS. 3.3.2 Australian national e-exam project As part of the “national e-exam project” in Australia, Edith Cowan University tri- aled an open source OS-based e-exam environment at their university throughout 2016 [25]. The e-exam environment is described as an “enclosed computer-based environment that is isolated from the internet and any resources other than those provided by the lecturer”. The OS is a modified version of Ubuntu that prevents internet, Bluetooth, and local drive access, along with a custom “exam starter” that guides students to begin the exam [26]. In contrast to ExamOS described in Subsection 3.3.1, this OS-based environment 15 3. 
E-exam environments does not store examinees’ answers on a server. Instead, all examinees need to hand in their USB sticks after an exam, which are later used by the examiner in order to retrieve the exam responses. Figure 3.2 shows the flow of the environment. As seen in the figure, a USB duplicator is used in order to upload exam contents onto the USB sticks, as well as when retrieving the exam responses from them. Even though the USB duplicator might have made the process of copying the material onto the USB sticks slightly faster, it is mentioned that “the USBs were then manually checked to ensure all files had been copied correctly” [25]. Exam material (documents, resources..) Master USB 1. Transfer to Master USB USB duplicator2. Plug Master USB into USB duplicator 3. Examinees plug USBs into laptops USB duplicator 4. USBs retrieved from examinees 5. Collect e-exam responses Responses USB duplicator Laptop Figure 3.2: Overview of the OS-based e-exam environment used at Edith Cowan University in 2016. The environment does not utilize passive detection of any kind, such as logging. This is most likely due to the operating system being completely isolated from the internet, as described above. However, since the USB sticks are retrieved from the examinees, a possibility would have been to implement local logging that could be stored on the USB, which could be checked when the USB is retrieved from the examinee. It was reported that students were hesitant to use the e-exam environment due to distrust around booting up their machine with a different operating system along with a fear of losing data. The end result also showed that a few students had trouble navigating the OS, due to unfamiliarity with it [25]. 3.4 Logging As seen from the different environments described above, it is very common for e- exam environments to utilize logging of some kind. Logging can be seen as a passive approach towards detecting cheating: it might not actively prevent an examinee 16 3. E-exam environments from cheating, but it can notice if they are doing so. Most e-exam environments utilize a combination of lock-down and logging, but as seen from the examples above there are some environments that use only one of them. Logging of personal data on an examinee’s device may potentially infringe on their privacy. It may also be that some type of logging may be prohibited in certain areas due to privacy laws and regulations, such as General Data Protection Regulation (GDPR) [27], which is EU’s data privacy and security protection law. This may po- tentially introduce additional challenges to developers of e-exam environments since they have to keep these regulations in mind. In the case of GDPR, the developers must make sure to follow Article 17 (Right to be forgotten) [28], which states that people should have the right to request personal data to be deleted. It may also be that some types of logging may require written consent from examinees, according to laws and regulations. 3.5 Environment comparison Here we summarize the environments described in the chapter, and present their discussed features, advantages and disadvantages in Table 3.1. As introduced in Sec- tion 3.2, the terms passive and active represent the environment’s method of mit- igating examinee’s from cheating. Active environments typically implement some kind of lock-down solution to prevent cheating, whereas passive environments often rely on logging to retrospectively check if cheating was done. 
As the table shows, and the previous sections describe: some environments are active, some are passive, and some implement solutions that can be categorized as both. Name Type Passive Active Open-source Safe Exam Browser [2] Software Digiexam [19] Software FLEX Framework [21, 22] Software ExamOS [3] OS Australian national e-exam project [25] OS Table 3.1: Summary of the different e-exam environments presented in Chapter 3. 17 3. E-exam environments 18 4 Safe Exam Browser (SEB) Safe Exam Browser, also known as SEB, is an e-exam environment used at Chalmers University of Technology and across many other universities around the world. This chapter will dive into the technical details of SEB, such as what it does and how it works. 4.1 Architecture Citing from the SEB website [29]: “Safe Exam Browser is a web browser environment to carry out e-assessments safely. The software turns any computer temporarily into a secure work- station. It controls access to resources like system functions, other web- sites and applications and prevents unauthorized resources being used during an exam.” SEB is an open-source [30] kiosk mode application. As seen in the architecture diagram in Figure 4.1, the SEB kiosk application contains an integrated browser that displays a web page from a given URL. Additionally, one or several optional third-party application(s) can be started and run during the exam. It is up to the exam administrator to configure the URL and permit third-party applications, by defining this in the configuration file. This file is used within SEB to specify certain settings, which is further explained in Section 4.2. SEB is available across multiple platforms, such as Windows, MacOS and iOS. In this thesis, only the MacOS and Windows versions will be considered. The Windows version is written in C# and uses a Chromium-based browser engine1, and the MacOS version is written in Objective-C with a WebKit-based browser engine2. 4.2 SEB configuration The configuration file, also referred to as config file, is used to specify how SEB should function during an exam. 1https://chromium.org/ 2https://webkit.org/ 19 https://chromium.org/ https://webkit.org/ 4. Safe Exam Browser (SEB) Safe Exam Browser (SEB) Browser Kiosk Application Third-party Application Learning Management System (LMS)URL to exam Starts permitted application Figure 4.1: Safe Exam Browser (SEB) Architecture overview. The Windows version of SEB contains a configuration tool, SEBConfigTool.exe, which is used to create configuration files or configure a local client [31]. For MacOS you can access the tool by opening SEB and clicking Preferences in the menu [32]. A screenshot of the SebConfigTool.exe can be seen in Figure 4.2, whereas the MacOS-specific tool can be seen in Figure 4.3. Figure 4.2: Screenshot of the SebConfigTool for Windows. 20 4. Safe Exam Browser (SEB) Figure 4.3: Screenshot of SebConfigTool for MacOS. Both versions contain the same configuration options for SEB. However, a few con- figuration options are platform-specific, which means that MacOS-specific settings do not apply to a Windows computer, and vice versa. Even though some options are platform specific, the same config file can be used on all platforms. A few of the configurable options of SEB are shown in Table 4.1. The full list of configurable options will not be included, but the curious reader can find the full list in the SEB documentation for Windows and MacOS [31, 32]. 
The configuration file can be encrypted using a password, by specifying the settings password shown in Table 4.1, to ensure that none of the configuration options can be easily extracted prior to the exam. If the config file has been encrypted, the examinee needs to enter the password that decrypts the file upon starting an exam that uses that config file. SEB uses the open-source RNCryptor framework (https://github.com/RNCryptor/RNCryptor) for encrypting config files [33]. RNCryptor is an open-source implementation of AES [34] supporting many different modes of use, one of which is symmetric encryption with a password-based key. Using PBKDF2 [35], an encryption key is generated from the given password and is then used to encrypt the config file. Decryption only requires that the password is known, since PBKDF2 will generate the same key if the same password is entered. Therefore, if someone knows the password that was used when encrypting a config file, it is trivial to retrieve the decrypted version. The SEB documentation includes a guide on the decryption procedure [36]; a sketch of the underlying scheme is given below, after the config file snippet.

Key                                     | Description
Start URL                               | The exam URL that SEB will show the examinee throughout the exam
Quit/unlock password                    | The password that the examinee has to enter when trying to quit SEB
Settings password                       | Encryption/decryption password for the .seb config file, also known as the encryption key
Use Browser Exam Key and Config Key     | Yes/no checkbox. If checked, SEB will use these keys to generate the correct headers sent in the network traffic of SEB, see Section 4.5
Allow SEB to run inside virtual machine | Yes/no checkbox. If checked, SEB will be able to run in a VM
Permitted Processes                     | Processes allowed to run throughout the exam
Prohibited Processes                    | Processes prohibited from running throughout the exam

Table 4.1: Safe Exam Browser configuration options.

The decrypted version of a SEB config file is simply an XML .seb file that specifies the configured settings as key-value pairs, following the Apple plist (property list) format. A small snippet of such a file is shown below.

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
        "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
    <plist version="1.0">
    <dict>
        <key>originatorVersion</key>
        <string>SEB_Win_2.1.1</string>
        <key>startURL</key>
        <string>https://examurl.com</string>
        <!-- Rest of the key-value pairs -->
    </dict>
    </plist>
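To illustrate the password-based scheme described above, the following is a rough sketch of decrypting an RNCryptor v3 payload in C#. The data layout, iteration count, and hash choice follow the public RNCryptor specification rather than SEB's own code, and SEB's additional framing around the payload (such as compression and a mode prefix) is left out.

    using System.Security.Cryptography;

    static class RnCryptorSketch
    {
        // Layout of an RNCryptor v3 password-based payload (per the spec):
        // version(1) | options(1) | encryptionSalt(8) | hmacSalt(8) | IV(16) | ciphertext | HMAC-SHA256(32)
        public static byte[] Decrypt(byte[] data, string password)
        {
            var encryptionSalt = data[2..10];
            var iv = data[18..34];
            var ciphertext = data[34..^32];

            // PBKDF2 (SHA-1, 10,000 iterations per the spec) derives the AES-256 key
            // from the password; the same password always yields the same key.
            using var kdf = new Rfc2898DeriveBytes(password, encryptionSalt, 10000, HashAlgorithmName.SHA1);
            using var aes = Aes.Create();
            aes.Key = kdf.GetBytes(32);
            aes.IV = iv;
            aes.Mode = CipherMode.CBC;
            aes.Padding = PaddingMode.PKCS7;

            using var decryptor = aes.CreateDecryptor();
            return decryptor.TransformFinalBlock(ciphertext, 0, ciphertext.Length);
            // A real implementation must also derive the HMAC key from hmacSalt and
            // verify the trailing HMAC before trusting the plaintext.
        }
    }

This also makes the security consequence concrete: anyone holding the settings password can run exactly this procedure, so the encryption only protects the config file for as long as the password itself stays secret.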
4.2.1 Third-party applications

As previously mentioned, third-party applications can be permitted to run via the config file. An allowed application can be used normally within SEB while the lock-down mode is maintained. The examinee may switch to one of the permitted applications while in SEB simply by clicking its icon in the lower taskbar.

These third-party applications may pose a risk to SEB, depending on what kind of applications are permitted to run. For example, Microsoft Excel might be a permitted third-party application during an exam. This would make it possible for an examinee to make use of the many functions that allow a user to retrieve information from outside the Excel environment.

To permit a third-party application during an exam, the config file simply has to contain the name of the allowed executable, e.g. excel.exe, in its list of permitted processes. Additional configuration can be done for each third-party application, such as appending certain arguments to the executable, or even allowing the examinee to select where the executable is located before starting the exam (a sketch of such an entry is shown at the end of this section).

However, according to the documentation of SEB, and after testing, it seems that only Windows supports permitted third-party applications, meaning that not all platforms are able to benefit from this feature [37].

Additionally, it is also possible to prohibit applications from running at the same time as SEB. If a prohibited application is running when SEB starts, it will automatically be closed upon startup. If a prohibited application starts in the background while SEB is running, it will instantly be killed [31]. SEB ships with a default list of prohibited third-party applications, which typically are applications that may help an examinee to cheat.
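For concreteness, a permitted-process entry in the plist-formatted config file could look roughly like the sketch below. The key names (permittedProcesses, active, executable, title) follow the SEB documentation for Windows [31], but the exact set of available sub-keys should be taken from that documentation rather than from this illustration.

    <key>permittedProcesses</key>
    <array>
        <dict>
            <!-- whether this entry is enabled -->
            <key>active</key>
            <true/>
            <!-- executable that SEB may start during the exam -->
            <key>executable</key>
            <string>excel.exe</string>
            <!-- title shown in the SEB taskbar -->
            <key>title</key>
            <string>Excel</string>
        </dict>
    </array>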
4.3 The SEB URL schemes

SEB provides a custom URL scheme, seb://, which makes the process of loading an exam configuration more accessible for examinees [38]. This URL scheme allows an LMS to provide hyperlinks such as seb://example-lms.com/config.seb, which when clicked will open SEB with the configuration in the file that exists at http://example-lms.com/config.seb. The configuration is downloaded and temporarily loaded for this SEB process, but is not stored on the host. This makes it easy for examinees to start their exam in SEB with the correct configuration.

4.4 Learning Management System (LMS)

The Learning Management System, also referred to as the LMS, is where an examinee takes the exam. As previously mentioned, the URL of the LMS is equal to the startURL defined in the config file, as seen in Table 4.1. Popular LMS choices include Moodle (https://moodle.org/), Inspera (https://inspera.com/) and Exam.net (https://exam.net/), among others. These three learning management systems differ slightly in what they offer; one of them might be superior for specific types of exams, such as programming exams. The important factor is that they all offer the option of taking an exam in a lock-down mode using SEB. Thus, SEB is a popular choice among widely used learning management systems.

The LMS is responsible for providing a config file (see Section 4.2) so that the corresponding exam can be started securely in SEB. Typically, an examinee clicks a hyperlink that uses the SEB URL scheme mentioned in Section 4.3, which then downloads the config file and opens the exam in SEB.

Furthermore, the LMS is a very important factor in how secure an e-exam taken in SEB will be. That is, an LMS needs to make use of all the tools that SEB provides in order to verify that an exam is really taken in SEB. The SEB documentation includes a comprehensive guide [38] on how to verify that an examinee is using SEB. More specifically, the guide states the following verification steps:

1. Make sure an exam can only be taken using Safe Exam Browser. Display an error message if trying to open the quiz/exam in another browser.
2. Check if legitimate SEB settings and the correct version of SEB are used.
3. Quit SEB (and/or unlock the device) automatically after the exam was submitted.
4. Facilitate starting SEB with the correct settings for the exam.
5. Don't display any links inside an exam which would allow navigating to other sections of the LMS or even other websites.

Making sure that an exam can only be taken using SEB, and checking that legitimate SEB settings and the correct SEB version are used, can be done by verifying the network traffic that SEB sends. There are three headers available that authenticate that an examinee is using SEB: User-Agent, x-safeexambrowser-configkeyhash, and x-safeexambrowser-requesthash. Details of how the headers are constructed in SEB can be found in Section 4.5.

To make sure that an examinee is only able to quit SEB after an exam has been submitted, a quit/unlock password (see Table 4.1) needs to be defined in the config file. This password would then be communicated to an examinee once a trusted invigilator has confirmed that the examinee has submitted their exam.

In order for the LMS to facilitate starting SEB with the correct settings for the exam, it needs to be able to create config files (see Section 4.2), which are automatically started in SEB via the SEB URL scheme (see Section 4.3).

Finally, the exams created through the LMS should not contain any links that would allow an examinee to exit the exam environment. If an exam contained such links, the examinee could access material that would otherwise be prohibited during the exam.

4.5 Network traffic

The network traffic that SEB generates contains three particularly important HTTP request headers, which allow an LMS to authenticate that SEB is used. Example values of the three headers can be seen in Table 4.2.

HTTP header                     | Example value
User-Agent                      | Mozilla/5.0 (Windows NT 10.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/107.0.5304.68 SEB/3.4.1 (x64)
x-safeexambrowser-configkeyhash | dce0ce79d87e7c98b4fa23a66bea227ad15100f4fad7d5e9da2be2eb35157a67
x-safeexambrowser-requesthash   | e50720a375958db595a05525b0d3dab921b1859ca0788a0caf459b97811d45b4

Table 4.2: HTTP request headers present in requests generated by SEB.

For the User-Agent header, SEB appends the SEB/version text. This header is otherwise static: the only part that may change is the version number, which depends on what version of SEB is currently used.

The x-safeexambrowser-configkeyhash and x-safeexambrowser-requesthash headers are SHA256 hashes of the request URL appended with either the Browser Exam Key (for the requesthash) or the Config Key (for the configkeyhash). This implies that the values of these two hashes change depending on what URL has been requested.

The Browser Exam Key and Config Key are generated within SEB if the option Use Browser Exam Key and Config Key has been checked in the config file, see Table 4.1. These two keys are slightly different and can be used to verify slightly different aspects of SEB.

The code snippet for calculating the Browser Exam Key, also known as the BEK, is shown in Figure 4.4. The code can also be found in the KeyGenerator.cs class on GitHub (https://github.com/SafeExamBrowser/seb-win-refactoring/blob/b69280731a212cad699f911f98431c5d47104d7b/SafeExamBrowser.Configuration/Cryptography/KeyGenerator.cs). As seen in the code snippet, the BEK is calculated by taking a salted (keyed) SHA256 hash over the SHA1 fingerprint of the code signature of the executable, the file version, and finally the Config Key. This implies that the BEK will vary depending on what version of SEB is used. It will also differ depending on the platform that the examinee is using, meaning that the BEK on Windows will be different from the one on MacOS. Consequently, the x-safeexambrowser-requesthash header is able to verify two things: the correct version of SEB and the correct configuration of SEB.
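Since these header values are deterministic, an LMS can recompute and compare them on every request. The following is a minimal sketch of such a server-side check, assuming the LMS already knows the expected Browser Exam Key and Config Key for the exam as lowercase hexadecimal strings (the class and method names are ours, not part of any LMS API):

    using System;
    using System.Security.Cryptography;
    using System.Text;

    static class SebHeaderCheck
    {
        // Both hash headers are SHA256 over the absolute request URL
        // concatenated with the respective key.
        public static bool IsValid(string requestUrl, string headerValue, string expectedKey)
        {
            using var sha = SHA256.Create();
            var hash = sha.ComputeHash(Encoding.UTF8.GetBytes(requestUrl + expectedKey));
            return Convert.ToHexString(hash).ToLowerInvariant() == headerValue;
        }
    }

    // Usage against the two headers from Table 4.2:
    //   bool okRequestHash = SebHeaderCheck.IsValid(url, requestHashHeader, browserExamKey);
    //   bool okConfigKey   = SebHeaderCheck.IsValid(url, configKeyHashHeader, configKey);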
    using (var algorithm = new HMACSHA256(salt))
    {
        var hash = algorithm.ComputeHash(Encoding.UTF8.GetBytes(
            appConfig.CodeSignatureHash +
            appConfig.ProgramBuildVersion +
            configurationKey));
        var key = ToString(hash);

        browserExamKey = key;
    }

Figure 4.4: Code from SEB that computes the Browser Exam Key.

The Config Key, on the other hand, only depends on the currently loaded configuration of SEB. It is calculated by first converting the config file into SEB-JSON, a JSON-like representation of it [39]. All keys in the SEB-JSON representation need to appear in sorted order, some keys are left out for version independence, and no whitespace may occur in the representation. After the SEB-JSON has been generated, the resulting Config Key is simply its SHA256 hash. Thus, the Config Key is version- and platform-independent, meaning that all versions and platforms of SEB will generate the same key. LMS solutions can also calculate the same hash by performing the exact same procedure.
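As a toy illustration of that procedure for a flat set of key-value settings (real SEB configurations are nested, are ordered case-insensitively, and exclude certain keys such as originatorVersion; see the Config Key documentation [39] for the exact rules):

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Security.Cryptography;
    using System.Text;

    static class ConfigKeySketch
    {
        // Sort the keys, serialize without any whitespace, then hash the result.
        public static string Compute(IDictionary<string, string> settings)
        {
            var json = new StringBuilder("{");
            foreach (var pair in settings.OrderBy(p => p.Key, StringComparer.OrdinalIgnoreCase))
            {
                if (json.Length > 1) json.Append(',');
                json.Append('"').Append(pair.Key).Append("\":\"").Append(pair.Value).Append('"');
            }
            json.Append('}');

            using var sha = SHA256.Create();
            var hash = sha.ComputeHash(Encoding.UTF8.GetBytes(json.ToString()));
            return Convert.ToHexString(hash).ToLowerInvariant();
        }
    }

Because the serialization is fully deterministic, an LMS that produces the same canonical representation from its stored settings will arrive at the same key as every SEB client.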
4.6 Logging

According to the privacy statement [40] found in the SEB documentation, SEB does not collect any unnecessary personal information on an examinee's computer. Some personal data, such as the device type/name, OS version, computer username, and URLs of opened web pages, can be found in the log files that SEB saves on the system. By default, these log files are not transmitted to any other server; they are simply used for debugging purposes. If anything goes wrong during the startup of SEB, error messages can be found in these files, which can help a user debug the underlying issue.

An SEB client can be configured to connect to an instance of SEB Server [18], to which the previously mentioned log files will then be sent. However, this option is considered an add-on for SEB and needs to be specifically configured; SEB will not transmit the log files to any other server by default. The server needs to be set up by the institution in order for the examinees to be able to connect to it. The functionality of SEB Server is further described in Section 4.7.

4.7 Safe Exam Browser Server (SEB Server)

The Safe Exam Browser Server, SEB Server, is a web application with the purpose of simplifying and centralizing the configuration of SEB clients for exams. The application interacts with an LMS in order to set up and configure exams. According to the documentation [18], SEB Server “improves security” by allowing SEB clients to be monitored in real time during an exam. An architectural overview of SEB Server can be seen in Figure 4.5, where it is shown that the server communicates with the SEB client(s) and the LMS.

Figure 4.5: SEB Server Architecture overview.

As mentioned, SEB clients can be monitored via the SEB Server during exams. The information available about a client is the same information collected in the log files described in Section 4.6. SEB Server also pings the clients continuously and assigns each client a state. The state is one of Connection Requested, Active, Missing, Closed, or Canceled; the details of each state are described in Table 4.3.

State                | Description
Connection Requested | Appears whenever an SEB client has contacted the SEB Server, but has not yet finished the initial connection handshake or logged into the LMS.
Active               | Appears after a successful handshake and login into the LMS. Remains as long as the connection is open, i.e., not closed or terminated.
Missing              | Appears when an SEB client is active but has missed ping(s).
Closed               | Appears whenever an SEB client closes the connection after being active.
Canceled             | Appears whenever a connection has been canceled.

Table 4.3: Possible states assigned to each SEB client via SEB Server [18].

In order to configure an SEB client to connect to SEB Server, a specific connection configuration file has to be provided to the client. Similar to the configuration file described in Section 4.2, the connection configuration file can be encrypted with a password, which the examinee needs to enter upon startup of SEB.

5 Threat modeling

Threat modeling [41] is the process of identifying the underlying threats of a system in order to gain a valuable understanding of how to mitigate them. Once the threats have been identified, one can establish security requirements that need to be followed in order to mitigate the threats in question. Threat modeling is important to take into consideration while designing systems and applications. Not only does it help to find security issues early, it also helps to understand the system in depth, along with its security requirements [42].

This chapter aims to identify cheating-related threats in e-exams. An important detail to keep in mind is that this thesis focuses on threats that an attacker may use during an actively invigilated exam. This means that some types of threats are out of scope, such as those only relevant after the exam has finished (e.g., altering one's answers once the exam has finished). However, threats that are prepared before the exam takes place are still relevant, since they pose a threat throughout the exam (e.g., modifying an e-exam environment's code before the exam begins, in order to gain an advantage). In order to identify such threats, we utilize the Quantitative Threat Modeling Method, which is described in detail in Section 5.1. The identified threats are later used when proposing our design principles in Chapter 6, as well as when assessing the security of Safe Exam Browser in Chapter 7.

5.1 Quantitative threat modeling method

The Quantitative Threat Modeling Method, also known as “quantitative TMM” or “QTMM”, is a threat modeling method proposed by Potteiger et al. [43]. The method involves defining components within the system or procedure; thereafter, threats are discovered for the corresponding components. Two approaches are used to discover the threats: STRIDE and Component Attack Trees (CATs), both further described below. Finally, the threats are classified into different severity levels using the Common Vulnerability Scoring System [44], usually referred to as CVSS.
An alternative would be to use DREAD [45] for the severity scoring, which is a similar quantitative scoring system. However, as suggested by QTMM, we will use a scoring system similar to CVSS, although with slight modifications, since CVSS contains categories that are not applicable to our setting. This scoring system is further described in Section 5.2.

5.1.1 Components

In order to define the components of a system or procedure, one must divide the system or procedure into different parts. This involves identifying which discrete parts the system has and how they interact with each other. As systems and procedures vary in complexity and scale, their decomposition into components will also vary. Potteiger et al. give a good example of how a railway system can be divided into components [43].

5.1.2 STRIDE

STRIDE is an acronym that stands for Spoofing, Tampering, Repudiation, Information Disclosure, Denial of Service, and Elevation of Privilege. The method was first introduced by Kohnfelder and Garg at Microsoft [46], with the overall goal of helping developers identify potential attacks in order to design more secure systems. STRIDE is described in more detail in Table 5.1.

Threat | Category                | Property violated | Definition
S      | Spoofing                | Authentication    | Impersonating something or someone
T      | Tampering               | Integrity         | Modifying data
R      | Repudiation             | Non-repudiation   | Claiming that you did not do something
I      | Information Disclosure  | Confidentiality   | Providing information to a non-authorized party
D      | Denial of Service       | Availability      | Making service(s) unavailable
E      | Elevation of Privilege  | Authorization     | Allowing someone to perform actions not normally permitted

Table 5.1: The STRIDE threat modeling method [42].

5.1.3 Component attack trees

Component attack trees, CATs, are constructed for each of the six STRIDE threat categories. In other words, each component will have six different attack trees, each illustrating a different threat category. An example of such an attack tree is shown in Figure 5.1. There are a few different kinds of nodes in such a tree: the root node, intermediary nodes, leaf nodes, and sub-tree nodes. The root node describes the corresponding STRIDE threat category and the component's name. Intermediary nodes and leaf nodes are slightly different: leaf nodes contain the direct threat, whereas intermediary nodes describe the threat group. A threat group is essentially a node grouping together two or more threats, meaning that these threats are all attack vectors for the same threat group. The sub-tree node can be seen as a cross-reference to a different attack tree [47]. Successfully attacking the root node in the referenced attack tree allows an attacker to further advance their attack in the referring tree.

Figure 5.1: Example attack tree for a generic component: a root node labeled with the STRIDE threat category and component, an intermediary threat group node, two leaf threats, and a sub-tree node.

An attacker starts at a leaf node and thereafter iterates upwards through an intermediary node path all the way to the root. When the attacker reaches the root, they have successfully breached the system via the threat category defined in the root node. An important detail is that the tree branches portray logical OR relationships, meaning that fulfilling any one of the branches is enough to fulfill the root node goal. All attack trees presented in the following sections follow this pattern.
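This structure is straightforward to express as a data type. The following is a minimal sketch (ours, not part of QTMM itself) of a CAT node with the OR semantics described above; a sub-tree node would simply be a reference to the root node of another tree:

    using System.Collections.Generic;
    using System.Linq;

    class AttackNode
    {
        public string Description = "";
        public bool FeasibleLeaf;                // set on leaf threats the attacker can carry out
        public List<AttackNode> Children = new();

        // OR semantics: a node is reached if it is a feasible leaf,
        // or if any one of its children can be reached.
        public bool Reachable() =>
            Children.Count == 0 ? FeasibleLeaf : Children.Any(c => c.Reachable());
    }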
5.2 Threat severity scale

As mentioned at the beginning of the chapter, QTMM uses CVSS to classify the threats into different severity levels. However, some CVSS metrics are either out of scope or simply not applicable to electronic exams. Therefore, the threats in this chapter will be classified according to more suitable metrics.

The two CVSS metrics attack complexity and user interaction will be used. As defined in CVSS, the user interaction metric refers to a user other than the attacker needing to participate in the attack for it to be successful [44]. For electronic exams, the attacker is the user: the cheating examinee. Therefore, the user interaction metric as defined by CVSS needs to be re-defined slightly. User interaction, in this case, refers to how much the examinee (the attacker) needs to interact with their computer in an anomalous fashion during the exam in order to perform the cheating attack. Anomalous interaction is defined as interaction that deviates from regular exam writing, and it usually has a higher chance of being noticed by an invigilator. To make up for this re-definition, another metric called third-party interaction will be defined, which has the same meaning as user interaction does in CVSS. Thus, third-party interaction refers to the interaction of another person helping the examinee.

In addition to these three metrics, the threat's portability will be considered. The portability metric was earlier introduced by Hietanen [3] in his work on identifying potential cheating threats. This metric measures the ease of porting an exploit. Portability is considered high if it is possible to cheat with an obtained exploit without any manual work, such as making changes to the exploit.

A threat will be considered severe if all of the following hold (restated as a predicate in the sketch below):

• The attack complexity is low
• The user interaction is low
• The third-party interaction is low
• The portability is high
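In other words, severity here is a conjunction over the four metrics: if any single metric moves against the attacker, the threat is no longer classified as severe. A direct restatement of the rule:

    enum Level { Low, High }

    static class Severity
    {
        // A threat is severe only when all four metrics line up in the attacker's favour.
        public static bool IsSevere(Level attackComplexity, Level userInteraction,
                                    Level thirdPartyInteraction, Level portability) =>
            attackComplexity == Level.Low &&
            userInteraction == Level.Low &&
            thirdPartyInteraction == Level.Low &&
            portability == Level.High;
    }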
5.3 Attack tree completeness

The biggest difficulty when constructing attack trees is knowing when a tree is complete. The term complete here refers to whether the tree includes the largest possible number of attack vectors. After all, in order to protect a system from a specific attack, one must first realize that this type of attack is a threat in the first place. This is one of the more difficult aspects of threat modeling, since how secure the system ends up being largely depends on what threats are found throughout the modeling process. It would be naive to assume that one is able to include all such threats and therefore be able to mitigate them all when designing a system. Moreover, new types of threats may emerge in the future, which are even harder to take into account when threat modeling today.

Sonderen [48] presents a “manual for attack trees” which describes the process of constructing attack trees for a system. He outlines two phases: the discovery phase and the detailing phase. The goal of the discovery phase is to discover as many different attacks as possible, which should result in a tree that contains at least the root and one layer of attacks. This can be done using various approaches, one of them being brainstorming. After the discovery phase is done, the detailing phase begins, where the overall goal is to detail all the discovered attacks. According to Sonderen, when all the attacks are detailed enough, the result is a complete attack tree.

Our main concern here is that the resulting trees from these two phases may differ depending on which actors construct them. Since the first phase includes approaches such as brainstorming, it essentially amounts to “try to think of every single possible attack vector within the system”. This might be fairly easy for someone proficient in security, but it is most definitely harder for people with less experience. Thus, the initial question still remains: when is the tree complete, meaning that one has included the largest possible number of attack vectors?

A large effort has been put into making the trees presented in this chapter as complete as possible, by including all possible threats that we could find. As discussed earlier in the chapter, this has been done both via brainstorming and by reading previous work in the area. By choosing the QTMM threat modeling method and making use of STRIDE, we have managed to construct trees that are relevant to every threat category.

5.4 E-exam cheating threats

Using the threat modeling method QTMM described in Section 5.1, we have identified relevant threats for e-exams.

For an e-exam, there exist three main components: the LMS, the e-exam environment (also known as the EEE), and the examinee. In order to construct CATs for each component, we first had to identify the relevant STRIDE threat categories, which can be seen in Table 5.2. Some threat categories do not apply to specific components, and some are simply out of the scope of this thesis. Most of the threat categories are out of scope for the LMS component, since the focus of this thesis is not directly the LMS. Threats that directly affect the e-exam environment via the LMS are still of interest, since they may impact the security of the environment. For all three components, D (Denial of Service) is either out of scope or not applicable. Even though denial of service is a valid threat against the LMS, it is out of the scope of our work, since we are focusing on securing e-exam environments. There are some cheating-related threats that could be considered relevant to the denial of service category, such as spamming network traffic to a critical component of the EEE. However, we will not take these types of threats into account, since they often require mitigations that rely on external solutions, such as denial-of-service protection.

Category | E-exam environment (EEE) | LMS          | Examinee
S        | ✓                        | ✓            | ✓
T        | ✓                        | Out of scope | Not applicable
R        | ✓                        | Out of scope | Not applicable
I        | ✓                        | Out of scope | ✓
D        | Not applicable           | Out of scope | Not applicable
E        | ✓                        | Out of scope | ✓

Table 5.2: Relevant STRIDE threat categories for each component.

5.4.1 E-exam environment

As seen in Table 5.2, all threat categories except D are relevant for the e-exam environment (EEE) component. In order to fully understand the relevant CATs for these threat categories, we must first define what the threat categories mean in the EEE context. Table 5.3 describes the relevant threat categories of an EEE in more detail. The EEE property specifically defines what an EEE must successfully do in order to prevent the threats in the corresponding STRIDE threat category. For spoofing, the underlying EEE property states that the EEE authenticates correct device and software usage. In rather broad terms, correct device and software usage refers to using the correct, unmodified EEE software on a device that has not been specifically modified to give the examinee an advantage.
If an examinee manages to circumvent this, they have successfully spoofed their EEE. Similarly, for information disclosure, confidential information refers to information that would give the examinee an advantage if they knew about it. Lastly, in elevation of privilege, an authorized action is one that is in line with the aim of the EEE. For a lock-down EEE, the aim is to prevent the examinee from accessing anything outside of the EEE. Any action that results in the examinee being able to access anything outside of the lock-down EEE is therefore an unauthorized action, and successfully elevates the examinee's privileges.

Category | EEE property | Successful property violation
S | EEE authenticates correct device and software usage | An examinee is successfully able to use the device and/or software incorrectly
T | EEE checks EEE integrity | An examinee is successfully able to alter the code and/or functionality of the EEE
R | EEE makes sure that any cheating-related activity is logged | An examinee is successfully able to circumvent logging of certain activity
I | EEE makes sure that an examinee is not able to obtain confidential information related to the EEE | An examinee is successfully able to obtain confidential information about the EEE
E | EEE makes sure that every action performed by an examinee is authorized | An examinee is successfully able to perform an unauthorized action giving them an advantage

Table 5.3: The STRIDE threat categories for the EEE component, and their corresponding contextual properties.

EEE Spoofing

The CAT for spoofing an EEE is shown in Figure 5.2. The tree contains various threats, such as running the EEE inside a virtual machine (ES4), injecting data into the EEE via a proxy (ES1), and remotely using the EEE (ES3). As for remotely using the EEE via third-party remote control software such as TeamViewer (https://teamviewer.com/en/) or Zoom (https://zoom.us/) (ES3): an examinee would have easy access to material outside of the EEE while still being able to control the EEE via the application. This would require the examinee to run the EEE on a remote computer and then connect to the remote computer from another computer in the exam hall. Using the EEE outside of the exam hall (ES2) is also a possibility, which means that the examinee would not even be present in the hall to begin with. Another possibility of spoofing is to use another Wi-Fi network (ES5): even though this does not necessarily mean that an examinee is able to access prohibited materials directly (in the case of a lock-down EEE), it still falls under the spoofing category, since the examinee is not using their device correctly. The threat can also be combined with other threats: for instance, an examinee might have to use another Wi-Fi network in the first place in order to be able to access prohibited material at all, since the dedicated exam network might deny access to prohibited websites. Lastly, tampering with the EEE in any way means that one is using the EEE incorrectly, which in turn means that the examinee is using a spoofed EEE. Therefore, the CAT includes a reference to the tampering sub-tree, which can be seen in Figure 5.3.
Figure 5.2: CAT for the spoofing STRIDE threat category of an e-exam environment. The root node (Spoofing, E-exam environment) branches into:
• ES1: MITM proxy injecting data into EEE
• ES2: Using EEE outside of exam hall
• Using modified EEE (sub-tree: EEE Tampering, Figure 5.3)
• ES3: Using EEE via third-party remote control software
• ES4: Running EEE in VM
• ES5: Using another Wi-Fi network

EEE Tampering

The tampering CAT for an EEE, shown in Figure 5.3, includes several threats that an examinee could make use of in order to successfully tamper with the EEE. The tree contains three major categories that could be tampered with in order to alter the functionality of the EEE: the software, the operating system (OS), and the hardware. Some threats within these three categories are very specific to certain types of EEEs and may not be applicable to all of them. For example, ET9 (redirect allowed third-party processes) is only a valid threat against EEEs that specifically allow third-party processes to be run in the first place.

Figure 5.3: CAT for the tampering STRIDE threat category of an e-exam environment. The root node (Tampering, E-exam environment) branches into three threat groups:
• Tampering with software: ET1: Tampering with e-exam configuration; ET2: Removing integrity check; ET3: Removing VM check; ET4: Allowing third-party processes; ET5: Preventing logging; ET6: Allowing USB-key injection
• Tampering with OS: ET7: Blocking syscalls; ET8: Modifying firewall; ET9: Redirect allowed third-party processes; ET10: Schedule process to start at specific time; ET11: Modifying DNS cache to route requests to MITM proxy and modifying OS trusted root certificates store to trust forged certificates; ET12: Abbreviation expansion (OSX); ET13: OS-level scripting (OSX Automator)
• Tampering with hardware: ET14: Two computers in one; ET15: USB device injection; ET16: Chipset-related hardware attacks [3]

EEE Repudiation

In the case of an EEE, repudiation is about successfully being able to circumvent logging of a certain activity, as described in Table 5.3. The CAT related to EEE repudiation is shown in Figure 5.4. There are three ways for an examinee to circumvent logging: by disabling it completely, by preventing the logging from actually happening, or by faking the logging. Disabling or preventing logging is done by tampering with the EEE. The type of logging that an examinee may wish to circumvent for cheating purposes depends entirely on what the EEE actually logs. For an examinee to successfully cheat, it might be sufficient to remove only part of the logging, meaning that removing all logging may not be necessary. It may also be more suspicious if an examinee disables logging altogether, since that means the EEE would not be logging anything at all. Another approach is for the examinee to fake the logging (ER1), meaning that they remove the traces of their cheating or simply exchange the suspicious logs for more acceptable ones.

Figure 5.4: CAT for the repudiation STRIDE threat category of an e-exam environment. The root node (Repudiation, E-exam environment) leads to avoiding cheating-related logging, reached via:
• Disabling logging in software (sub-tree: EEE Tampering, Figure 5.3)
• Preventing network traffic containing logging (sub-tree: EEE Tampering, Figure 5.3)
• ER1: Faking logging
EEE Information Disclosure

For information disclosure, a cheating examinee's goal is to obtain confidential information about the EEE, which in turn would give the examinee an advantage. The related threats are shown in the CAT in Figure 5.5. For example, for a closed-source EEE, a valid threat is that an outside party manages to reveal parts of the source code via techniques such as reverse engineering or decompilation (EI2 and EI3). By doing this, one would gain knowledge about the internals of the software, which in turn could lead to an understanding of how to break it. Similarly, for EEEs that allow defining specific third-party processes that are permitted throughout an exam, a valid threat is that someone finds out about these processes before the exam has begun (EI1). This imposes a new threat, since an examinee could alter the functionality of the third-party software beforehand in order to cheat. Lastly, retrieving confidential information from the configuration file is also a valid threat, where applicable to the EEE. In the case of encrypted configuration files, one can do this by brute-forcing the encryption key (EI4) or by breaking a weak cryptographic implementation (EI5).

Figure 5.5: CAT for the information disclosure STRIDE threat category of an e-exam environment. The root node (Information Disclosure, E-exam environment) branches into:
• EI1: E-exam configuration leaks information
• Revealing source code of proprietary EEE software: EI2: Reverse engineering; EI3: Decompilation
• Retrieving information from e-exam configuration: EI4: Brute force encryption key; EI5: Breaking weak cryptographic implementation

EEE Elevation of Privilege

The last CAT for the EEE, related to elevation of privilege, can be seen in Figure 5.6. It includes three main categories: breaking out of the EEE, assistance/collaboration, and accessing forbidden materials. In order to accomplish any of these, an attacker needs to either spoof or tamper. The reasoning behind this is that privilege elevation is a very broad concept, and there are therefore numerous approaches one can take in order to successfully elevate one's privileges.

Figure 5.6: CAT for the elevation of privilege STRIDE threat category of an e-exam environment. Each of the three threat groups (Breaking out of EEE; Assistance or collaboration; Access to forbidden materials) is reached via the sub-trees EEE Spoofing (Figure 5.2) and EEE Tampering (Figure 5.3).

5.4.2 Learning Management System

For the LMS, only the spoofing threat category was found to be in scope and applicable. Successfully spoofing the LMS means that an examinee has been granted access to an exam using an incorrect EEE. The term incorrect EEE refers to an EEE that deviates from the one specified to be used during an e-exam (most likely through the LMS).

LMS Spoofing

The spoofing CAT for the LMS can be seen in Figure 5.7. There are two main ways an examinee could gain access to an exam using an incorrect EEE: either by using a completely invalid EEE, or without using the EEE at all (LS2 and LS3). An invalid EEE could either be an older version (LS1), which deviates from the version that the examinee should use, or it could simply be a modified EEE. Modifying refers to tampering; thus, the tampering sub-tree from Figure 5.3 is included in the CAT.
Figure 5.7: CAT for the spoofing STRIDE threat category of an LMS. The root node (Spoofing, LMS) branches into:
• Access to e-exam in LMS using an invalid EEE: LS1: Use older version of EEE (than acceptable) and use unpatched vulnerabilities; Using a modified EEE (sub-tree: EEE Tampering, Figure 5.3)
• Access to e-exam in LMS without using EEE at all: LS2: Don't use EEE; LS3: Intercept checking and claim to use when not

5.4.3 Examinee

For the examinee component, three relevant STRIDE threat categories were identified: Spoofing, Information Disclosure, and Elevation of Privilege. These three threat categories in the context of the examinee component are further described in Table 5.4. Authorized information, related to the information disclosure category, is defined as information that is purposefully disclosed by the examiner or a similar authority to all examinees prior to or during the e-exam. Any information that has not been purposefully disclosed by an authority to all examinees is considered unauthorized. Obtaining such information therefore results in information disclosure. Related to the elevation of privilege category, authorized actions refer to actions that are allowed during the examination session. Some of these actions likely differ significantly depending on where the examination takes place; universities might have different rules for toilet breaks or for asking questions. However, it is likely that rules regarding forbidden materials are at least somewhat similar, so a common unauthorized action likely belongs to the group of actions that results in t