Simulating Drone-Enhanced Emergency Medical Services
Facilitating Situational Awareness in Crises

Master's thesis in Interaction Design and Technologies

PHILIP EDBERG
EMIL JONSSON

Department of Computer Science and Engineering
CHALMERS UNIVERSITY OF TECHNOLOGY
UNIVERSITY OF GOTHENBURG
Gothenburg, Sweden 2025

Cover image source: everdrone.se

Master's Thesis 2025

Simulating Drone-Enhanced Emergency Medical Services
Facilitating Situational Awareness in Crises

PHILIP EDBERG
EMIL JONSSON

© PHILIP EDBERG, EMIL JONSSON 2025.

Supervisor: Josef Wideström, Department of Computer Science and Engineering
Advisor: Daniel Blecher, Everdrone AB
Examiner: Morten Fjeld, Department of Computer Science and Engineering

Department of Computer Science and Engineering
Chalmers University of Technology and University of Gothenburg
SE-412 96 Gothenburg
Telephone +46 31 772 1000

Typeset in LaTeX
Gothenburg, Sweden 2025

Abstract

This thesis explores how simulation can support the development and adoption of Drone-Enhanced Emergency Medical Services (DEMS), with a specific focus on improving early situational awareness (SA) among Helicopter Emergency Medical Services (HEMS) coordinators. In collaboration with Everdrone AB, a simulator was designed to replicate the LiveView system used in real emergencies, enabling users to engage in scenario-based training without real-world risks.
The simulator includes common emergency scenarios, such as a traffic accident, a building fire, and an automated external defibrillator (AED) delivery, and was developed using principles of interaction design, telepresence, and simulation-based learning. An evaluation through interviews with stakeholders in the field highlights the simulator's value in enhancing decision-making, SA, and preparedness. The results suggest that simulated DEMS can facilitate training, inform future system development, and contribute to safer and more effective emergency response strategies.

Keywords: DEMS, simulation, situational awareness, emergency response, interaction design, telepresence

Acknowledgements

We would like to thank our supervisor Josef Wideström for his guidance throughout the thesis. We also acknowledge Daniel Blecher at Everdrone AB for providing the opportunity and necessary resources for the project. Lastly, we thank Albin Tjäder, HEMS coordinator at Västra Götalandsregionen (VGR), for participating in the interview and contributing valuable insights.

Josef Wideström, Gothenburg, 2025-06-10
Daniel Blecher, Gothenburg, 2025-06-10
Albin Tjäder, Gothenburg, 2025-06-10

Contents

List of Figures
List of Tables

1 Introduction
1.1 Research Question
1.2 Design Challenge

2 Background
2.1 Overview of Everdrone
2.1.1 Types of Emergencies
2.1.2 Current Flight Procedures
2.2 Advantages of Simulating DEMS

3 Theory
3.1 Interaction Design Theory
3.1.1 Graphical Interfaces
3.1.2 User Research
3.1.3 User-Centered Design (UCD)
3.1.4 Prototyping
3.2 Designing for Simulated Environments
3.2.1 Virtual Simulations
3.2.2 Telepresence
3.3 Everdrone's Technologies
3.3.1 DEMS Mission Control
3.3.2 Mission Profiles
3.4 METHANE Framework
3.4.1 M - Major Incident
3.4.2 E - Exact Location
3.4.3 T - Type of Incident
3.4.4 H - Hazards
3.4.5 A - Access
3.4.6 N - Number of Casualties
3.4.7 E - Emergency Services
3.5 Theoretical Applications of DEMS
3.5.1 Traffic Incident
3.5.2 Building Fire
3.5.3 Forest Fire
3.5.4 Drowning Incident
3.5.5 Railway Accident
3.5.6 Chemical Accident
3.5.7 Flooding
3.5.8 Landslide and Avalanche
3.5.9 Oil Spill
3.6 3D Environment Design

4 Method
4.1 Requirements and Stakeholder Analysis
4.2 Scenario Design
4.3 Technical Exploration
4.4 Implementation
4.5 Evaluation

5 Process
5.1 Requirements and Stakeholder Analysis
5.2 Scenario Design
5.2.1 Reference Materials
5.2.2 Concept Sketches
5.3 Technical Exploration and Implementation
5.3.1 Simulation Engine
5.3.2 3D Assets
5.3.3 IR Camera
5.3.3.1 Implementation

6 Results
6.1 Simulator
6.1.1 Format
6.1.2 Overview
6.1.3 Traffic Accident
6.1.4 Building Fire
6.1.5 AED Delivery
6.1.6 LiveView Controls
6.1.7 Modeling New Scenarios
6.2 Evaluation
6.2.1 Interview with the HEMS Coordinator
6.2.1.1 The HEMS Coordinator Role
6.2.1.2 Evaluation of the Simulator
6.2.1.3 Evaluation of Prior Theory
6.2.1.4 Traffic Accident
6.2.1.5 Building Fire
6.2.1.6 AED Delivery
6.2.1.7 Realism
6.2.1.8 Strategic Planning
6.2.2 Interview with the Company Advisor
6.3 Outcomes
6.3.1 Facilitators of Early Situational Awareness
6.3.2 Design Challenge

7 Discussion
7.1 Findings and Future Development
7.2 Ethical and Privacy Considerations
7.3 Project Motivations and Research Outlook

8 Conclusion

Bibliography

List of Figures

2.1 Drone hangar in Mölndal, from which drones are automatically deployed upon identification of an emergency call requiring DEMS.
2.2 Test flight field utilized for daily drone testing to maintain high DEMS quality standards.
2.3 Proportional distribution of emergency types addressed by DEMS in the Mölndal operational area.
3.1 DEMS Mission Control.
3.2 Extended Spiral View mission profile.
3.3 360° Rotation mission profile.
3.4 Track Line mission profile.
3.5 Point of Interest mission profile.
3.6 Track Object mission profile.
4.1 Overview of the development process.
5.1 Nature reference material used for environmental modeling.
5.2 Comparison of human visibility in IR versus color camera imagery.
5.3 Screenshots from LiveView test flight recordings over Hillerstorp, Mölndal.
5.4 AED delivery reference materials.
5.5 Concept sketch of the traffic accident scenario.
5.6 Concept sketch of the building fire scenario.
5.7 Concept sketch of the AED delivery scenario.
5.8 Implementation of the IR camera in Byte Conveyor's gunship simulator.
5.9 IR camera usage in the "Tigers of Steel" YouTube video.
5.10 Screenshots from the Simterm application.
5.11 The IR implementation in the simulator.
6.1 Navigation flow from the main menu to the available emergency scenarios.
6.2 The main menu in the simulator, showing a list of available scenarios.
6.3 Traffic accident scenario on a country road involving multiple vehicles, including an overturned car emitting smoke, surrounded by trees to enhance simulation realism.
6.4 Close-up of the overturned car in the traffic accident scenario.
6.5 Building fire scenario in a small industrial area at night, with smoke emanating from rooftop vents; employees have evacuated and are observing from a distance. Infrared imaging is utilized to assess the scene due to limited nighttime visibility.
6.6 AED delivery scenario depicting the drone's approach to a residential area.
6.7 AED delivery scenario showing the drone at the drop point releasing the AED.
6.8 Scene template established for rapid modeling of new scenarios, including sample terrain, vegetation, lighting, a person, a car, and a drone prefab. The drone prefab encompasses components such as cameras, UI, flight patterns, and IR management with a replacement shader for an IR appearance.

List of Tables

2.1 Types of emergencies.
4.1 Proposed framework for designing an effective scenario that facilitates early SA.
5.1 Traffic accident scenario outlined using the proposed framework.
5.2 Building fire scenario, defined using the proposed framework.
5.3 AED delivery scenario structured using the proposed framework.

1 Introduction

Emergency services often face significant challenges in reaching critical locations quickly, leading to preventable loss of life. This issue is particularly acute in remote or heavily congested areas, where traditional emergency response methods are constrained by geographical or infrastructural barriers [1]. The reliance on conventional methods still leaves gaps in addressing emergencies in geographically or logistically challenging locations. These gaps underscore the need for technologies, such as drones, to complement existing emergency response strategies and improve overall outcomes.

The integration of drones into emergency services, known as Drone-Enhanced Emergency Medical Services (DEMS), has grown significantly in recent years.
This growth involves applications such as delivering AEDs for out-of-hospital cardiac arrests (OHCA), triaging patients during mass-casualty events, and transporting critical medical supplies. For certain tasks, DEMS offers greater benefits than traditional emergency medical services (EMS), particularly in reducing time-to-intervention, which is critical for improving survival rates in emergencies [2] [3]. This is crucial for OHCA cases, where delayed emergency response is a major factor in low survival rates, as each minute of delay decreases the chance of survival by 7–10% [4]. Additionally, DEMS enhances situational awareness (SA) through sensors and cameras, enabling emergency teams to locate victims and prioritize resources more effectively, as well as to improve overall response outcomes [2] [3].

Everdrone is a Swedish company that specializes in DEMS, delivering medical equipment and providing real-time observation using drones with a system called LiveView [5]. LiveView provides a video feed from a drone to observe the emergency situation before the first emergency unit arrives. This video feed is primarily monitored by Helicopter Emergency Medical Services (HEMS) coordinators, who are responsible for evaluating the scale of the emergency and the medical conditions involved in order to make informed decisions about deploying aerial resources.

The presented advantages of DEMS presuppose that HEMS coordinators can effectively utilize and incorporate these systems. To support this, the aim of this thesis is to develop a simulator in collaboration with Everdrone that simulates LiveView and replicates common emergency situations. This enables HEMS coordinators to gain practical experience and develop early SA without real-world risks, thereby fostering the effective adoption of DEMS in the development of emergency services.
This thesis is particularly relevant within the field of interaction design, as it focuses on understanding the human factors that influence how emergency personnel learn and utilize a system like LiveView. Effort will be devoted to understanding an already existing system and how to effectively simulate the features and contexts that bring educational value. To support this educational value, the thesis will conduct user studies to identify user needs and learning objectives. Such user research, and knowledge of how to design for a specific purpose, lean on existing research in the field of interaction design.

Given that decisions made by HEMS coordinators through LiveView are already conducted remotely, simulating LiveView can further amplify this sense of physical distance from the actual emergency. This relates to telepresence, a concept in interaction design concerning how users interact with remote environments as if they were physically present. In the context of this thesis, telepresence is crucial because it ensures that the user feels engaged and responsible for their actions even though it is a simulated experience.

1.1 Research Question

Given the need to address the educational values discussed above, this thesis focuses on identifying design considerations for developing a simulator that assists HEMS coordinators with early SA in DEMS. Thus, the research question guiding this thesis is the following:

What are key considerations in designing a simulator for a drone-enhanced emergency medical system to facilitate early situational awareness during crises?

1.2 Design Challenge

This project will not focus on integrating the simulator into Everdrone's existing system; however, its design and chosen technologies must allow for future integration. The simulator is intended to conceptualize a new domain that Everdrone may explore in the future.
To illustrate how it can facilitate early SA, the simulator must model scenarios that are both common and demand effective use of key LiveView features to achieve useful results.

2 Background

This chapter provides background for the thesis and is divided into two parts. Section 2.1 outlines Everdrone's work and their current progress, while Section 2.2 discusses the advantages of using drones in emergency situations and highlights the value of simulation-based learning for DEMS. Much of the material is drawn from unpublished Everdrone documents [6] [7], to which early access was granted as part of this project.

2.1 Overview of Everdrone

Since 2021, Everdrone has been working in collaboration with Västra Götalandsregionen (VGR), the Emergency Medical Dispatch Center, SOS Alarm, and Karolinska Institutet to deliver defibrillators using drones. This has been done for suspected OHCA cases across the region, and real-time observation using the LiveView system was recently introduced. VGR is a pioneer in the field of drone-based AED deliveries, which have been made possible through close cooperation between multiple stakeholders and a shared vision of leveraging technology to save lives.

Currently, Everdrone's drones operate in six areas within Västra Götaland: Fiskebäck, Torslanda, Kungälv, Trollhättan, Vänersborg, and, most recently, Mölndal. Together, these locations serve a population of approximately 250,000. Since the project's inception, nearly 300 AEDs have been delivered via drones, with each deployment thoroughly evaluated and improved to ensure maximum efficiency and safety.

As part of projects like PreVis 2, Everdrone and VGR also began testing drone-based imaging in 2022. This technology, later called LiveView, enables real-time video transmission from incident sites directly to emergency dispatch centers and response teams. The initiative aims to provide better decision-making support and improve safety during emergencies.
The project has now advanced to a new phase, testing the system in real-world environments with customized hardware and software. This stage focuses on further validating and refining the drone system for continuous use in prehospital care and emergency services.

Figure 2.1: Drone hangar in Mölndal, from which drones are automatically deployed upon identification of an emergency call requiring DEMS.

Figure 2.2: Test flight field utilized for daily drone testing to maintain high DEMS quality standards.

2.1.1 Types of Emergencies

One of the operational areas for Everdrone's DEMS is a subregion of Mölndal, covering approximately 100 km². The area is home to 62,500 residents and covers most of Mölndal city. Everdrone has classified various types of emergencies in this area, which are listed in Table 2.1.

Table 2.1: Types of emergencies.

Cardiac Arrest
Building Fire
Chimney Fire
Outdoor Fire – Vehicle
Outdoor Fire – Container
Outdoor Fire – Terrain
Traffic Accident – Multiple Vehicles
Traffic Accident – Single Vehicle
Traffic Accident – Small Motor Vehicle
Traffic Accident – Pedestrian
Railway – Collision/Derailment
Railway – Fire
Railway – Other
Railway – Pedestrian Hit
Aviation Incident
Tram/Other Rail Traffic
Hazardous Material Release – Gas
Hazardous Material Release – Other
Explosion
Drowning
Diving Accident
Boat Accident
Landslide
Ongoing Deadly Violence

Using approximately 90 days of incident data collected by Everdrone from the Mölndal operational area, an average of 1.5 alarms per day was recorded, with a maximum of 4 alarms in a single day (12 alarms during the busiest week). Figure 2.3 illustrates the distribution of these emergencies. This data shows that the three most common emergency types are traffic accidents involving vehicles, building fires, and traffic accidents involving pedestrians.
Figure 2.3: Proportional distribution of emergency types addressed by DEMS in the Mölndal operational area: Traffic Accident – Vehicle (35%), Building Fire (25%), Traffic Accident – Pedestrian (11%), Defibrillator (10%), Outdoor Fire – Vehicle (6%), Outdoor Fire – Terrain (5%), Outdoor Fire – Container (4%), Other (2%), Drowning (1%), Hazardous Material Spill (1%).

2.1.2 Current Flight Procedures

The current flight procedure begins when the SOS call center receives an alarm and determines whether the situation requires the deployment of the Everdrone DEMS. However, some emergencies, such as OHCA, automatically alert the Everdrone DEMS without active participation from an SOS call center employee. When alerted, the Everdrone DEMS receives the emergency coordinates and automatically initiates an autonomous preflight procedure involving a security protocol with various safety checks. The drone pilot is only required to request clearance from air traffic control, which usually takes 15 to 30 seconds, depending on external factors like airspace response times and other preflight procedures.

Once the preflight procedure has been completed, the drone is automatically deployed from its hangar and ascends to an altitude of 65 meters. During the flight to the target, the route is calculated with speed and safety in mind, meaning the path may not always be the shortest, as it prioritizes less crowded areas and low-traffic routes.

The drone is equipped with two camera systems. The first is monitored by the HEMS coordinator through a LiveView stream, accessible in a tablet app called DEMS Mission Control. The second, comprising a forward-facing and a downward-facing camera, is used by the drone pilot to control the drone through a separate application with additional flying controls. The reason for this separation is to limit advanced controls and simplify the graphical user interface (GUI) for the HEMS coordinator, as well as to give the drone pilot limited visibility of sensitive information.
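The alert-to-launch sequence described above is essentially a linear state machine, which a simulator could mirror when replaying missions. The following is an illustrative sketch only; the state names and transition table are invented for the example and are not taken from Everdrone's actual protocol:

```python
from enum import Enum, auto

class MissionState(Enum):
    IDLE = auto()
    ALERTED = auto()             # coordinates received from SOS or auto-dispatch
    PREFLIGHT_CHECKS = auto()    # autonomous safety protocol runs
    AWAITING_CLEARANCE = auto()  # pilot requests ATC clearance (~15-30 s)
    ENROUTE = auto()             # deployed from hangar, cruising at 65 m
    ON_SCENE = auto()            # executing a mission profile
    RETURNING = auto()

# Allowed transitions, in mission order.
TRANSITIONS = {
    MissionState.IDLE: MissionState.ALERTED,
    MissionState.ALERTED: MissionState.PREFLIGHT_CHECKS,
    MissionState.PREFLIGHT_CHECKS: MissionState.AWAITING_CLEARANCE,
    MissionState.AWAITING_CLEARANCE: MissionState.ENROUTE,
    MissionState.ENROUTE: MissionState.ON_SCENE,
    MissionState.ON_SCENE: MissionState.RETURNING,
    MissionState.RETURNING: MissionState.IDLE,
}

def advance(state: MissionState) -> MissionState:
    """Step the mission forward one phase."""
    return TRANSITIONS[state]

state = MissionState.IDLE
for _ in range(4):
    state = advance(state)
# After alert, checks, and clearance, the drone is en route.
assert state is MissionState.ENROUTE
```

Modeling the procedure this way makes each phase (and the single pilot intervention, the clearance request) explicit, which is useful when a training scenario needs to start or pause at a given phase.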
The HEMS coordinator's GUI displays the location of the emergency and the estimated time of arrival (ETA) along with camera controls. The system dynamically adjusts the video quality to network conditions, prioritizing image clarity over frame rate to maintain a clear picture even if motion appears less smooth. The LiveView camera is always fixed on the current target point.

Upon arrival at the incident scene, the drone has two predefined movement patterns based on the type of emergency, referred to as mission profiles. If the emergency is an OHCA and requires an AED to be delivered, the drone flies toward the drop point until it is stationed overhead. It then descends to 30 meters, from which an AED is lowered to the ground using a rope. Once the AED has been successfully lowered, the rope is released and a delivery photo is automatically taken, becoming accessible via the DEMS Mission Control app. This is done while the drone ascends back to 65 meters, after which it returns to the hangar. A challenge in AED deliveries is finding a suitable drop point. The initial drop point is usually the caller's position and, since these calls are often dialed from indoors, the drop point will be on top of a roof. Therefore, the drone pilot has to update the drop point to a suitable position that is easy for the caller to reach, avoiding traffic and pedestrians.

If the type of emergency does not require an AED delivery, the drone pauses 100 meters from the emergency location, locks the camera on the target, and performs a 360-degree orbit. The HEMS coordinator can pause the orbit, then change direction or update the center point of the orbit by selecting a new location in the video feed before resuming movement. After approximately four minutes of orbit, the drone returns to the hangar.
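The 360-degree orbit described above reduces to simple waypoint geometry: the drone circles the incident at a fixed radius and altitude while the camera yaw stays locked on the target. A minimal sketch in Python, using the distances mentioned in the text (100 m orbit radius, 65 m cruise altitude) but with parameter and function names invented for illustration:

```python
import math

def orbit_waypoints(target_x, target_y, radius_m=100.0,
                    altitude_m=65.0, steps=36, clockwise=True):
    """Generate (x, y, z, camera_yaw) waypoints for a circular orbit
    around a target point, with the camera always facing the target."""
    direction = -1.0 if clockwise else 1.0
    waypoints = []
    for i in range(steps):
        angle = direction * 2.0 * math.pi * i / steps
        x = target_x + radius_m * math.cos(angle)
        y = target_y + radius_m * math.sin(angle)
        # Camera yaw points from the drone back toward the target.
        yaw = math.atan2(target_y - y, target_x - x)
        waypoints.append((x, y, altitude_m, yaw))
    return waypoints

wps = orbit_waypoints(0.0, 0.0)
# Every waypoint lies (to floating-point precision) on the 100 m circle.
assert all(abs(math.hypot(x, y) - 100.0) < 1e-6 for x, y, _, _ in wps)
```

Pausing and reversing the orbit then correspond to stopping the waypoint index and flipping `clockwise`; retargeting means regenerating the list around a new `(target_x, target_y)`.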
A feedback form in the DEMS Mission Control app appears post-mission to collect user experiences on video stream quality and system performance, ensuring continuous improvement.

2.2 Advantages of Simulating DEMS

Simulating DEMS can help to advance its development from controlled proof-of-concept experiments towards demonstrating its potential and role in more specific scenarios. In the past decade, pilot projects and industry initiatives have demonstrated the potential of drones in emergency response. Studies have shown that drones can reduce time-to-intervention in situations such as OHCA and post-disaster triage, where triage refers to assessing individual medical conditions [8] [9]. However, most efforts remain proof-of-concepts in controlled environments, and simulating DEMS can help to further demonstrate and clarify the potential and benefits it brings. This is a motivator for Everdrone to pursue this thesis.

Simulations offer key advantages over traditional field-based practices by simplifying complex procedures in a safe and accessible manner. Research highlights improved learning outcomes from simulating scenarios such as advanced surgical procedures [10]. Studies further demonstrate that simulations help manage cognitive load more effectively compared to field-based training [11]. By providing a risk-free environment, these simulations enable training in high-risk areas, making education both safer and more accessible.

Simulation-based training strengthens SA, a critical factor in effective emergency response involving DEMS. Early SA allows emergency responders to assess emergencies quickly, make informed decisions, and determine resource needs. Studies in military wargame simulations show that simulations improve mission success by enhancing SA in areas such as spatial awareness, detection accuracy, target prioritization, decision-making speed, and team coordination [12].
These findings suggest that improving SA through simulations can lead to better outcomes in high-stakes scenarios, including DEMS-related emergencies. Training responders on simulated DEMS operations can also promote safety by helping them recognize and mitigate hazards like chemical spills or unstable environments. Improved SA supports better communication and collaboration, allowing responders to share incident data, coordinate actions, and achieve more effective results.

Finally, while the Sun et al. study does not examine standard operating procedures (SOPs) directly, simulation environments, such as those used for military wargames, demonstrate the potential for structured feedback loops that can inform the development of SOPs. Applying similar simulation-based methods to DEMS can support the design and refinement of SOPs tailored to specific emergency scenarios.

3 Theory

This chapter lays the theoretical foundation for the thesis by introducing essential concepts, frameworks, technologies, and relevant research areas. It explores key domains within interaction design and Everdrone's technological capabilities, including the DEMS Mission Control app, LiveView functionality, and various mission profiles. Additionally, it presents the METHANE framework, which provides a guided approach for assessing emergencies, as well as other critical considerations across different types of emergencies. Finally, the chapter discusses practices in 3D modeling relevant to the research.

3.1 Interaction Design Theory

This thesis project falls within the field of interaction design, a field that focuses on designing for exchanges of information between a user and a system. Interaction design encompasses several subfields relevant to this thesis, which are listed and explained in this section.
3.1.1 Graphical Interfaces

The field of research on designing graphical interfaces investigates how visual components and interactive elements can improve the usability and accessibility of systems. Graphical interfaces also emphasize aesthetics, with the aim of evoking an emotional response, such as feeling calm, productive, or explorative, through the experience of interacting with the system [13]. Furthermore, visual hierarchy and layout are crucial for guiding the user's attention. Well-designed graphical interfaces also ensure accessibility, making systems usable for a diverse audience, including those with disabilities.

3.1.2 User Research

An important field within interaction design is user research, which involves the collection and analysis of user data to either develop an early understanding of a problem and user needs or to refine an existing design [13]. There are various methods for collecting user data, including interviews, surveys, and observations. These methods can be categorized as either quantitative or qualitative. Quantitative methods measure user data through metrics and statistical analysis, while qualitative methods examine user data through deeper, more contextual approaches, such as in-depth discussions. By applying appropriate data collection methods and analysis techniques, researchers can therefore make data-driven and qualified assumptions about the problem and/or design.

3.1.3 User-Centered Design (UCD)

UCD is a subfield within interaction design focused on addressing the specific needs and contexts of users [14]. The UCD process begins by conducting general user research about the problem space and user needs. By analyzing this data, assumptions can be made about requirements and solutions for the problem and needs. The UCD process often involves the end users throughout the process to ensure that the design aligns with their expectations and needs.
3.1.4 Prototyping

Prototyping is another fundamental field within interaction design, where abstract concepts are progressively transformed into tangible forms. The prototype, often referred to as an artifact, serves as a tool to explore, discuss, test, and refine design ideas in isolated stages [13]. In the context of complex systems, individual components can be prototyped independently, allowing development to proceed even when other parts of the system remain undefined.

Prototyping is inherently iterative, enabling continuous refinement of the design through evaluation and feedback, particularly from end users to support UCD. The iterative prototyping process fosters early user participation in the design phase, which can enhance the relevance and usability of the final product. To support this approach, methods such as co-design have been developed, in which end users actively participate in the full design process while the designer facilitates and manages their contributions.

3.2 Designing for Simulated Environments

Simulated environments are often virtual, and designing such experiences is associated with evoking immersion in an often remote space. The definition of a virtual simulation, and related theories like telepresence, helps in understanding how to design for such simulated experiences. These are covered in this section.

3.2.1 Virtual Simulations

A virtual simulation involves experiencing a digital world that can extend beyond the constraints of reality, unlocking new possibilities and opportunities that are otherwise inaccessible in the physical world. Simulations are typically designed in a structured manner and, when created for educational purposes, fall under the concept of simulation-based learning. This concept is applied in many different disciplines and promotes the development of knowledge and skills within a safe and controlled environment [15].
3.2.2 Telepresence

Telepresence refers to the sensation, perception, or illusion that something or someone exists at a particular time and place, even though they are not physically present. It enables interaction across distances by simulating sensory experiences that bridge the gap between users and their remote environments. This illusion of presence is achieved by integrating multiple media, such as audio, video, and data, to replicate real-world interaction dynamics as closely as possible.

Historically, telepresence stems from basic telecommunications technologies such as the telephone, which extended human communication by transmitting voice across vast distances. Over time, advancements in audio, video, and multimedia systems have expanded the scope of telepresence to include immersive and interactive experiences. Modern telepresence services often involve high-quality audio, real-time video, and sometimes virtual or augmented environments to allow users to interact as if they were co-located [16].

Human factors play an important role in telepresence. Usability, emotional engagement, and service quality directly affect the effectiveness of telepresence systems. For instance, intuitive interfaces and high-quality media synchronization are essential to create seamless remote interactions [16].

Applications of telepresence include remote collaboration, education, healthcare, and entertainment. It facilitates participation in meetings, surgeries, and classrooms from afar, reducing accessibility barriers and enabling new opportunities for interaction in both professional and personal contexts [16].

3.3 Everdrone’s Technologies

This section provides an overview of the key technologies and capabilities developed by Everdrone. It details the functionalities of the DEMS Mission Control app and the various mission profiles used to optimize drone operations.
These technologies are integral to improving SA, enabling rapid response, and supporting decision-making in critical scenarios.

3.3.1 DEMS Mission Control

DEMS Mission Control is a tablet app used by HEMS coordinators to observe the LiveView feed from the drone. The application, shown in Figure 3.1, includes camera controls with three image modes (IR, color, and IR + color) along with three zoom options (1x, 5x, and 30x). However, current privacy regulations enforced by VGR prevent the use of the zoom functionality; thus, the zoom controls are not currently visible in the app. If the drone is performing a 360-degree orbit, additional pause and rotation buttons appear. Pressing pause stops the drone in place, allowing the user to update the target point by tapping anywhere in the video feed or on the map in the top-left corner. Additionally, the orbit direction can be changed when paused using the left or right rotation buttons.

Figure 3.1: Screenshot of DEMS Mission Control, displaying a combined IR and color image during an active mission.

3.3.2 Mission Profiles

Everdrone envisions its drones being operated using preconfigured flight patterns referred to as mission profiles. The current stage of development offers two mission profiles, the 360-degree orbit and the AED delivery, as previously described in Section 2.1.2. There are also additional mission profiles in Everdrone’s pipeline, which are illustrated below:

Figure 3.2: The mission profile Extended Spiral View enables the drone to follow an orbital flight pattern around a fixed point of interest. It captures images from all angles (facades A, B, C, D) at a distance of 50–100 meters and concludes with a centered aerial view (G) of the entire scene. This profile is designed to provide comprehensive visual information and can be repeated with different cameras for varied perspectives.
Figure 3.3: The mission profile 360° Rotation, unlike the 360-degree orbit, employs a stationary flight pattern where the drone rotates in place to capture a panoramic view of the surroundings. The camera points outward, capturing images from all directions to help locate the source of the emergency or assess the extent of the damage. This mission can be repeated with different zoom levels or altitudes for additional perspectives.

Figure 3.4: The mission profile Track Line utilizes a custom flight pattern to provide focused imagery of a specific area of interest. The drone follows a pre-defined route, capturing images from an adjustable angle along the way to search for victims or evidence of an accident. This mission can be repeated with different cameras to gather diverse perspectives.

Figure 3.5: The mission profile Point of Interest (POI) uses a relocation flight pattern to reposition the drone to a new spot on the map. This allows the drone’s location to be adjusted to explore a different area. The mission can be repeated to relocate the drone to multiple points of interest as needed.

Figure 3.6: The mission profile Track Object employs a dynamic flight pattern to maintain a continuous view of a moving target. The drone camera remains centered on the selected object, capturing images from a constant angle as it follows the object’s movement. This profile is designed to monitor the target’s behavior or location effectively.

3.4 METHANE Framework

The METHANE framework is a widely recognized method for assessing critical information during emergency incidents [17]. It is commonly used by emergency services and first responders to ensure clear and consistent reporting, which is vital for coordinating efforts, making informed decisions, and effectively managing resources. The framework emphasizes a structured approach to reporting, ensuring that essential details are conveyed efficiently and accurately.
This framework is used by first responders in many of the incidents that involve Everdrone’s DEMS. The acronym METHANE stands for the following components:

3.4.1 M - Major Incident

By identifying whether the situation is a major incident, i.e. an emergency that requires additional resources and specialized responses, responders and decision-makers can quickly assess the scale and urgency of the situation, prioritizing resources accordingly.

3.4.2 E - Exact Location

Providing the exact location of the incident is important for quick response times. This includes precise geographic coordinates, street addresses, or landmarks to direct responders to the scene, avoiding any ambiguity in navigation.

3.4.3 T - Type of Incident

This refers to the nature of the emergency or incident, such as a fire, chemical spill, natural disaster, or medical emergency. Understanding the type of incident allows responders to determine the appropriate tactics, resources, and specialized teams required for effective mitigation.

3.4.4 H - Hazards

This involves highlighting any hazards present at the scene, including environmental dangers, toxic materials, structural instability, or threats to responders. Identifying potential risks early on helps teams plan for safety and take necessary precautions.

3.4.5 A - Access

Reporting on the accessibility of the incident scene is important to ensure that emergency vehicles, personnel, and equipment can reach the site without unnecessary delay. This may include details about road conditions, blocked routes, or areas of congestion that could hinder response efforts.

3.4.6 N - Number of Casualties

Providing an accurate count of casualties, including the number of injured or deceased individuals, is critical for resource allocation and prioritizing treatment. It helps emergency responders assess the scale of medical intervention needed and dispatch appropriate teams.
3.4.7 E - Emergency Services

This final component refers to the types of emergency services currently involved or required at the scene, such as fire, medical, police, or specialized units. It helps ensure that the appropriate services are mobilized and that responders are coordinated effectively.

3.5 Theoretical Applications of DEMS

This section explores theoretical applications of drones in various emergency scenarios, highlighting their role in enhancing SA, improving response strategies, and supporting decision-making processes. All the following examples are drawn from an unpublished Everdrone document [7], to which early access was granted as part of this project.

3.5.1 Traffic Incident

During traffic accidents, a comprehensive bird’s eye view of the incident allows first responders to quickly assess the scene’s extent, pinpoint the locations of casualties, and identify potential risks such as leaking fuel. It also helps them determine the best access route to the incident, considering current traffic conditions to ensure that first responders can reach the scene as rapidly and safely as possible.

By controlling the camera, the user can zoom in for a more detailed view of specific areas. For instance, this feature can be used to determine the extent of vehicle damage, which informs teams of the necessary extraction equipment and techniques. These images also act as a vital record of the incident, providing crucial evidence for later investigation. Interpreting images from the drone’s thermal imaging camera could be key for locating obscured or hidden victims. Warm signatures indicate the presence of people or animals who might not be visible in the high-resolution imagery, especially in low-light or poor-visibility conditions.

3.5.2 Building Fire

The high-resolution camera provides a comprehensive view of the fire, enabling quick assessment of the scale and spread of the blaze.
Emergency call takers can use this real-time information to guide occupants safely out of the building, avoiding areas of heavy smoke or flames. The drone’s imagery also assists emergency response coordinators in assessing the severity and dynamics of the fire, identifying potential risks such as structural damage or areas where the fire may spread. This information is crucial for developing an effective response strategy, including resource allocation and firefighter deployment.

Firefighters on the ground benefit from the bird’s eye view provided by the drone, enabling them to identify the most effective points of entry, areas to avoid due to intense heat or structural instability, and potential victim locations. The thermal imaging camera is an essential tool for seeing through heavy smoke and locating hotspots that might indicate trapped victims or areas where the fire could reignite. Quick and accurate interpretation of these images is vital for planning rescues and firefighting tactics.

3.5.3 Forest Fire

The drone provides a comprehensive aerial view, which allows for assessing the extent and direction of the fire. By identifying the fire’s edge, the wind direction, and the status of roads and terrain, coordinators can effectively plan for both extinguishing the fire and evacuation efforts. This information allows for the accurate prediction of the fire’s path and spread, crucial for the optimal deployment of firefighting resources.

Firefighters on the ground utilize the drone’s real-time data to understand the fire’s behavior and the lay of the land, determining safe zones and potential hazards such as falling trees or shifting fire fronts. The zoom feature allows for a closer look at specific areas, offering a clearer view of fire intensity and aiding in tactical decision-making.
The drone’s thermal imaging camera uncovers hidden hotspots and potential threats of reignition, and aids in locating wildlife or individuals obscured by smoke or vegetation. Swift and accurate interpretation of these images can expedite rescue efforts and help ensure a comprehensive fire suppression strategy.

3.5.4 Drowning Incident

The high-resolution camera affords a wide-ranging view of the coastal area and open water, quickly providing emergency call takers with the scope of the situation. This information helps them guide first responders effectively, particularly in situations where casualties may be spread over a large body of water or along an extensive shoreline.

The thermal imaging camera of the drone is essential in these operations. It can detect the heat signatures of people in the water or along the coastline, even in low visibility or at night. Quick and accurate interpretation of these thermal images can significantly speed up rescue efforts, ensuring no victims are overlooked.

3.5.5 Railway Accident

The length and linear nature of rail accidents often mean casualties and damage may be spread over a considerable distance. The drone’s bird’s eye view allows emergency call takers to gather immediate information about the scope of the incident and provide crucial guidance to first responders on the ground.

Emergency response coordinators can use the drone’s imagery to formulate an effective strategy. Key information, such as the condition of the train, the extent and location of a potential derailment, and the status of the surrounding environment, can be gathered. Moreover, it can help in assessing the state of the tracks, aiding in the planning of recovery and repair efforts. Firefighters and first responders en route to the scene can assess damage to individual railcars, detect potential hazards, and plan their approach effectively.
High-resolution images can also help distinguish between involved parties and bystanders, aiding in the prioritization of rescue operations. The drone’s thermal imaging camera may play a critical role in identifying victims along the track who may be obscured by debris or difficult terrain. Quick and accurate interpretation of these warm signatures can expedite rescue efforts and ensure no victims are overlooked.

3.5.6 Chemical Accident

The drone’s high-resolution camera provides a comprehensive aerial view of the scene, enabling emergency call takers to assess the incident’s scope quickly. Information regarding the size of the chemical spill or leak, its spread, and potentially affected areas can be gathered promptly, aiding in the efficient coordination of emergency services.

Emergency response coordinators can use the drone’s imagery to formulate an effective containment and evacuation strategy. By identifying the boundaries of the contaminated area, they can efficiently direct HAZMAT teams, who are trained to handle hazardous materials or dangerous goods, and plan safe routes for evacuations.

For HAZMAT teams and first responders at the scene, the drone’s images are invaluable. By zooming in on specific areas, they can evaluate the chemical’s source, whether it is a ruptured tank, a damaged pipeline, or a derailed railcar, and plan their approach accordingly. If visible identifiers like placards or labels are present, the high-resolution view can also help identify the chemical substance involved.

The thermal imaging camera can play a crucial role, as some chemicals react thermally. For instance, a cooling or heating area can indicate a chemical reaction that might pose additional risks. Quick and accurate interpretation of these images can help in managing such hazards.
3.5.7 Flooding

With the drone’s high-resolution images, the scale of the flood can quickly be assessed, along with the spread of water and the areas most critically affected. This enables deployment of appropriate resources and provides accurate information to the responding teams.

Emergency response coordinators use the drone’s imagery to formulate the rescue and evacuation strategy. It allows them to estimate the water’s depth in different areas, identify blocked or safe pathways, and assess the stability of structures. These insights facilitate the efficient orchestration of rescue operations.

Search and Rescue (SAR) teams rely heavily on the drone’s imagery to identify stranded victims, plan the safest routes to reach them, and detect floating debris or other hazards in the water. The drone’s zoomable high-resolution camera becomes their eyes in the field, allowing for swift, focused, and safe operations. The drone’s thermal imaging camera is an invaluable tool in these scenarios, capable of detecting heat signatures of people in the water or obscured by terrain or structures, aiding in swift and precise rescues, especially during night operations or in low-visibility conditions.

3.5.8 Landslide and Avalanche

The drone’s high-resolution camera swiftly offers a bird’s eye view of the affected area. This broad perspective enables emergency call takers to comprehend the incident’s scope, including the extent of the landslide or avalanche and the areas most critically impacted. This information is invaluable for guiding first responders and launching immediate rescue operations.

Emergency response coordinators can use the drone’s imagery to devise an effective SAR strategy. By assessing the stability of the terrain, the extent of the displaced material, and any blocked routes, they can efficiently organize rescue teams and plan safe paths for their operations. For SAR teams at the scene, the drone’s zoomable images can prove instrumental.
They can discern signs of trapped victims, such as exposed clothing or equipment, or unusual land deformations. These high-resolution images can also assist in determining the safest approach for the rescue, avoiding areas at risk of secondary slides. The thermal imaging camera can detect heat signatures of people buried under snow or debris who might not be visible in the high-resolution imagery. Fast and accurate interpretation of these thermal images can drastically speed up rescue efforts and ensure no victims are overlooked.

3.5.9 Oil Spill

From an aerial perspective, the high-resolution camera enables early assessment of the scale of the oil spill, including the extent of the spread and the areas most critically affected. This visual intelligence can be used to guide the deployment of resources and personnel in a timely manner.

Emergency response coordinators, equipped with the drone’s imagery, can then formulate a containment and cleanup strategy. It allows them to define the boundaries of the spill, plan the optimal deployment of booms and skimmers, and determine safe paths for vessels involved in the cleanup. Post-spill, the drone’s footage provides a valuable record for investigation teams and environmental agencies. Insights into the cause of the spill, the effectiveness of the response, and the extent of the environmental impact can all be drawn from this visual evidence.

3.6 3D Environment Design

3D environments are virtual copies of either real or fictional environments, such as natural landscapes, urban settings, interior spaces, fantasy worlds, or historical recreations. Designing such environments lies within the field of 3D design and involves three main components: modeling, lighting, and rendering. The practice of 3D modeling is typically regarded as complex, and the design studio 3D Ace has compiled a comprehensive guide to better understand the field [18].
According to the guide, this technique offers numerous advantages across a range of industries, including gaming, filmmaking, architecture, automotive design, healthcare, and education. In the film industry, digital modeling provides a more cost-effective alternative to traditional set design. Within the automotive sector, it allows for the rapid development of high-resolution, testable prototypes during the early stages of design, which speeds up production and improves design precision. In healthcare, customized prosthetics and detailed surgical planning benefit from these digital tools, resulting in better outcomes for patients. Lastly, in educational settings, interactive and immersive 3D environments make abstract and complex concepts more accessible and easier to grasp.

Unity, one of the most widely used 3D game engines, highlights the importance of considering how natural elements interact with the terrain in its official tutorial on landscape creation [19]. Effective scene design involves the analysis of environmental factors such as sunlight direction, water flow patterns, and the natural placement of geological features like rocks. Incorporating high-quality reference materials is also recommended, as it supports the development of visually coherent and realistic environments in which terrain features and textures are seamlessly integrated.

In professional 3D modeling settings, there is a synergy between modeling artists and layout artists when designing 3D environments [18]. Modeling artists begin the modeling phase by gathering concept art to define the desired aesthetics of the environment and annotate the materials required for the scene. In the later layout phase, layout artists strategically arrange the modeled objects and structures to create a visually appealing and logically structured composition.
Layout artists often work simultaneously on a top-down view of the world, mapping out landmarks and objects while establishing their spatial relationships. This step is important for shaping the viewer’s experience and interaction with the final model. Key considerations include movement paths, sightlines, and the balanced distribution of visual weight across the scene.

According to 3D Ace’s guide, the following steps are useful before modeling a 3D environment [18]:

1. Define the Purpose: Identify the primary function of the environment, whether it is for a game, simulation, or visual storytelling. This decision serves as the foundation for all design choices.

2. Choose a Theme: Select a theme that aligns with the intended purpose. For example, a dystopian cityscape may suit a survival game, while a peaceful countryside could enhance a virtual relaxation experience.

3. Understand the Target Audience: Consider the users who will interact with the environment. Their expectations and preferences can influence aspects such as complexity, color choices, and level of detail.

4. Establish Visual Goals: Determine the desired visual and emotional impact of the environment. Should it evoke excitement, fear, tranquility, or curiosity? Factors such as lighting, proportions, and textures play a significant role in achieving this.

5. Define Functional Elements: Identify the necessary interactive or navigational components. In a game setting, this may include combat zones, exploration paths, or interactive objects. In educational contexts, it could involve structured spaces for learning engagement.

6. Develop a Mood Board: Compile reference images, color palettes, and texture samples to maintain consistency in visual style. This resource will guide the design process and help achieve coherence in the final scene.

4 Method

This chapter presents the methodological approach adopted for the design and development of the simulator.
At the core of this approach, design research will be performed. Design research is a methodology focusing on knowledge about designing products, services, and systems in various contexts [20]. For this thesis, design research will be used to identify key considerations for designing the simulator in a way that facilitates early SA.

Additionally, the Research through Design (RtD) methodology will be incorporated. In RtD, the artifact created during the design process becomes a central medium of inquiry [21]. Rather than separating design and research, RtD treats the act of designing as a form of research in itself, one that produces insights through reflection, iteration, and engagement with materials and contexts. While rooted in design research, RtD often aims to produce knowledge that is applicable beyond design, addressing broader societal, cultural, or technological questions. For this thesis, RtD enables the exploration of broader questions surrounding the usage, impact, and societal role of simulated DEMS that extend beyond the design of the simulator.

To operationalize these methodologies, the project integrates several phases: Requirements and Stakeholder Analysis, Scenario Design, Technical Exploration, Implementation, and Evaluation. This process is illustrated in Figure 4.1, and each phase is further detailed and discussed in the subsections that follow.

Figure 4.1: Overview of the development process (Requirements & Stakeholder Analysis, Scenario Design, Technical Exploration, Implementation, Evaluation).

4.1 Requirements and Stakeholder Analysis

In this phase, the focus is on laying the groundwork for the simulator by identifying its core requirements and intended use. To identify the specific features and scenarios that the simulator should support, it is necessary to gather insights into the primary end user’s needs, required skills, responsibilities, and expectations.
Since the HEMS coordinators’ role is to use DEMS Mission Control and utilize LiveView, they are the primary end users of this project. Additionally, future ideas for LiveView will be gathered and analyzed to ensure that the simulator aligns not only with current needs, but also with anticipated directions for growth and innovation.

To ensure the simulator’s relevance and effectiveness, it is necessary to determine to what extent it should replicate the current state of LiveView. The GUI should be usable and recognizable for users familiar with LiveView. However, the significance of graphical similarity requires further investigation. For example, a question worth asking is whether less realistic graphics can still be useful for gaining early SA. Furthermore, it is important to determine the importance of representing critical operational conditions, such as limited sight in severe weather or poor connectivity. Discussions with the company advisor at Everdrone, academic research, and internal documents from Everdrone will be used to gather this knowledge.

4.2 Scenario Design

The most frequent emergency types, illustrated in Figure 2.3, will guide the selection of scenarios for the simulator. By focusing on a subset of these emergencies, the project scope can be further managed while still providing a credible representation of real-world situations. To help structure and ensure quality in the scenario design process, relevant approaches and insights are found in the METHANE framework, the theoretical applications of DEMS, and the praxis of 3D environment design. These are combined in a proposed framework that outlines core components to define each scenario in the design process, shown in Table 4.1.

Table 4.1: Proposed framework for designing an effective scenario that facilitates early SA.

Design Component: Description
Main Purpose: Identify the main purpose of the scenario.
Target Audience: The users that will simulate the scenario.
Reference Materials: Reference materials for the target scenario visuals.
Incident Scale: The emergency urgency and ways of determining it.
Location Identification Methods: Ways to pinpoint the exact emergency locations.
Incident Type Recognition: Ways to determine the type of incident.
Hazard & Objects of Interest Detection: Hazards, objects of interest, and ways of identifying them.
Casualty Assessment / Triage Cues: Casualties, medical conditions, and ways of identifying them.
Access & Entry Point Evaluation: Ways of identifying emergency access.
EMS Presence & Needs Assessment: Determine the current state and needs of EMS.

The reason for considering the METHANE framework, mentioned in Section 3.4, is to ensure alignment with established emergency response protocols. By structuring simulation scenarios with this framework in mind, users can train in a way that reinforces standardized SA and communication practices. Thus, the scenarios are designed to train users to identify critical information, such as whether it is a major incident or not, the exact location, the type of incident, potential hazards, access routes, the number of casualties, and information on EMS already involved or required at the scene.

In terms of the theoretical applications of DEMS described in Section 3.5, there are insights about varying objects of interest, strategies, and efficient utilization of DEMS features depending on the type of emergency. For the scenario design, this knowledge is useful to understand what to include in the chosen scenarios. Therefore, before designing each scenario, it is important to identify and list objects of interest, strategies, and effective DEMS features, as this helps ensure that the scenarios support early SA.

Lastly, the praxis of 3D environment design, as covered in Section 3.6, proposes key steps before modeling 3D environments.
Taking inspiration from this theory, the steps relevant for the proposed framework are to define the scenario purpose, target audience, visual goals, and functional elements, and to create mood boards. Defining a scenario theme is irrelevant since none of the scenarios are fictional, as opposed to game design and cinematography, where it is necessary to define an aesthetic theme such as a dystopian or futuristic setting. Moreover, for this simulator, the target audience is consistent across all scenarios and has already been identified as HEMS coordinators, but it remains a necessary factor since it drastically impacts which elements bring early SA.

Once each scenario has been defined through the proposed framework, a concept sketch will be created for each scenario that illustrates the objects of interest and layout. These sketches will be discussed with the Everdrone company advisor, ensuring that the scenarios represent the type of emergency with the authenticity required.

4.3 Technical Exploration

This stage focuses on researching and evaluating technical tools for building the simulator. The evaluation will be driven by the previously identified simulator requirements, selected scenarios, and scalability considerations. A thorough exploration of available tools prior to implementation is important to ensure both the quality and feasibility of the final artifact. The tools under consideration include:

• Simulation Engine: A game engine (e.g., Unity or Unreal Engine) or a specialized simulation framework that, if necessary, has the capabilities of rendering realistic graphics, dynamic weather, AI-driven actors, and animations.

• 3D Modeling Assets: Pre-made or custom-designed assets used to construct the simulator’s environment, based on the previously developed scenario designs.

As technical exploration progresses, new insights are expected to emerge regarding the capabilities and limitations of different simulation tools.
These findings can directly influence scenario design by helping refine which scenarios are most feasible and how they can be developed in a plausible manner. Certain technical constraints may require reducing scenario complexity, environmental fidelity, or interactive elements, while new technical capabilities may allow improvements that were not initially considered. The process will therefore follow an iterative approach, where technical exploration also informs and refines scenario design.

4.4 Implementation

This stage involves integrating all components to deliver a functional demo of the simulator. Designed for tablet use to reflect the current LiveView format, the demo will serve as a prototype, illustrating how the simulator could operate within Everdrone’s DEMS. It aims to embody both the practical insights and theoretical knowledge gained throughout the design and development process.

4.5 Evaluation

After implementing the simulator, a semi-structured interview will be conducted with a HEMS coordinator to assess its effectiveness in enhancing early SA. The interview aims to contribute to the design research by identifying key facilitators of improved early SA, providing knowledge for answering the research question mentioned in Section 1.1. The interview will start with an introduction to the thesis, followed by a set of questions regarding the HEMS coordinator’s role and obligations. Thereafter, the simulator will be demonstrated, followed by questions related to the design of each scenario. Lastly, questions regarding realism and strategic planning will be asked.

Thereafter, the final artifact and chosen technologies will be evaluated with the company advisor at Everdrone through a semi-structured interview.
One goal of this evaluation is to contribute design knowledge toward the research question, but the primary goal is to reflect on how the artifact and technologies address the design challenge stated in Section 1.2, i.e., how the chosen technologies support future integrations, the creation of new scenarios, and experimental features for Research and Development (R&D) purposes. Additionally, through RtD with the artifact as inquiry, this evaluation aims to gain new perspectives on the future possibilities, adaptations, and development of DEMS in society. 5 Process This chapter covers the development process of the simulator. The following sections describe the execution of the requirements and stakeholder analysis, scenario design, and technical exploration from the method covered in Chapter 4. 5.1 Requirements and Stakeholder Analysis Besides the primary users, HEMS coordinators, there are other users and stakeholders that might take an interest in the simulator according to the company advisor at Everdrone. The secondary users identified include individuals interested in Everdrone’s DEMS platform who wish to explore the potential benefits of LiveView in real emergency scenarios. These may include investors, municipal administrators, and similar stakeholders. Additionally, there is internal interest in the simulator for research and development purposes within Everdrone. As Everdrone’s DEMS platform continues to expand into new domains, further stakeholders and user groups may emerge in the future. The primary users must be proficient in utilizing the right tools at the right time and confident in understanding the available features before an emergency arises. Furthermore, according to Blecher, one key skill required for their role is image assessment, i.e., the ability to analyze and detect critical objects and medical needs [22].
This skill, along with knowing how to utilize the tools efficiently, needs to be practiced to strengthen SA and is expected to be supported by the simulator. According to Blecher, the visuals do not need to be identical to LiveView, only recognizable. Everdrone’s interest is to understand what a simulated LiveView experience can look like, in order to envision its role in general [22]. For this purpose, the importance of visual similarity is low. However, based on the theory about telepresence covered in Section 3.2.2, the visuals play an important role in the level of immersion and emotional engagement. Thus, if the visuals can reflect the severity and importance of a real emergency, they could be a significant factor in facilitating early SA. Therefore, the assumption is made that the visuals do not need to be completely identical but should be recognizable and sufficiently convincing to enhance early SA. Blecher outlines potential future applications of the simulator in R&D, as well as in testing processes. The simulator is envisioned as a platform for experimenting with new features, mission profiles, and scenarios. Furthermore, it could be integrated into the testing pipeline to support automated testing, thereby enhancing quality assurance and reducing deployment time. There are also future visions of creating course modules to certify new HEMS coordinators and of integrating the simulator into new systems for various purposes, e.g., integrating the simulator and courses into existing educational platforms. Thus, for the simulator to be scalable, it needs to support these visions [22]. 5.2 Scenario Design The number of scenarios was determined based on the scope of the project, with the conclusion that designing and implementing three scenarios would be optimal. These scenarios were selected to reflect the three most common incident types in the Mölndal region, as shown in Figure 2.3.
Instead of including traffic accidents involving pedestrians (11%), AED deliveries (10%) were chosen, as pedestrian-related traffic accidents are structurally similar to the most common emergency type, vehicle-related traffic accidents (35%). This decision allowed for greater scenario diversity while still representing the most frequently occurring emergency types. Moreover, the inclusion of the AED delivery scenario better highlights the range of Everdrone’s technological capabilities, as it involves a different drone flight pattern compared to the other emergency types, where the drone behavior is similar and only the modeled environment differs. Each scenario was defined through the proposed framework, described in Table 4.1, and the answers to each component for each scenario are shown in the tables below. Table 5.1: Traffic accident scenario outlined using the proposed framework. Design Component Answer Main Purpose To practice and demonstrate how to assess traffic accidents with LiveView. Target Audience HEMS coordinators. Reference Materials See 5.2.1. Incident Scale & Classification Minor-to-medium incident depending on the medical conditions; can be based on a single low-traffic country road with few vehicles involved. Location Identification Methods The landscape is monotone, which makes pinpointing the location difficult. However, a street address is present in the GUI and there is smoke from a car that can guide the user. Incident Type Recognition A single-to-multi-vehicle accident can be classified, since only one of the three vehicles has been majorly affected but it is unknown whether the other vehicles have been involved. Hazard & Objects of Interest Detection Smoke and potential fire from a flipped car, trapped victims, and, since a bus and family cars are present, victims may include children. Additional cars may arrive and become involved in the accident. There is a risk of fire spread, since the accident takes place in a dense and dry forest.
Additionally, it is important to assess whether there are wild animals at the scene. Casualty Assessment / Triage Cues Casualties and trapped victims can be found efficiently using the IR and zoom controls. Access & Entry Point Evaluation Road-based EMS can enter from two directions. EMS Presence & Needs Assessment No EMS involved, only bystanders. Requires emergency medical and fire assistance. Table 5.2: Building fire scenario, defined using the proposed framework. Design Component Answer Main Purpose To practice and demonstrate how to assess building fires with LiveView during night, when visibility is poor. Target Audience HEMS coordinators. Reference Materials See 5.2.1. Incident Scale & Classification Medium incident, which can be based on a storage facility with smoke from two vents. Location Identification Methods A street address is present in the LiveView GUI and the facility would be known. Incident Type Recognition Potential fire, based on smoke from two vents at the storage facility. Hazard & Objects of Interest Detection There is a risk of flammable and explosive objects stored in the facilities. The potential fire can also spread. Casualty Assessment / Triage Cues Casualties and potential victims can be found efficiently using the IR and zoom controls. Access & Entry Point Evaluation Road-based EMS can enter from the connected road. A break-point can be chosen outside the facilities. Access points to the building for fighting the potential fire can be assessed from the drone. EMS Presence & Needs Assessment No EMS involved, only bystanders. Requires emergency medical and fire assistance. Table 5.3: AED delivery scenario structured using the proposed framework. Design Component Answer Main Purpose To practice and demonstrate how LiveView is used during an AED delivery. Target Audience HEMS coordinators. Reference Materials See 5.2.1. Incident Scale & Classification A general OHCA.
Location Identification Methods The street address is reported and present in the LiveView GUI; further location identification is not needed. Incident Type Recognition This would already have been communicated as an OHCA by the SOS call centre. Hazard & Objects of Interest Detection Even though it is not the HEMS coordinator’s responsibility to choose the drop point, it is important to understand that the drop point might be blocked by traffic or otherwise inaccessible. Casualty Assessment / Triage Cues Assessing triage cues is not a part of this scenario. Access & Entry Point Evaluation Lets the HEMS coordinator reflect on the accessibility of the AED for the bystander collecting it. EMS Presence & Needs Assessment No EMS involved, only bystanders. For this kind of emergency, EMS will always arrive based on protocol. 5.2.1 Reference Materials Reference material was collected from LiveView test flight recordings and from images representing Swedish nature. This material was used to guide the 3D environment design, helping to create an authentic representation of the environments in which LiveView is currently utilized. The material is shown in Figures 5.1–5.4. Figure 5.1: Nature reference material used for environmental modeling. Figure 5.2: Comparison of human visibility in IR versus color camera imagery. Figure 5.3: Screenshots from LiveView test flight recordings over Hillerstorp, Mölndal. Figure 5.4: AED delivery reference materials. 5.2.2 Concept Sketches Using the scenarios defined through the framework and the reference materials, three concept sketches were created to showcase the traffic accident, building fire, and AED delivery scenarios, shown in Figures 5.5–5.7. These sketches were discussed with the Everdrone company advisor.
The AED delivery scenario was originally intended to be located in a public swimming area, because it would require the user to find a suitable rally point, such as the nearby parking lot. However, after the discussion with the advisor, the scenario was relocated to a residential area instead, since that is a more common environment for Everdrone’s AED deliveries. Aside from that change, the company advisor concluded that the scenarios adequately reflect the typical emergencies to which Everdrone’s DEMS are dispatched. Figure 5.5: Concept sketch of the traffic accident scenario. Figure 5.6: Concept sketch of the building fire scenario. Figure 5.7: Concept sketch of the AED delivery scenario. 5.3 Technical Exploration and Implementation This section covers the insights, considerations, and tradeoffs from the technical exploration of tools. The section is divided into three parts that needed exploration and evaluation: the selection of the simulation engine, the 3D assets, and the IR camera implementation. The latter is relevant since previous IR implementations were hard to find, resulting in the need for a novel implementation. 5.3.1 Simulation Engine Various simulation engines were explored, including established drone and aerial simulators such as Microsoft AirSim and DJI Flight Simulator. However, many of these platforms primarily focus on drone control and physical behavior rather than supporting the simulation of autonomous flight. In contrast, game engines offer greater scalability and align well with existing expertise and prior project experience. Given that the aim of this project is to simulate custom scenarios featuring custom graphical user interfaces rather than detailed drone flight physics, game engines were deemed more suitable than drone-specific simulators. When selecting a game engine, two major free options were considered: Unity and Unreal Engine.
While Unreal Engine is known for its high-fidelity graphics capabilities, both engines offer comparable general functionality [18]. Even though Unreal Engine showed promising graphics and terrain-building capabilities, it was barely usable in the comparison, as it did not seem to be optimized for macOS systems with Intel processors. Unity was therefore chosen as the game engine. 5.3.2 3D Assets 3D assets are predesigned 3D models, ready to be imported into a game engine and used for building the environment. Many of them are free, and by using them, development time can be significantly reduced, as no time has to be spent designing assets from scratch. However, even though good results can be achieved quickly, relying on pre-made assets can limit the freedom of the initial scenario design: objects that a scenario requires may not be available for free, which leads to compromises in the design. Many of the real applications of Everdrone’s DEMS are deployed in urban settings. Because urban environments, unlike nature environments, require a vast number of assets, a compromise was made to depict all scenarios in a nature setting. The assets were chosen to align with the reference material, replicating Swedish nature. Assets were explored in the official Unity Asset Store, from which many free assets can be installed directly into a Unity project. 5.3.3 IR Camera Few IR camera implementations have been developed in Unity, with most focusing on thermal vision or X-ray effects, commonly used in games to reveal characters behind walls. A notable example, shown in Figure 5.8, is Byte Conveyor’s 2015 AC-130 gunship simulator [23].
This approach relied on pre-painted grayscale textures for characters, terrain heat maps, and vertex-painted vehicles, combined with shaders that dynamically inverted colors to simulate real-world IR camera modes (white-hot and black-hot). In contrast, the proposed technique offers a more realistic and adaptable simulation by supporting dynamic heat intensities and shader-based color inversion. (a) Screenshot of the gameplay. (b) Ground texture. (c) Soldier sprite map. (d) Vertex-painted tank. (e) Palm meshes were custom built. Figure 5.8: Implementation of the IR camera in Byte Conveyor’s gunship simulator. Another well-executed IR camera implementation in Unity can be found in Tigers of Steel. This implementation featured a toggle for enabling and disabling IR vision. However, the available footage primarily showcased gameplay without revealing technical details about the camera’s functionality [24]. Screenshots from the gameplay are shown in Figure 5.9. Figure 5.9: IR camera usage in the "Tigers of Steel" YouTube video. A more advanced approach to IR simulation is demonstrated in Inframet’s Simterm 2.1 [25], as shown in Figure 5.10. Unlike the shader-based methods commonly used in Unity applications, Simterm provides a high-fidelity thermal camera simulation by incorporating real-world IR physics. It allows users to adjust camera settings and environmental conditions, enabling a more accurate representation of infrared imaging. Simterm generates static thermal images based on pre-defined observation conditions and camera characteristics. Users can adjust: • Environmental factors: Visibility (fog, rain, snow), wind strength, background temperature, and object-camera distance. • Camera settings: Brightness, contrast, field of view, polarization, electronic zoom, and noise levels. • Object properties: Heat signatures, movement, and material properties.
Additionally, Simterm supports custom object and background creation, enabling users to simulate a variety of thermal imaging scenarios with high accuracy. This feature makes it a valuable tool for training, research, and thermal camera development, offering a realistic way to study IR imaging without requiring physical hardware. While Unity-based implementations focus on real-time visualization for games and simulation, Simterm provides a more analytical approach, useful for testing and evaluating infrared imaging technologies. Figure 5.10: Screenshots from the Simterm application. 5.3.3.1 Implementation The IR camera implementation in Unity, shown in Figure 5.11, was designed around an IRManager object that controlled the IR state for the entire scene. When the IR camera mode was toggled via the UI, the IRManager activated a replacement shader, swapping objects’ original shaders with a new one that rendered fragments in grayscale based on the object’s temperature value, IRValue. An IRValue could be assigned to an object using a MaterialPropertyBlock in combination with an associated script, IRObject. However, for objects with negligible heat, such as trees or rocks, an IRValue was not required. Instead, the IRManager assigned these objects a default color based on a global IRValue variable. This approach improved scalability and modularity, allowing new scenes to be modeled more efficiently without manually assigning an IRValue to every object. One potential improvement to this implementation is replacing the static IRValue assigned to objects with a black-and-white texture that represents heat distribution across their surfaces. This approach would create a more realistic simulation, particularly for characters and vehicles, by capturing varying heat intensities rather than applying a uniform temperature value. Another improvement could be to introduce a small variation for objects that do not have an assigned IRValue.
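The grayscale mapping described above can be modeled outside Unity as a small sketch. In the simulator, this logic lives in the replacement shader; the Python below is only a hypothetical illustration of the same idea, and all names here (ir_to_grayscale, GLOBAL_IR_VALUE, Mode) are illustrative rather than taken from the actual code base:

```python
# Hypothetical Python model of the IR grayscale mapping; the real
# implementation is a Unity replacement shader driven by the IRManager.
from enum import Enum
from typing import Optional

# Default "cold" temperature used for objects with no assigned IRValue,
# mirroring the global IRValue variable held by the IRManager.
GLOBAL_IR_VALUE = 0.15

class Mode(Enum):
    WHITE_HOT = "white_hot"  # hot objects render bright
    BLACK_HOT = "black_hot"  # hot objects render dark (inverted)

def ir_to_grayscale(ir_value: Optional[float],
                    mode: Mode = Mode.WHITE_HOT) -> float:
    """Map an object's temperature (0.0 = cold, 1.0 = hot) to a grayscale
    intensity in [0, 1]. Objects without an IRValue fall back to the
    scene-wide default."""
    value = GLOBAL_IR_VALUE if ir_value is None else ir_value
    value = min(max(value, 0.0), 1.0)  # clamp, as a shader would saturate
    return value if mode is Mode.WHITE_HOT else 1.0 - value

# A warm human body stands out against cold terrain in white-hot mode:
person = ir_to_grayscale(0.9)   # bright
tree = ir_to_grayscale(None)    # dim, falls back to the global default
```

The white-hot/black-hot inversion corresponds to the dynamic color inversion mentioned for the gunship simulator, while the optional per-object value reflects the MaterialPropertyBlock-based IRValue assignment.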
Instead of applying a uniform global IR value, a slight random offset could be added, creating a more realistic IR representation. This would better simulate natural temperature variations across objects in a scene, such as terrain elements, where not all surfaces share exactly the same temperature. Figure 5.11: The IR implementation in the simulator. 6 Results This chapter presents the results of the thesis and is structured in three parts. It begins with an overview of the final simulator prototype and the developed scenarios. This is followed by the evaluations and insights gathered through interviews with the HEMS coordinator and the company advisor at Everdrone. The chapter concludes with a summary of the key outcomes, addressing the research question and assessing how effectively the simulator responds to the design challenge. 6.1 Simulator This section presents the finalized version of the simulator. It begins by outlining the supported build formats, followed by an overview of the GUI. Subsequently, the three developed scenarios are presented, and guidance is provided for modeling additional scenarios to support future development. 6.1.1 Format The simulator is highly versatile and can be deployed across multiple platforms, offering potential for future integrations, though such integrations fall outside the scope of this project. Developed in the Unity game engine, the simulator can be built for desktop environments (Windows, Mac, and Linux), dedicated servers, mobile platforms (iOS and Android), WebGL, tvOS, and visionOS. The primary objective of this project was to demonstrate to Everdrone how a DEMS simulator could be developed, showcasing its potential and capabilities, rather than integrating it directly into Everdrone’s DEMS Mission Control app. However, there are possibilities for future development, such as deploying the simulator as a standalone iOS application or exploring WebGL integration within DEMS Mission Control.
The latter approach, however, may introduce performance limitations, as running the simulator in a web browser could be less efficient than a native iPad application. 6.1.2 Overview The chart in Figure 6.1 describes the flow through the simulator, showcasing the different scenes that can be navigated to. The main menu, shown in Figure 6.2, contains a list of the different scenarios. Figure 6.1: Navigation flow from the main menu to the available emergency scenarios. Figure 6.2: The main menu in the simulator, showing a list of available scenarios. 6.1.3 Traffic Accident The traffic accident scenario depicts a country road where multiple cars have been involved in an accident; one car has overturned off the road and lies upside down. Smoke is coming from the car hull and trees surround the entire scene. Images of the scene are shown in Figures 6.3 and 6.4. Figure 6.3: Traffic accident scenario on a country road involving multiple vehicles, including an overturned car emitting smoke, surrounded by trees to enhance simulation realism. Figure 6.4: Close-up of the overturned car in the traffic accident scenario. 6.1.4 Building Fire The building fire scene depicts a small industrial area containing one office facility and two adjacent storage facilities. A fire has started to evolve in one of the storage facilities and smoke is spreading from the rooftop vents. There are multiple vehicles around, and flammable objects could be stored in the storage areas, which may cause the fire to escalate drastically and spread into the surrounding woods. Employees have evacuated the facilities and are watching the fire from afar. As the scenario takes place during the night, visibility is limited and identifying people is difficult using the color camera. Thus, understanding how the infrared camera can be utilized is crucial for assessing the image details.
The building fire scenario is shown in Figure 6.5. Figure 6.5: Building fire scenario in a small industrial area at night, with smoke emanating from rooftop vents; employees have evacuated and are observing from a distance. Infrared imaging is utilized to assess the scene due to limited nighttime visibility. 6.1.5 AED Delivery The AED delivery scenario depicts a residential area where the AED is delivered outside one of the houses. The scene shows how the drone flies towards the drop point and how the camera then tilts to capture the release of the rope to which the AED is hooked. After the AED has been delivered, a bystander comes out to collect it, and the drone then flies away. The scenario can be seen in Figures 6.6 and 6.7. Figure 6.6: AED delivery scenario depicting the drone’s approach to a residential area. Figure 6.7: AED delivery scenario showing the drone at the drop point releasing the AED. 6.1.6 LiveView Controls The simulator integrates all functionalities available in Everdrone’s LiveView system. In addition, although zoom controls are not currently supported in the LiveView system, they have been implemented within the simulator to demonstrate zooming capabilities. In the top-left corner of the interface, a bird’s-eye view map simulates the Google Maps satellite imagery used in LiveView. Within this map, a blue circle represents the drone’s current position, while the emergency point, the designated location to which the drone is navigating, is marked with a yellow warning symbol. The bottom-left corner of the interface displays two key informational labels: the estimated time of arrival (ETA) of the drone and the address of its current location. At the bottom center of the screen, users can access controls to pause the drone’s movement, allowing for updates to the emergency point or adjustments to the drone’s orbital trajectory.
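The orbital movement that these controls operate on can be sketched as a simple circular trajectory around the emergency point. The sketch below is a hypothetical Python model, not Everdrone's or the simulator's actual code; all names (orbit_position, step_orbit) and parameter values are illustrative:

```python
# Hypothetical model of the drone's orbit around the emergency point:
# a fixed-radius circle, with the angle advanced each frame and the
# rotation direction selected by the clockwise flag.
import math

def orbit_position(center_x: float, center_y: float, radius: float,
                   angle_deg: float) -> tuple:
    """Drone position on a circle of the given radius around the
    emergency point, at the given angle."""
    rad = math.radians(angle_deg)
    return (center_x + radius * math.cos(rad),
            center_y + radius * math.sin(rad))

def step_orbit(angle_deg: float, angular_speed: float, dt: float,
               clockwise: bool) -> float:
    """Advance the orbit angle by one time step; the sign of the
    increment encodes the rotation direction."""
    sign = -1.0 if clockwise else 1.0
    return (angle_deg + sign * angular_speed * dt) % 360.0

# Example: one second of counterclockwise orbit at 10 degrees per second,
# at a 50 m radius around the emergency point at the origin.
angle = step_orbit(0.0, angular_speed=10.0, dt=1.0, clockwise=False)
x, y = orbit_position(0.0, 0.0, radius=50.0, angle_deg=angle)
```

Pausing the drone then simply corresponds to skipping the angle update while still rendering the current position.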
The left button enables clockwise rotation, whereas the right button enables counterclockwise rotation. These controls are available only when the emergency does not involve an Automated External Defibrillator (AED) delivery and when the drone has reached a predefined distance from the emergency location to commence its orbit. Modifications to the orbital trajectory can only be performed while the drone is paused. While paused, users can update the emergency point by selecting a new location either on the mini-map in the top-left corner or within the camera view. In the bottom-right corner, operators can toggle between three camera modes: Color, IR, and IR + Color, which displays both the IR and color camera feeds side by side. Above these camera controls, on the middle-right side of the screen, zoom controls are available. The button that returns the user to the menu is located in the top-right corner, above the zoom controls. 6.1.7 Modeling New Scenarios For modeling new scenarios, a scene template has been established in Unity, illustrated in Figure 6.8. The template creates a new scene with a sample terrain, trees, grass, and a drone object. Since the UI, drone, drone movement, cameras, and IR management are encapsulated into a single prefab, the drone can be updated in isolation and the changes will be visible in all scenes. This also declutters the modeling of new scenarios, as efforts can be focused on modeling the scenario rather than connecting it to the drone and IR managers. Figure 6.8: Scene template established for rapid modeling of new scenarios, including sample terrain, vegetation, lighting, a person, a car, and a drone prefab. The drone prefab encompasses components such as cameras, UI, flight patterns, and IR management with a replacement shader for an IR appearance. 6.2 Evaluation This section presents the findings derived from the evaluation phase, as outlined in Section 4.5.
It begins with the interview conducted with the HEMS coordinator, followed by the interview with the company advisor at Everdrone. 6.2.1 Interview with the HEMS Coordinator A semi-structured interview was conducted with a HEMS coordinator, Albin Tjäder [26]. The interview covered the HEMS coordinator role, a demonstration of the simulator, a review of the chosen scenario designs, realism, and strategic planning using the simulator. The insights gathered are presented in this section. 6.2.1.1 The HEMS Coordinator Role The biggest challenge that Tjäder faces in his role is managing multiple processes simultaneously. These processes are usually incoming alarms from the alarm center or requests for assistance from other actors in neighboring counties, such as the police, ambulance, and rescue services. A significant amount of effort goes into determining whether or not to deploy a helicopter for a given situation. The Västra Götaland region covers an area of 23,945 square kilometers, while VGR possesses only one helicopter, and it is an expensive resource to utilize. Since most emergencies could be in need of a helicopter, Tjäder and his colleagues need to continuously dismiss cases where the value of using the helicopter would not produce a significant outcome. The time for making these prioritizations and reaching a decision has to be short, depending on the nature of the emergency. The decisions are made remotely, away from the emergency, often depending on the severity of the medical conditions. Tjäder has previously been introduced to the DEMS Mission Control app, including the LiveView features for monitoring DEMS emergencies that include AED deliveries. He has not used LiveView in a real emergency, but believes that LiveView can support decision making for coordinators in situations where they would otherwise rely only on oral directives from the caller. Tjäder believes that the complement of a video stream from the emergency location can help them establish more accurate assessments.
Although Tjäder is optimistic about the use of LiveView, he pointed out two limitations of the current state of the system. One is the restriction on zooming imposed by VGR due to privacy policies, even though zooming could help HEMS coordinators make more accurate assessments. The other is that it is not possible to share the video stream between the various actors, either in real time or afterwards. Tjäder emphasized the importance of familiarizing oneself with the LiveView system in a test environment before a real-world operation. For a new HEMS coordinator this is particularly crucial, as initial use of the system can divert attention from situational and medical assessment, instead requiring focus on learning the controls and effectively utilizing the features. Important skills for HEMS coordinators to practice are image assessment, identifying people and surroundings, and other significant aspects associated with the emergency. Tjäder also underscored the importance of practicing these skills in a safe test environment, so that one is prepared for a live emergency rather than using real-life decisive situations for practice. 6.2.1.2 Evaluation of the Simulator After the simulator was demonstrated, Tjäder stated that the greatest value of the simulator lies in learning the controls. Additionally, he believed that the scenes chosen for simulation replicate common real scenarios and are useful for HEMS coordinators to practice in advance of live situations. Tjäder believes that if the aim is to practice image assessment, the simulator may have one limitation compared to prerecorded video material. The demonstrated scenarios did include a variety of objects such as smoke, people, and vehicles, but these were mostly static. Tjäder pointed out that assessing and identifying people whose behavior differs from a natural state is the most critical aspect of practicing image assessment.
He questioned whether video recordings featuring real actors might better depict certain human behaviors, but acknowledged that 3D modeling may be faster and less resource-demanding for simulating scenarios. 6.2.1.3 Evaluation of Prior Theory Tjäder was asked to evaluate the prior theory about critical aspects and components of the simulated emergency types. This was necessary to confirm the theoretical foundation of this thesis and to identify essential skills that need practice to strengthen SA, these being key factors in the design for facilitating early SA. 6.2.1.4 Traffic Accident Tjäder confirmed the important factors for traffic accidents based on the gathered prior knowledge. He agreed that it is important to have a bird’s-eye perspective of the area, to be able to zoom in on details, to analyze the video recordings afterwards, and to utilize an IR camera to identify hidden victims and objects, especially in poor visibility. 6.2.1.5 Building Fire Tjäder agreed that it is essential to be able to quickly assess the scale of the fire, the risk of fire spread, resource allocation, potential rally points where units can be deployed, entry points, blockages, and passability. The IR camera is also believed to be useful for identifying victims and explosive or flammable objects. However, most of these factors are not for the HEMS coordinator to decide, as their role is to focus on identifying people, medical conditions, and the emergency type in order to allocate resources accordingly. Tjäder also emphasized the need to factor in passability when deciding rally points, as units have to avoid blocking passages that other units need to pass through. This location can also be separate from the rescue point, where vulnerable victims are transported for medical support. 6.2.1.6 AED Delivery Tjäder acknowledges that there is a challenge in deciding the drop point for the AED and in guiding personnel or bystanders.
However, his obligation is only to inform the emergency dispatcher about the status of the AED delivery, since the dispatcher communicates with the caller and notifies them of when and where the AED is delivered. Furthermore, it is the drone pilot’s obligation to update the drop point of the AED. Tjäder does not believe that there is a challenge in locating the AED once it has been delivered. Today, the HEMS coordinators receive a still image of the location where the AED has been delivered, and Tjäder thinks that this provides a sufficient amount of data. Thus, he does not think that a video feed instead of a still image adds any significant value. However, if the location is unknown to the caller or personnel, a real-time overview of the scene could add value in terms of guidance. When asked how the drone should act after the AED has been delivered, Tjäder argued that it would normally not give him any value if the drone stays at the drop point after the AED has been dropped. It could be more valuable if the incident takes place outdoors, but it is still not necessary, since the ambulance always arrives shortly thereafter according to protocol. Tjäder clarified that the main purpose of the drone is to deliver an AED so that defibrillation can begin as soon as possible. He believes that it is best for the drone to return after the mission is complete, to be prepared for new emergencies. 6.2.1.7 Realism According to Tjäder, having realistic visuals in the simulator is not necessary for HEMS coordinators to learn how to control the drone. However, realism can play an important role in practicing image assessment, including identifying people and odd behavior. Today, visual feedback is the only sensory input that provides telepresence in LiveView. Sounds from the drone and environment are not present, but could theoretically increase immersion and the feeling of being involved in the action, albeit remotely.
Tjäder had a more feature-oriented perspective on this matter and did not see any value in having sound from the drone, but speculated that further feedback, such as distance assessment, could be useful for assessing passability for emergency dispatch. 6.2.1.8 Strategic Planning Currently, HEMS coordinators communicate primarily with medical duty officers using a communication system called RAKEL, but also with others within the police or the Swedish Sea Rescue Society. Communication is necessary to foster a shared operational overview, which is essential for all involved parties. Tjäder speculated about future functionalities within LiveView for sharing data with duty officers to better prepare for major actions, such as initiating a redistribution of patients between hospitals. Everdrone believes that the simulator may be used by various actors to learn how to collaboratively strategize and coordinate larger emergencies and crises. Tjäder shared this belief and added that the simulator can be especially useful for command patrols. He noted that they currently perform tabletop exercises in which they simulate emergencies on either a table or a whiteboard. The progression of the event is managed by the exercise leader, who introduces events that take the emergency in various directions. Because of this familiarity with simulated practice of strategic planning, there is a natural transition towa