Effects of Cognitive Load in Human-AI Requirements Engineering
Type
Master's Thesis
Abstract
As Artificial Intelligence becomes more integrated into software engineering, its role in decision-support systems within Requirements Engineering has grown. However, the cognitive demands placed on users interacting with these AI tools remain underexplored. This thesis investigates how the explanation formats offered by Explainable AI (XAI) affect mental effort, perceived task difficulty, confidence, and correctness during requirements-engineering-inspired prioritization tasks. Through a controlled experiment with 61 participants, three XAI formats (bar charts, textual explanations, and confidence scores) were evaluated across two task pairs of differing complexity. The study examined the influence of task complexity and explanation format on cognitive load, the impact of explanation type on decision-making quality, and whether participant preferences for certain formats aligned with improved performance and lower cognitive strain. Statistical analyses, including Spearman correlation and independent t-tests, revealed that task complexity consistently influenced cognitive load, while explanation format had no clear effect. Although preferred formats did not universally enhance task performance, participants who favored confidence scores showed marginally higher correctness and confidence levels. These findings suggest that cognitive effort in AI-assisted requirements engineering tasks is shaped more by task characteristics than by explanation format alone, and that tailoring explanations to individual user preferences may offer subtle benefits.
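The two statistical tests named in the abstract can be sketched as follows. This is a minimal illustration only: all data below are made up for demonstration, and the actual variables, scales, and group sizes used in the thesis are not reproduced here.

```python
# Illustrative sketch of the analyses named in the abstract:
# (1) Spearman rank correlation between task complexity and reported
#     mental effort, and (2) an independent t-test comparing correctness
#     between two explanation-format groups.
# All numbers are hypothetical, not the thesis data.
from scipy.stats import spearmanr, ttest_ind

# Hypothetical ratings: task complexity (1 = simple, 2 = complex) and
# self-reported mental effort for ten participants.
complexity = [1, 1, 1, 1, 1, 2, 2, 2, 2, 2]
effort = [3, 2, 4, 3, 2, 6, 7, 5, 6, 8]
rho, p_rho = spearmanr(complexity, effort)

# Hypothetical correctness proportions for two format groups.
bar_chart_group = [0.6, 0.7, 0.5, 0.8, 0.6]
confidence_score_group = [0.7, 0.8, 0.9, 0.7, 0.8]
t_stat, p_t = ttest_ind(bar_chart_group, confidence_score_group)

print(f"Spearman rho = {rho:.2f} (p = {p_rho:.3f})")
print(f"t = {t_stat:.2f} (p = {p_t:.3f})")
```

A positive rho here would indicate that reported effort rises with task complexity, mirroring the abstract's finding that complexity, rather than explanation format, drives cognitive load.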
Subject/keywords
Requirements Engineering (RE), Cognitive Load (CL), Artificial Intelligence (AI), Explainable Artificial Intelligence (XAI), Weighted Shortest Job First (WSJF), Research Question (RQ), User Experience (UX)
