Emotion-Aware Adaptive User Interfaces for Enhanced User Experience Using Multi-Modal Deep Learning
DOI: https://doi.org/10.15849/ijasca.v18i1.31

Keywords: User Experience (UX), Emotion Recognition, Multimodal Deep Learning Fusion, Adaptive User Interface, User Study

Abstract
Emotion-aware computing is central to the advancement of human-centered digital interaction, yet its direct role in improving the User Experience (UX) has not been studied in detail. This paper introduces an Emotion-Aware Adaptive User Interface (EAAUI) system that uses multi-modal deep learning to enhance usability, engagement, and cognitive efficiency by adapting to user emotions in real time. The proposed system fuses facial expressions, speech prosody, and physiological signals through a hybrid deep learning architecture consisting of Convolutional Neural Networks (CNNs), Bidirectional Long Short-Term Memory networks (BiLSTMs), and Transformer-based attention mechanisms, providing robust emotion detection and fusion. The dynamically sensed emotions drive adaptive interface adjustments such as layout simplification, color modulation, and personalized feedback. The framework is evaluated on benchmark datasets (AffectNet, RAVDESS, and DEAP) and in a controlled user study with 30 participants. Experimental results demonstrate strong emotion recognition performance, with an accuracy of 89.6% and an F1-score of 0.88. From a UX perspective, the adaptive interface significantly reduced task completion time (by 21%) and error rates (by 18%), increased user engagement (by 26%), and achieved a System Usability Scale (SUS) score of 82.5. These results validate that emotion-aware adaptive interfaces have a measurable positive effect on user experience and offer practical implications for UX-driven applications in education, healthcare, and intelligent interactive systems.
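
To make the hybrid architecture described in the abstract more concrete, the following PyTorch sketch illustrates one way a CNN branch (facial frames), two BiLSTM branches (speech prosody and physiological sequences), and a Transformer encoder for cross-modal attention fusion could be wired together. It is a minimal illustration only: all module names, input dimensions, and hyperparameters are assumptions for demonstration and are not taken from the paper's actual implementation.

# Minimal sketch of a CNN + BiLSTM + Transformer-attention fusion network.
# All dimensions and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

class EmotionFusionNet(nn.Module):
    def __init__(self, num_emotions: int = 7, d_model: int = 128):
        super().__init__()
        # CNN branch for facial-expression crops (assumed 48x48 grayscale).
        self.face_cnn = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, d_model),
        )
        # BiLSTM branch for speech-prosody feature frames (assumed 40-dim, e.g. MFCCs).
        self.speech_lstm = nn.LSTM(input_size=40, hidden_size=d_model // 2,
                                   batch_first=True, bidirectional=True)
        # BiLSTM branch for physiological signals (assumed 8 channels over time).
        self.physio_lstm = nn.LSTM(input_size=8, hidden_size=d_model // 2,
                                   batch_first=True, bidirectional=True)
        # Transformer encoder attends over the three modality embeddings to fuse them.
        encoder_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4,
                                                   batch_first=True)
        self.fusion = nn.TransformerEncoder(encoder_layer, num_layers=2)
        self.classifier = nn.Linear(d_model, num_emotions)

    def forward(self, face, speech, physio):
        face_emb = self.face_cnn(face)                      # (B, d_model)
        speech_emb = self.speech_lstm(speech)[0][:, -1, :]  # last BiLSTM step, (B, d_model)
        physio_emb = self.physio_lstm(physio)[0][:, -1, :]  # (B, d_model)
        # Treat the modality embeddings as a length-3 token sequence and fuse with attention.
        tokens = torch.stack([face_emb, speech_emb, physio_emb], dim=1)  # (B, 3, d_model)
        fused = self.fusion(tokens).mean(dim=1)             # (B, d_model)
        return self.classifier(fused)                       # emotion logits

if __name__ == "__main__":
    model = EmotionFusionNet()
    logits = model(torch.randn(2, 1, 48, 48),   # facial frames
                   torch.randn(2, 100, 40),     # speech-prosody sequence
                   torch.randn(2, 200, 8))      # physiological sequence
    print(logits.shape)                         # torch.Size([2, 7])

In a deployment such as the one the paper describes, the predicted emotion label would then feed a separate adaptation layer that triggers interface changes (layout simplification, color modulation, personalized feedback); that component is not shown here.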