
Abdallah El Ali

30 records found

Virtual coaches in virtual reality (VR) offer scalable mental health treatment without an on-site therapist, yet their impact on psychophysiological responses remains unclear. We examine how VR content and coach design influence physiological measures, such as heart rate (HR) and ...

Dark Haptics

Exploring Manipulative Haptic Design in Mobile User Interfaces

Mobile user interfaces abundantly feature so-called ‘dark patterns’. These deceptive design practices manipulate users’ decision making to profit online service providers. While past research on dark patterns has mainly focused on visual design, other sensory modalities such as audio a ...
Encounters with virtual agents currently lack the haptic viscerality of human contact. While digital biosignal communication can mediate such virtual social interactions, how artificial haptic biosignals influence users’ personal space during Virtual Reality (VR) experiences is u ...

Transparent AI Disclosure Obligations

Who, What, When, Where, Why, How

Advances in Generative Artificial Intelligence (AI) are resulting in AI-generated media output that is (nearly) indistinguishable from human-created content. This can drastically impact users and the media sector, especially given global risks of misinformation. While the current ...

ShareYourReality

Investigating Haptic Feedback and Agency in Virtual Avatar Co-embodiment

Virtual co-embodiment enables two users to share a single avatar in Virtual Reality (VR). During such experiences, the illusion of shared motion control can break during joint-action activities, highlighting the need for position-aware feedback mechanisms. Drawing on the perceptu ...
Within our Distributed and Interactive Systems research group, we focus on affective haptics, where we design and develop systems that can enhance human emotional states through the sense of touch. Such artificial haptic sensations can potentially augment and enhance our mind, bo ...

BreatheWithMe

Exploring Visual and Vibrotactile Displays for Social Breath Awareness during Colocated, Collaborative Tasks

Sharing breathing signals has the capacity to provide insights into hidden experiences and enhance interpersonal communication. However, it remains unclear how the modality of breath signals (visual, haptic) is socially interpreted during collaborative tasks. In this mixed-method ...

Reflecting on Hybrid Events

Learning from a Year of Hybrid Experiences

The COVID-19 pandemic led to a sudden shift to virtual work and events, with the last two years enabling an appropriated and rather simulated togetherness - the hybrid mode. As we return to in-person events, it is important to reflect on not only what we learned about technologie ...

Affective Driver-Pedestrian Interaction

Exploring Driver Affective Responses toward Pedestrian Crossing Actions using Camera and Physiological Sensors

Eliciting and capturing drivers' affective responses in a realistic outdoor setting with pedestrians poses a challenge when designing in-vehicle, empathic interfaces. To address this, we designed a controlled, outdoor car driving circuit where drivers (N=27) drove and encountered ...
Instead of predicting just one emotion for one activity (e.g., video watching), fine-grained emotion recognition enables more temporally precise recognition. Previous works on fine-grained emotion recognition require segment-by-segment, fine-grained emotion labels to train the re ...
Measuring interoception ('perceiving internal bodily states') has diagnostic and wellbeing implications. Since heartbeats are distinct and frequent, various methods aim at measuring cardiac interoceptive accuracy (CIAcc). However, the role of exteroceptive modalities for represen ...

FeelTheNews

Augmenting Affective Perceptions of News Videos with Thermal and Vibrotactile Stimulation

Emotion plays a key role in the emerging wave of immersive, multi-sensory audience news engagement experiences. Since emotions can be triggered by somatosensory feedback, in this work we explore how augmenting news video watching with haptics can influence affective perceptions o ...

From Video to Hybrid Simulator

Exploring Affective Responses toward Non-Verbal Pedestrian Crossing Actions Using Camera and Physiological Sensors

Capturing drivers’ affective responses given driving context and driver-pedestrian interactions remains a challenge for designing in-vehicle, empathic interfaces. To address this, we conducted two lab-based studies using camera and physiological sensors. Our first study collected ...
Voice User Interfaces (VUIs) such as Alexa and Google Home that use human-like design cues are an increasingly popular means for accessing news. Self-disclosure in particular may be used to build relationships of trust with users who may reveal intimate details about themselves. ...
Fine-grained emotion recognition can model the temporal dynamics of emotions, which is more precise than predicting one emotion retrospectively for an activity (e.g., video clip watching). Previous works require large amounts of continuously annotated data to train an accurate re ...

Towards socialVR

Evaluating a novel technology for watching videos together

Social VR enables people to interact over distance with others in real-time. It allows remote people, typically represented as avatars, to communicate and perform activities together in a shared virtual environment, extending the capabilities of traditional social platforms like ...
Visualizing biosignals can be important for social Virtual Reality (VR), where avatar non-verbal cues are missing. While several biosignal representations exist, it remains unclear how to design effective visualizations and how users perceive them within social VR entertainment. ...
Watching HMD-based 360° video has become increasingly popular as a medium for immersive viewing of photo-realistic content. To evaluate subjective video quality, researchers typically prompt users to provide an overall Quality of Experience (QoE) score after viewing a stimulus. Ho ...
Automatically inferring drivers' emotions during driver-pedestrian interactions to improve road safety remains a challenge for designing in-vehicle, empathic interfaces. To that end, we carried out a lab-based study using a combination of camera and physiological sensors. We coll ...
While affective non-verbal communication between pedestrians and drivers has been shown to improve on-road safety and driving experiences, it remains a challenge to design driver assistance systems that can automatically capture these affective cues. In this early work, we identi ...