Project Overview
Immersive Emotions is an interactive public installation that helps users understand their emotions and how their body language might reveal how they are feeling. Biosensors embedded in a glove capture physiological signals, and the data from these sensors is mapped to a visual display.
This project was accepted to the ACM Mobile Human-Computer Interaction Conference 2022.
You can access the paper here.
Key Features
- Wearable Device
- Biosensors
- Camera
- Visual Display
Context
The idea for this project was sparked by the desire to create an ambient technology that helps individuals express their emotions. As a group of five graduate students, all with very different backgrounds, we bonded over technology, design, and finding ways to move our bodies during stressful periods of our design engineering master's program.
Narrative:
A user is stressed and wants to blow off steam. They go to their student center and locate the Immersive Emotions installation. They put on the wearable device and start moving. As they move and let their body flow however it wants to, they see a visual on a screen in front of them that mirrors them in an abstract way, offering insight into their internal and external state: physical, mental, and emotional.
When the user is done, they remove the glove and head back to their studio, feeling lighthearted and ready to tinker in the makerspace.
Research
There are approximately twenty-seven human emotions that are all interconnected; however, there are six that are most easily identifiable. These six basic emotions are happiness, sadness, fear, anger, disgust, and surprise. Using these six emotions, we investigated how they related to neuroaesthetics and body language. Additionally, we researched how the human body responds physically to these emotions through biometric data.

Data Capture
To capture biometric data, we used a heart rate sensor, a galvanic skin response (GSR) sensor, an accelerometer, a gyroscope, and a temperature sensor. These sensors were all embedded into the wearable device worn on the hand. Additionally, we used a camera to capture body positioning.
The sensor data is then sent over Bluetooth to software that analyzes the readings and generates the visual for the given user.
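As a rough illustration of this pipeline, the sketch below shows how the host software might read comma-separated sensor packets from the glove over a Bluetooth serial link using pyserial. The port name, baud rate, and packet layout are assumptions for illustration, not the project's actual protocol.

```python
# Minimal sketch (not the project's actual code): reading glove sensor
# packets over a Bluetooth serial link with pyserial. The port, baud rate,
# and comma-separated packet format are assumptions for illustration.
import serial

PORT = "/dev/rfcomm0"   # assumed Bluetooth serial port
BAUD = 115200           # assumed baud rate

FIELDS = ["heart_rate", "gsr", "accel_x", "accel_y", "accel_z",
          "gyro_x", "gyro_y", "gyro_z", "temperature"]

def read_packets(port=PORT, baud=BAUD):
    """Yield one dict of sensor readings per line received from the glove."""
    with serial.Serial(port, baud, timeout=1) as link:
        while True:
            line = link.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue
            values = line.split(",")
            if len(values) != len(FIELDS):
                continue  # skip malformed packets
            yield {name: float(v) for name, v in zip(FIELDS, values)}

if __name__ == "__main__":
    for packet in read_packets():
        print(packet)
```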
Wearable Device
The wearable glove was made from neoprene to hold the sensors, while ensuring that the sensors relying on skin conductivity were secured against the skin of the fingertips.

Data Visualization
The visual displayed in response to the user's movement is abstract; it is meant to provoke reflection and curiosity in the user. Particle size, path, color, opacity, and the overall area where the visualization occurs are indirect reflections of the biometric values, gestures, and posture.
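To give a sense of how such a mapping might look in code, the sketch below rescales a few readings into particle parameters. The specific value ranges and pairings (for example, heart rate to size) are invented for this example and are not the installation's actual mapping.

```python
# Illustrative sketch only: one way to map biometric readings to abstract
# particle parameters. The ranges and mapping choices below are assumptions
# made for this example, not the installation's actual mapping.

def scale(value, in_min, in_max, out_min, out_max):
    """Linearly rescale value from [in_min, in_max] to [out_min, out_max]."""
    value = max(in_min, min(in_max, value))  # clamp to the input range
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)

def particle_params(packet):
    """Derive abstract particle parameters from one sensor packet."""
    motion = abs(packet["accel_x"]) + abs(packet["accel_y"]) + abs(packet["accel_z"])
    return {
        # faster heart rate -> larger particles (assumed range 50-150 bpm)
        "size": scale(packet["heart_rate"], 50, 150, 2, 20),
        # higher skin conductance -> lower opacity (assumed 0-1023 raw GSR)
        "opacity": scale(packet["gsr"], 0, 1023, 1.0, 0.3),
        # warmer skin shifts the hue (assumed 28-36 deg C)
        "hue": scale(packet["temperature"], 28, 36, 200, 0),
        # more movement -> a wider drawing area
        "area": scale(motion, 0, 30, 0.2, 1.0),
    }
```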
