Inside Out
12 Weeks, Spring 2024
Course: Embodied Interaction Design Studio
# Research Through Design
# Arduino Prototyping
# Human-Computer Interaction
# Machine Learning
User Research
+ Questionnaire (quantitative data)
The questionnaire helps us collect demographic information about the users and understand the relationship between their moods and music.
It also enables us to identify:
– which moods users commonly experience in daily life
– which musical parameters most influence users' emotional states
+ Exploratory Research approach (qualitative data)
+ Music Database (quantitative data)
We aimed to understand:
– how mood can be translated into visible and audible forms
– through which actions users naturally communicate emotions
– how different types of gestures correlate with emotional states (gesture–mood coupling)
By observing participants’ spontaneous interactions, we gained insights into embodied expressions of emotion and how users externalize internal moods through music-related actions.
User Responses
1 How mood can be translated into something visible and audible
2 Through which intuitive actions users naturally communicate moods
Physical Interface Exploration
The controller is held by the user and is responsible for detecting physical gestures and movements.
The base serves as the central unit for receiving and transmitting sound, while also housing essential internal hardware components.
Two Proposals
We drew inspiration from the classic telephone. This device, like our concept, facilitates one-on-one emotional communication, supports long-distance connections, and serves as a tool for emotional exchange.
In this concept, we drew on the metaphor of a letterbox. The other party’s expression of emotion is likened to a letter, and receiving and sending letters symbolize the emotional exchange. Opening the box carries a sense of anticipation and curiosity, akin to the excitement of discovering what’s inside a letterbox.
Letters and gifts serve as emotional connections across physical distances, which aligns perfectly with our primary theme.
❌ Technically difficult to realize
✅ More intuitive through its physical form
✅ Easier to realize technically
User Test For Physical Interaction
We also redesigned the shape of the controller within the gift box concept, as users’ intuitive interactions with the original form did not align with its intended functionality.
Physical Interface
The device consists of two main components: a handheld controller and a base unit.
The controller is held by the user and detects hand gestures and movement using a pressure sensor and the built-in IMU of the Arduino Nano. It connects to the base via a wired connection.
The base unit serves as the central system for processing and feedback. It integrates components such as the RFID tag, photoresistor, RGB ring LED, and speaker, enabling a multi-sensory interactive experience.
The RFID tag provides signals for detecting the placement of the controller.
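To illustrate the sensing pipeline, here is a minimal Python sketch of how a short window of IMU and pressure readings could be summarized into gesture features. The function name, value ranges, and feature choices are our own illustrative assumptions, not the actual firmware:

```python
import math

def gesture_features(accel_samples, pressure_samples):
    """Summarize a short window of IMU and pressure readings.

    accel_samples: list of (ax, ay, az) tuples in g
    pressure_samples: list of raw pressure-sensor readings (e.g. 0-1023)
    """
    # Acceleration magnitude per sample, minus gravity (~1 g at rest)
    mags = [abs(math.sqrt(ax * ax + ay * ay + az * az) - 1.0)
            for ax, ay, az in accel_samples]
    mean_motion = sum(mags) / len(mags)   # overall movement energy
    peak_motion = max(mags)               # sharpest jolt in the window
    mean_grip = sum(pressure_samples) / len(pressure_samples)
    return {"mean_motion": mean_motion,
            "peak_motion": peak_motion,
            "mean_grip": mean_grip}
```

A window of such features, rather than single raw readings, is what a gesture classifier would typically consume.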
On the software side, we trained a machine learning model to map user actions to commonly experienced emotional states and generate corresponding music feedback in real time.
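One lightweight way to sketch such a gesture-to-mood mapping is a nearest-centroid classifier over movement and grip features. The mood labels and centroid values below are hypothetical placeholders for illustration, not the trained model we actually used:

```python
# Hypothetical centroids: (movement energy, grip pressure) per mood.
MOOD_CENTROIDS = {
    "calm":  (0.05, 150.0),
    "happy": (0.60, 300.0),
    "angry": (0.90, 700.0),
    "sad":   (0.10, 500.0),
}

def classify_mood(motion, grip, centroids=MOOD_CENTROIDS):
    """Return the mood whose centroid is nearest to the observed features."""
    def dist(c):
        m, g = c
        # Scale grip into roughly the same range as motion before comparing.
        return (motion - m) ** 2 + ((grip - g) / 1000.0) ** 2
    return min(centroids, key=lambda mood: dist(centroids[mood]))
```

The predicted mood label would then select which musical track and parameters the base plays back.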
Prototype & Technical Realization
Music Part
In one user test, participants reported that the melodies were too short and lacked clear emotional cues, making them difficult to distinguish. Based on this feedback, we made improvements.
We extended the length and increased the complexity of the melodies, while still keeping them simple and easy to remember.
We also aligned the rhythm of the melody more closely with the user’s hand movement frequency to enhance the sense of embodiment.
In addition, for each emotional track, we experimented with different instrument timbres to reflect the textures that users had visualized during earlier sessions.
Through iterative development and user feedback, we refined our approach to creating emotionally evocative melodies.
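The coupling between melody rhythm and the user’s hand-movement frequency can be sketched as a simple mapping from shakes per second to tempo. The one-shake-per-beat rule and the clamping range are illustrative assumptions, not the exact mapping we implemented:

```python
def movement_to_bpm(shakes_per_second, lo=60, hi=140):
    """Map hand-movement frequency (Hz) to a musical tempo in BPM.

    One shake per beat: 2 Hz -> 120 BPM, clamped to a playable range.
    """
    bpm = shakes_per_second * 60.0
    return max(lo, min(hi, bpm))
```

Clamping keeps very slow or frantic gestures within a tempo range that still reads as music.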
Final Interaction Mapping
– Instrument: Marimba · Tempo: 120 BPM
– Instrument: Woodwind · Tempo: 80 BPM
– Instrument: Synth · Tempo: 120 BPM
– Instrument: Synth · Tempo: 120 BPM
Interaction Flow
RECEIVING MODE
The box lights up to inform you of a new message.
Open it to play the sounds produced by the other user.
The sounds play automatically until they finish.
Close the box to stop playback and end the process.
SENDING MODE
Inside, you will find a controller that helps you express your mood.
Pick up the controller and express your mood by moving it.
The box plays sounds in real time according to your mood-related movements.
Put the controller back and close the box to send your mood to another user living far away.
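The two modes above can be summarized as a small state machine. This Python sketch is a simplified model of the interaction flow; the class and method names are our own invention, not the actual firmware logic:

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    RECEIVING = auto()   # playing back an incoming mood
    SENDING = auto()     # live sonification of the user's gestures

class MoodBox:
    def __init__(self):
        self.state = State.IDLE
        self.has_message = False

    def message_arrived(self):
        self.has_message = True   # in hardware: the LED ring lights up

    def open_box(self):
        # Opening with a pending message starts playback; otherwise stay idle.
        if self.has_message:
            self.state = State.RECEIVING
        return self.state

    def playback_finished(self):
        self.has_message = False
        self.state = State.IDLE

    def pick_up_controller(self):
        self.state = State.SENDING   # live sounds follow the movements

    def close_box(self):
        # Closing while sending transmits the recorded mood to the other user.
        sent = self.state == State.SENDING
        self.state = State.IDLE
        return sent
```

Modeling the flow this way makes the mode transitions explicit before committing them to hardware.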
Gallery
Video
Recognition
Team Members
Lisa Buttaroni
Kaiyuan Liu
Zixin Mou
Chunhan Yi