Project Overview



Inside Out



Inside Out is an embodied interaction device that transforms hand gestures into musical expressions. We wanted to explore a new way of building connections remotely, allowing people to share and understand each other's moods beyond words.

Duration
12 Weeks, Spring 2024

Course
Embodied Interaction Design Studio
# Tangible and Embodied Interaction
# Research Through Design
# Arduino Prototyping
# Human-computer Interaction
# Machine Learning




In today's hyper-connected yet emotionally distant world, we explored how tangible interaction could help people communicate emotions with loved ones across distances. “Inside Out” is a gesture-based musical device designed to express and receive moods without words. This project combines emotional design, collaborative activity, and music interaction to propose a new way of staying close—emotionally—when physically apart.




User Research

Research Question

How can we map the relationship between Mood x Gesture x Melody?



Research Methods

Method 01: Questionnaire
(quantitative data)

The questionnaire helps us collect demographic information about users and understand the relationship between their moods and music.

It also enables us to identify:
– which moods users commonly experience in daily life
– which musical parameters most influence users' emotional states

Method 02: Participant Observation
+ Exploratory research approach (qualitative data)
+ Music database (quantitative data)

We aimed to understand:
– how mood can be translated into visible and audible forms
– through which actions users naturally communicate emotions
– how different types of gestures correlate with emotional states (gesture–mood coupling)

By observing participants’ spontaneous interactions, we gained insights into embodied expressions of emotion and how users externalize internal moods through music-related actions.


User Responses

1 How mood can be translated into something visible and audible




2 Through which intuitive actions users naturally communicate moods





Physical Interface Exploration



Physical Interface Concept

Our device consists of two main components: a handheld controller and a base unit.
The controller is held by the user and is responsible for detecting physical gestures and movements.
The base serves as the central unit for receiving and transmitting sound, while also housing essential internal hardware components.



Two Proposals

1/ Classic telephone

We drew inspiration from the classic telephone. This device, like our concept, facilitates one-on-one emotional communication, supports long-distance connections, and serves as a tool for emotional exchange.
2/ Gift box

In this concept, we draw on the metaphor of a letterbox. The other person's expression of emotion is likened to a letter, and the processes of receiving and sending letters symbolize the emotional exchange. The act of opening the box carries a sense of anticipation and curiosity, akin to the excitement of discovering what's inside a letterbox.
Letters and gifts serve as emotional connections across physical distances, which aligns perfectly with our primary theme.

Conclusion (1/ Classic telephone):

❌ Technical problems are difficult to solve

Conclusion (2/ Gift box):

✅ More intuitive thanks to the product's form
✅ Easier to achieve at the technical level




User Test For Physical Interaction


Insights

Based on the results of our physical interaction user test, we decided to move forward with the gift box proposal and abandon the classic telephone metaphor.
We also redesigned the shape of the controller within the gift box concept, as users’ intuitive interactions with the original form did not align with its intended functionality.




Physical Interface



The device consists of two main components: a handheld controller and a base unit.

The controller is held by the user and detects hand gestures and movements using a pressure sensor and the built-in IMU of the Arduino Nano. It is connected to the base via a wired connection.

The base unit serves as the central system for processing and feedback. It integrates components such as the RFID tag, photoresistor, RGB ring LED, and speaker, enabling a multi-sensory interactive experience.
The RFID tag provides signals for detecting the placement of the controller.
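
As a rough sketch of how the controller could stream its sensor readings to the base (not the project's actual firmware): this assumes an Arduino Nano 33 BLE, whose onboard LSM9DS1 IMU is read through the Arduino_LSM9DS1 library, and a pressure sensor wired to analog pin A0; the pin choice and serial format are illustrative.

```cpp
#include <Arduino_LSM9DS1.h>  // built-in IMU on the Nano 33 BLE

const int PRESSURE_PIN = A0;  // hypothetical pin for the pressure sensor

void setup() {
  Serial.begin(115200);
  while (!Serial);
  if (!IMU.begin()) {
    Serial.println("Failed to initialize IMU");
    while (true);
  }
}

void loop() {
  float ax, ay, az;
  if (IMU.accelerationAvailable()) {
    IMU.readAcceleration(ax, ay, az);
    int grip = analogRead(PRESSURE_PIN);  // 0-1023: how firmly the controller is squeezed
    // Stream one sample per line; the base parses these for gesture classification
    Serial.print(ax); Serial.print(',');
    Serial.print(ay); Serial.print(',');
    Serial.print(az); Serial.print(',');
    Serial.println(grip);
  }
}
```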

On the software side, we trained a machine learning model to map user actions to commonly experienced emotional states and generate corresponding music feedback in real time.
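
The write-up does not specify the model itself, so purely as an illustration of the gesture-to-mood mapping idea, here is a minimal nearest-centroid classifier over simple hand-movement features; the feature set, labels, and centroid values are invented placeholders that a real training step would replace.

```cpp
#include <math.h>  // for INFINITY

enum Mood { POSITIVE, CALMNESS, NERVOUSNESS, ANGER, NUM_MOODS };

// Features per sample window: mean acceleration magnitude,
// movement frequency (Hz), and grip pressure (normalized 0-1).
struct Features { float energy, freq, grip; };

// Placeholder centroids; a real training step would produce these values.
const Features CENTROIDS[NUM_MOODS] = {
  {1.4f, 2.0f, 0.3f},  // POSITIVE:    lively, rhythmic shaking
  {1.0f, 0.5f, 0.2f},  // CALMNESS:    slow, gentle motion
  {1.2f, 3.5f, 0.5f},  // NERVOUSNESS: fast, jittery motion
  {1.8f, 2.5f, 0.9f},  // ANGER:       forceful motion, hard grip
};

Mood classify(const Features& f) {
  Mood best = POSITIVE;
  float bestDist = INFINITY;
  for (int m = 0; m < NUM_MOODS; m++) {
    float de = f.energy - CENTROIDS[m].energy;
    float df = f.freq   - CENTROIDS[m].freq;
    float dg = f.grip   - CENTROIDS[m].grip;
    float dist = de * de + df * df + dg * dg;  // squared Euclidean distance
    if (dist < bestDist) { bestDist = dist; best = (Mood)m; }
  }
  return best;
}
```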



Prototype & Technical Realization 





The cross-sectional view of the controller and base unit
Gripping part, made of Polymorph
Process of making a silicone model







Music Part


We made several iterations of our music system.
In one user test, participants reported that the melodies were too short and lacked clear emotional cues, making them difficult to distinguish. Based on this feedback, we made improvements.

We extended the length and increased the complexity of the melodies, while still keeping them simple and easy to remember.
We also aligned the rhythm of the melody more closely with the user’s hand movement frequency to enhance the sense of embodiment.

In addition, for each emotional track, we experimented with different instrument timbres to reflect the textures that users had visualized during earlier sessions.
Through iterative development and user feedback, we refined our approach to creating emotionally evocative melodies.
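
One way the rhythm coupling could be implemented (a hypothetical sketch, not the project's final code): estimate the user's shake frequency from accelerometer zero crossings, then nudge the melody tempo toward it while staying near the mood's base tempo.

```cpp
// az: vertical acceleration samples with gravity removed, so the
// signal is centered around zero and sign changes mark half cycles.
float shakeFrequencyHz(const float* az, int n, float sampleRateHz) {
  int crossings = 0;
  for (int i = 1; i < n; i++) {
    if ((az[i - 1] < 0) != (az[i] < 0)) crossings++;  // sign change = half cycle
  }
  float seconds = n / sampleRateHz;
  return (crossings / 2.0f) / seconds;  // full cycles per second
}

int tempoFromMotion(float freqHz, int baseTempoBpm) {
  int bpm = (int)(freqHz * 60.0f);  // one melody beat per hand swing
  // Keep the tempo within +/- 20 BPM of the mood's base tempo
  if (bpm < baseTempoBpm - 20) bpm = baseTempoBpm - 20;
  if (bpm > baseTempoBpm + 20) bpm = baseTempoBpm + 20;
  return bpm;
}
```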


Final Interaction Mapping


Gesture x Melody mapping (the corresponding gestures are shown in the accompanying images):

Mood          Instrument   Tempo (BPM)
Positive      Marimba      120
Calmness      Woodwind     80
Nervousness   Synth        120
Anger         Synth        120
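
Encoded in firmware, this mapping could be a simple lookup table; the struct and names below are illustrative, not the project's actual code.

```cpp
// Each mood selects an instrument timbre and a tempo for the generated melody.
struct MusicParams { const char* instrument; int tempoBpm; };

enum Mood { POSITIVE, CALMNESS, NERVOUSNESS, ANGER, NUM_MOODS };

const MusicParams MOOD_MUSIC[NUM_MOODS] = {
  {"Marimba",  120},  // Positive
  {"Woodwind",  80},  // Calmness
  {"Synth",    120},  // Nervousness
  {"Synth",    120},  // Anger
};
```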




Interaction Flow

RECEIVING MODE

/1 Notice the signals
The box lights up to let you know a new message has arrived.
/2 Open the box
Open it to play the sounds produced by the other user.
/3 Listen to the other's sounds
The sounds play automatically until they finish.
/4 Close the box
Close the box to stop the playback and end the process.




SENDING MODE

/1 Open the box
Inside, you will find a controller that helps you express your mood.
/2 Take the controller
Pick up the controller and express your mood by moving it.
/3 Express your moods
The box plays sounds in real time according to your mood-related movements.
/4 Put back the controller
Put the controller back and close the box to send your mood to the other user far away.
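
Bringing the two modes together, the base unit's control loop might be structured as the following state machine. This is a hypothetical sketch: the helper functions are stubs, and it assumes, as the hardware description suggests, that the photoresistor detects the lid opening and the RFID read detects the controller's placement.

```cpp
enum State { IDLE, MESSAGE_WAITING, PLAYING, RECORDING };
State state = IDLE;

// --- Placeholder sensor/actuator hooks: real versions would query the
// photoresistor, RFID reader, and network link, and drive the LED ring/speaker.
bool newMessageArrived()  { return false; }
bool boxOpened()          { return false; }  // photoresistor sees light when the lid opens
bool boxClosed()          { return true;  }
bool controllerLifted()   { return false; }  // RFID tag no longer detected in the base
bool controllerReplaced() { return true;  }
bool melodyFinished()     { return true;  }
void lightUpRing()        {}
void playReceivedMelody() {}
void playLiveFeedback()   {}
void sendMoodToPartner()  {}

void updateState() {
  switch (state) {
    case IDLE:
      if (newMessageArrived()) {
        lightUpRing();                 // RECEIVING /1: light signals a new message
        state = MESSAGE_WAITING;
      } else if (boxOpened() && controllerLifted()) {
        state = RECORDING;             // SENDING /2: user picked up the controller
      }
      break;
    case MESSAGE_WAITING:
      if (boxOpened()) {
        playReceivedMelody();          // RECEIVING /2-/3: play the partner's sounds
        state = PLAYING;
      }
      break;
    case PLAYING:
      if (melodyFinished() || boxClosed())
        state = IDLE;                  // RECEIVING /4: closing the box ends playback
      break;
    case RECORDING:
      playLiveFeedback();              // SENDING /3: sounds follow the user's movements
      if (controllerReplaced() && boxClosed()) {
        sendMoodToPartner();           // SENDING /4: closing the box sends the mood
        state = IDLE;
      }
      break;
  }
}
```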



Gallery 





Video 




Recognition

Interdependence x Milano Design Week, Milan, Italy


Team Members

Andrea Borsato
Lisa Buttaroni
Kaiyuan Liu
Zixin Mou
Chunhan Yi