Sky Tien-Yun Huang
(2014-2015 in collaboration with the MIT Media Lab) Role: Research, UI, UX, Prototyping, Testing

The Moment - Phase 2 (as of May 2014)

The goal of the second design phase was to reconsider the whole system, including the interface for self-reporting, the visualization of the data, and how to implement the social support feature. To do this, I conducted a new round of research and testing on color and emotion. I also consulted with clinical psychologists to learn more about psychotherapy, and participated in the “Tools for Well-Being” class at the MIT Media Lab to learn more about the factors involved in well-being and to get feedback from both the scientific and psychological perspectives. I then built a paper prototype for this design phase.

Rethinking the System

At the beginning of this second design phase, a new class called “Tools for Well-Being” was offered at the MIT Media Lab, co-instructed by Professor Pattie Maes, founder and director of the Fluid Interfaces Group at the MIT Media Lab, Professor Rosalind Picard, and Tinsley Galyean, Program Co-director of Young Peace Leaders at the Dalai Lama Center for Ethics at MIT. This class raised many issues related to well-being that I had not considered in the first design phase, including the effects of food, exercise, and meditation, among others.

Meanwhile, I also consulted with several psychologists, psychiatrists, and psychotherapists. Among them, Chloe Mun-Yee Kuan, a clinical psychologist at North Dakota University, and Chen-Yin Wu, MD, MPH, a psychiatrist at Maimonides Medical Center in New York City, provided important information as I redesigned the whole system. Chloe shared her experience observing her patients’ appetites and their relationship to depression. She also introduced me to the field of current psychotherapies so that I knew where to find more information for designing effective interventions. Chen-Yin shared her clinical experience with bipolar episodes and their triggers. With their help, I was able to define a few important features of the app, such as:

  • What should be recorded, and how often, in order to produce meaningful and helpful visualizations,

  • What should be visualized (i.e., which data would be most helpful to see) to reveal possible patterns,

  • What should be included in the “therapist view” to facilitate the counseling process.


Experiment on Color & Emotion



Early Sketches



Paper Prototype

A) Redesigning the Self-Reporting Interface

During the last user test, people mentioned that the color spectrum was a bit overwhelming at first glance, so I tried to simplify it. The first solution I tried was to divide the spectrum into the six categories Ryberg listed. However, the descriptions Ryberg gave the colors did not fit this project because they were too complicated. I then looked for more research and was drawn to Paul Ekman's work. Ekman proposed that certain emotions appear to be universally recognized, even in preliterate cultures that could not have learned associations for facial expressions through media. He classified six emotions as basic: anger, disgust, fear, happiness, sadness, and surprise. As this scale of six emotions has been widely adopted, I decided to apply them to the color picker and came up with the first design. The picker first appears as a circle divided into six colors, each accompanied by a small text label naming the emotion it represents. When the user taps a color, the tapped section expands into three shades of that color, letting the user define how strong her feeling is. The lighter the shade, the milder the feeling.
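
To make the picker's structure concrete, below is a minimal TypeScript sketch of the six-emotion wheel and its three-shade expansion. The type names, the helper function, and the idea of storing shades as hex strings are my own assumptions for illustration, not the app's actual code.

```typescript
// Hypothetical model of the six-emotion color picker.
// Tapping a color expands it into three shades; the lighter the shade,
// the milder the feeling.

type Emotion = "anger" | "disgust" | "fear" | "happiness" | "sadness" | "surprise";
type Intensity = 1 | 2 | 3; // 1 = lightest shade / mildest feeling

interface PickerSection {
  emotion: Emotion;
  baseColor: string;                  // hex color shown on the wheel
  shades: [string, string, string];   // light to dark, one per intensity
}

interface EmotionSelection {
  emotion: Emotion;
  intensity: Intensity;
  color: string;                      // the shade the user actually tapped
}

// After a section expands, tapping one of its shades produces the selection.
function selectShade(section: PickerSection, intensity: Intensity): EmotionSelection {
  return {
    emotion: section.emotion,
    intensity,
    color: section.shades[intensity - 1],
  };
}
```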


This color wheel was later changed from six to eight categories to align with Plutchik's Flower. As in the previous version, when the user taps a color, she is presented with three shades of that color to describe the intensity of her feeling. The interaction did not change from the first phase: the user is led to the "Detail" screen after choosing a color (#4). Three minor changes were made: 1. the user can now attach a video in addition to a still image; 2. she can now describe who she is with at the moment; 3. she can now see the location data and name the place she is at. These features were designed to further engage the user and to add more diversity to the intervention library.
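
A rough sketch of what a single "check-in" record might contain after these additions follows; the field names for the new media, companion, and location details are hypothetical.

```typescript
// Hypothetical shape of one mood log ("check-in") after the Detail screen changes.

type PlutchikEmotion =
  | "joy" | "trust" | "fear" | "surprise"
  | "sadness" | "disgust" | "anger" | "anticipation";

interface MoodLog {
  timestamp: Date;
  emotion: PlutchikEmotion;
  intensity: 1 | 2 | 3;                               // shade of the chosen color
  note?: string;
  media?: { kind: "photo" | "video"; uri: string };   // video is new in this phase
  companions?: string[];                              // who the user is with at the moment
  location?: {
    latitude: number;
    longitude: number;
    placeName?: string;                               // the user's own name for the place
  };
}
```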

Sleep reporting was very simple in the first version because we relied on the sensor to collect all sleep data. Without the sensor, sleep reporting became more important and needed more detail. To create a sleep log, the user taps the plus button and moves her finger to "sleep" (#1). The app offers two ways for the user to report her sleep: 1. She can tap the "Go to sleep" button (#2) before she goes to bed; another window then pops up with an "I'm awake" button (#3), and she can go to sleep. After she wakes up, she taps "I'm awake" and is led to another screen where the times she went to bed and got up are auto-populated, and she can rate her sleep quality by filling the battery bar (#4). 2. If the user forgot to tell the app when she went to bed, she can use "add sleep record" to manually input the times and rate the quality of her sleep (#5). In either case, the user then taps "save" to store the log.
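
The two reporting paths could be modeled roughly as follows; the class name, method names, and the battery bar as a 0-to-1 number are illustrative assumptions, not the implemented logic.

```typescript
// Hypothetical sketch of the two sleep-reporting paths.

interface SleepLog {
  bedTime: Date;
  wakeTime: Date;
  quality: number; // fill level of the "battery" bar, 0 to 1
}

class SleepReporter {
  private pendingBedTime: Date | null = null;

  // Path 1: the user taps "Go to sleep" before bed...
  goToSleep(): void {
    this.pendingBedTime = new Date();
  }

  // ...and "I'm awake" after waking; both times are auto-populated,
  // so she only rates the quality before saving.
  imAwake(quality: number): SleepLog {
    if (this.pendingBedTime === null) {
      throw new Error("No pending sleep session; use addSleepRecord() instead.");
    }
    const log = { bedTime: this.pendingBedTime, wakeTime: new Date(), quality };
    this.pendingBedTime = null;
    return log;
  }

  // Path 2: the user forgot to tell the app and enters the times manually.
  addSleepRecord(bedTime: Date, wakeTime: Date, quality: number): SleepLog {
    return { bedTime, wakeTime, quality };
  }
}
```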

B) Redesigning the Visualization of the Data

The "Timeline" feature is changed to "Explore," as it is designed to help the user "explore" her emotional pattern. To do this, this feature offers 3 ways for the user to "explore" her logs. 1.Because the uneven bars in the previous version were easily misunderstood as a bar chart. I changed them to pin-like dots (#1) as each of them represents a "check- in" on the timeline. But I then realized the variation of height were unnecessary because the saturation of the color already indicates the intensity of the emotion. Thus I decided to bring the bars back but at same height (#2). Same as earlier version, the user can tap on each bar to see detail information (# 3). 2.To provide the user with another way to see the pattern, I explored different kinds of visualization with different information related to mood fluctuation, such as weather, location, work schedule, etc. I created a filter for the user to sort out logs with specific detail. For example, the user can choose to see only happy moments (# 4, #5) or to see how she usually felt at a specific location.

C) Modifying and Implementing Social Support Feature

The social support feature was not implemented in the previous phase because of time limitations. In this design phase, I looked into the technical aspects with my developer Yuwei. After assessing the development effort, we concluded that the Facebook API might be the easiest route at this moment. This did not mean the user's data would be shared on Facebook. Instead, the user would be able to invite someone who is on Facebook and also has this app installed on his phone. Once the friend accepted the invitation, they would be able to see each other's most recent mood logs on the "friends" screen (#8). If the user notices that a friend has been reporting dark colors recently, she can tap "hug" to the right of the logs to send that friend a hug. A user who receives a hug from a friend gets a notification on his phone.
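
A rough sketch of the friend and "hug" flow under these assumptions follows; it deliberately avoids any real Facebook SDK calls, and every type and function name is hypothetical.

```typescript
// Hypothetical app-side shapes for the friend/"hug" feature.
// Facebook would only supply identity for matching invitations.

interface FriendEntry {
  facebookId: string;          // used to match the accepted invitation
  displayName: string;
  recentColors: string[];      // most recent mood-log colors shown on the "friends" screen (#8)
}

interface Hug {
  fromFacebookId: string;
  toFacebookId: string;
  sentAt: Date;
}

// Tapping "hug" next to a friend's logs creates a hug record that the back end
// would deliver to the friend's phone as a notification (delivery not shown).
function sendHug(fromId: string, to: FriendEntry): Hug {
  return { fromFacebookId: fromId, toFacebookId: to.facebookId, sentAt: new Date() };
}
```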

D) Adding "Sync with Therapists" Feature

I designed this tool to help a user better understand herself and better live with her emotions. It should not replace traditional therapy until it has been proven effective on its own. So how can this tool work alongside existing therapy? After interviewing medical professionals about their working environments and needs, I decided to add this feature. To use it, the user sends an invitation to her therapist, who then receives an email with a link to view the user's self-reporting logs. To accommodate therapists' work habits, a website, rather than a mobile app, was designed for them to review their clients' status.
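
One way the emailed invitation could be modeled is with a tokenized web link, sketched below; the token scheme, placeholder URL, and names are assumptions rather than the actual implementation.

```typescript
// Hypothetical "Sync with Therapists" invitation record.
import { randomUUID } from "crypto";

interface TherapistInvitation {
  clientId: string;
  therapistEmail: string;
  viewToken: string;   // embedded in the emailed link
  viewUrl: string;     // the therapist opens this in a browser, not the mobile app
}

function createInvitation(clientId: string, therapistEmail: string): TherapistInvitation {
  const viewToken = randomUUID();
  return {
    clientId,
    therapistEmail,
    viewToken,
    // example.com is a placeholder domain for the therapist-facing website
    viewUrl: `https://example.com/therapist/view?token=${viewToken}`,
  };
}
```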


Prototyping, Testing and Things Learned


The interface usability test was done with 15 participants, with and without bipolar disorder or depression. A few conclusions from the tests:

  • Everyone has a different way of describing his/her feelings, which makes the self-reporting interface a big challenge; it is hard to find one scale that fits everyone. For example, the color scale has evolved through five versions, and although fewer and fewer users were confused by the color-picking step, some still took a relatively long time to figure out how it works. The same applies to appetite: everyone describes his/her appetite differently, not just as good or bad. One bipolar user said she would prefer that the app ask what kind of food she is craving right now, because that usually correlates with her current mood.
  • To understand an individual's patterns of what can trigger good or bad moods, data from a longer period of time is required. A big challenge for this app is how to keep the user engaged for as long as possible.
  • Though the Internet of Things is becoming more and more feasible, there are still many problems in connecting different sensors and systems. It requires stronger back-end technology than I expected at the beginning.

Special thanks to Rosalind W. Picard, Brian Lucid, Yu-Wei Chang, Akane Sano, Chloe Mun Yee Kwan, Javier Hernandez, Chen-Yin Wu and Anne Welch.