If you’re interested in Quantified Self, own an Android smartphone and were just as disappointed as I was that the MoodScope app is not publicly available yet, you should definitely add the ‘EmotionSense’ app to your self-tracking toolbox.
Emotion Sense is an Android application developed by researchers at the University of Cambridge, and although it does not do mood predictions, it does allow you to explore how your mood relates to the sensor data your smartphone captures as you carry it throughout the day; it puts your many moods ‘into context’.
Even smartphones notice the circumstances that affect our moods
Earlier research by the Emotion Sense team, which in addition to self-reports also included voice analysis against the Emotional Prosody Speech and Transcripts Library, yielded some interesting suggestions about how circumstances may affect our emotional state. Find the full paper, EmotionSense: A Mobile Phones based Adaptive Platform for Experimental Social Psychology Research, here.
Location appeared to have a pronounced effect on the users’ state of mind. “Happy” emotions dominated the data when they were in residential locations (45% of all emotions recorded), but in workplaces “sad” emotions became the norm (54%). The researchers also found that users exhibited more intense emotions in the evening than in the morning and that people tended to express their emotions far more in smaller groups than in larger crowds.
Curious whether this goes for you too? Just download the EmotionSense app from Google Play. Using the tracking app is fairly straightforward. A few times a day, Emotion Sense will actively enquire about your mood. You simply plot how you’re feeling on the Affect Grid – very similar to MoodScope’s circumplex mood model. The horizontal axis represents valence (are you feeling negative or positive?), and the vertical axis represents your level of activeness (arousal). After inputting how you currently feel, you also get a short survey: a quick two questions when the app prompts you for mood input, and ten if you spontaneously decide to tell Emotion Sense all about how you’re feeling.
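To make that grid a bit more concrete, here’s a minimal sketch of how a point on a valence/arousal grid maps to rough circumplex quadrant labels. This is my own illustration, not EmotionSense code; the function name, value ranges and labels are all hypothetical:

```python
def describe_mood(valence: float, arousal: float) -> str:
    """Map an Affect Grid point to a rough circumplex quadrant.

    valence: -1.0 (negative) .. 1.0 (positive) -- the horizontal axis
    arousal: -1.0 (sleepy)   .. 1.0 (activated) -- the vertical axis
    """
    if valence >= 0 and arousal >= 0:
        return "excited"    # feeling positive and activated
    if valence >= 0:
        return "relaxed"    # feeling positive but calm
    if arousal >= 0:
        return "stressed"   # feeling negative and activated
    return "depressed"      # feeling negative and calm


print(describe_mood(0.7, 0.5))   # positive valence, high arousal
print(describe_mood(0.6, -0.4))  # positive valence, low arousal
```

Tapping anywhere on the grid in the app effectively records such a (valence, arousal) pair.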
As you continue to use Emotion Sense, you’ll unlock different comparison views between your moods and the smartphone’s sensor data. You start out with time of day (morning, afternoon, evening, night). Next up there’s your location (home, work, …), then your SMS patterns, followed by the phone’s accelerometer (which can be used to detect your activity), its screen (brightness of your environment?*), its microphone (voice analysis, or environmental sound levels?*) and call patterns.
Your feelings, your data
By using Emotion Sense, you agree that your data will be used to continue research on how our moods relate to our smartphone usage patterns and the environment as recorded by its sensors. However, that personal data will be anonymised and aggregated before being published, so nobody needs to know you have a bit of a temper in the mornings too.
Although there’s no automated way to export your moods and related context data yet, the EmotionSense team promises to provide you with all the raw data collected from your phone if you contact them. This means nothing is stopping you from combining EmotionSense’s mood & sensor data with other data sources such as the weather, your workouts, your daily coffee consumption, … in a search for correlations.
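As a sketch of what such a correlation hunt could look like once you have the raw data: the snippet below joins a hypothetical mood export with a hypothetical weather log by date and computes how strongly they move together. The file layout, column names and values are all assumptions of mine, not EmotionSense’s actual export format:

```python
import pandas as pd

# Hypothetical data standing in for an EmotionSense export and a weather log.
moods = pd.DataFrame({
    "date": pd.to_datetime(["2013-05-01", "2013-05-02", "2013-05-03", "2013-05-04"]),
    "valence": [0.2, 0.6, -0.1, 0.5],  # daily average mood valence
})
weather = pd.DataFrame({
    "date": pd.to_datetime(["2013-05-01", "2013-05-02", "2013-05-03", "2013-05-04"]),
    "sunshine_hours": [2.0, 7.5, 1.0, 6.0],
})

# Join the two sources on date, then compute the Pearson correlation.
merged = moods.merge(weather, on="date")
print(merged["valence"].corr(merged["sunshine_hours"]))
```

The same pattern works for any other daily series – workouts, coffee consumption – as long as both sides share a date column. (Correlation is not causation, of course, but it tells you where to look.)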
Build your own
EmotionSense is good news not only for quantified selfers, but for developers too. The code used in Emotion Sense to collect sensor data from people’s phones is being made available as open source, so that other researchers can conduct their own experiments and build their own apps.
Technology-assisted Introspection and Reflection
Personally, I do not believe in ‘pocket therapy’. Yet after a few days of use, and despite having only unlocked ‘time of day’ as context so far, I have no doubt Emotion Sense can give you some additional insights into your own feelings. I, for instance, seem even grumpier in the mornings than I thought, and despite the traffic jams, my evening commute makes me more relaxed. Actively thinking about how I feel has been a pleasant experience in itself. There’s definitely something to be said for introducing more introspection and reflection into our lives again, and technology can assist us with that.
And if one day technology such as EmotionSense were able to automatically detect, or even predict, our moods, would that not be great? Third-party apps could act on that knowledge in positive ways.
Jini already cheers me up when, after only a few hours of sleep, she wishes me a good morning and makes a tongue-in-cheek remark about just how terrible Mondays are – even though I know quite well what she’ll say under those circumstances, as I wrote it. I’m not alone in that: some of the most enthusiastic responses we’ve gotten so far concern those ‘impromptu’ observations on a person’s behaviour, leading me to believe we could all do with some assistance, and prompts, when it comes to reflection. I’m quite sure I could live with – and would even embrace – a trusted app pointing out to me in a lighthearted way that I’ve been a tad cross for the last few hours, but that it will probably pass by noon.
Furthermore, on a much larger scale, automated emotion detection could not only inform brands about what makes us happy and when it is the right time to talk to us (or, more importantly to me, when they should not interrupt), but also give city officials a view into how their residents perceive a city.
For example, they could compare aggregated ‘mood’ at different train stations and – after factoring in other data, of course – use that mass-sentiment data to see which ones need improvement first. A recreation area that has everybody leaving just a bit more content might be doing something worth looking into.
Install it already!
So please do install EmotionSense, and once in a while tell it how you feel. You’ll be doing the Cambridge team of scientists a big favour, and me too, as I look forward to my phone getting to know me better, so it can serve me better.
* I’m not quite sure about these, as I haven’t unlocked those ‘reporting levels’ yet. ‘Phone screen’ could mean the amount of time you’re actively engaging with your smartphone while the screen is on, but it could also measure the light levels of your environment, as that sensor was originally put into our smartphones to facilitate automatic adjustment of screen brightness. The same goes for the microphone, which could be used for sentiment analysis on your calls, or to measure background noise levels, which is normally done to optimise our calls.
EmotionSense has been available since May 2013, so if you started earlier than I did and already have access to those screens, I’d love to see them. Just tweet a screenshot to @vintfalken, or (if you’re more privacy-inclined) share it with me by mentioning me on Google+.