Inferring Mood From Your Smartphone Routine
Computer systems are already capable of detecting human emotions. Just as our colleagues and friends do, machines only have to pay close attention to our facial expressions and the pitch of our voice. Emotion in voice signals can yield useful quality statistics for call centers, whilst schools are experimenting with facial recognition software to spot struggling students. However, because these technologies depend on sound recordings and video footage, they capture only fragments of our day – when we are actually making a phone call, or sitting in front of our laptop and staring straight ahead. Whilst such ‘emotion snapshots’ can be interesting on their own, to get a proper idea of how someone is feeling, it might be more compelling to look at their mood.
Mood differs from emotion in that a person’s mood is a reaction to a cumulative sequence of events, rather than a spontaneous reaction to one event. Mood is felt less intensely and lasts longer than emotion, remaining the same for hours or even days. But how do you constantly and accurately judge and log your mood? Enter Microsoft’s MoodScope research into analysing smartphone usage patterns and a user’s communication history and turning these into a “mood sensor.”
Mood is an affective state that plays a significant role in our lives, influencing our behavior, driving social communication, and shifting our consumer preferences. But in the digital realm of mobile devices, there is a distinct lack of knowledge about mood unless manually provided. While devices have many sensors to understand the physical world around them, they are unable to develop insight about the object that matters the most: the user. We consider the implementation of a mood sensor as a vital next step in enhancing the context-awareness of mobile devices.
In ‘MoodScope: Building a Mood Sensor from Smartphone Usage Patterns’, researchers from Rice University in Texas and Microsoft Research report that their general model currently estimates users’ daily mood averages with an accuracy of 66%, a number that gradually grows to 93% after a two-month personalized training period. An impressive rate, especially if you consider the scientists are not hooking electrodes up to their users’ brains; they are merely analysing smartphone usage for signs of ‘activeness’ and ‘pleasure’, which are then mapped onto the circumplex mood model to translate into moods such as ‘relaxed’ (low activeness, high pleasure), ‘bored’ (low activeness, low pleasure), and ‘excited’ (high activeness, high pleasure).
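The circumplex mapping mentioned above can be sketched in a few lines. This is an illustrative toy, not MoodScope’s actual implementation: the thresholds, score ranges, and label names are assumptions.

```python
# Toy sketch of circumplex-model quadrants: two scores, four coarse
# mood labels. Thresholds and labels are illustrative assumptions.

def circumplex_label(pleasure: float, activeness: float) -> str:
    """Map pleasure/activeness scores in [-1, 1] to a coarse mood label."""
    if pleasure >= 0 and activeness >= 0:
        return "excited"   # high pleasure, high activeness
    if pleasure >= 0 and activeness < 0:
        return "relaxed"   # high pleasure, low activeness
    if pleasure < 0 and activeness >= 0:
        return "tense"     # low pleasure, high activeness
    return "bored"         # low pleasure, low activeness

print(circumplex_label(0.6, -0.4))  # relaxed
```

A real model would place finer-grained labels around the circle rather than in four hard quadrants, but the principle – two continuous dimensions, discrete mood names on top – is the same.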
So what data generated by you (and your smartphone) is used to deduce your mood?
- Social Interactions – how often you contact your most popular contacts, in three categories: calls, email, and text messages.
- Routine Activity Records – patterns in your browser history, application usage, and location history:
  - usage of your ten most frequently used applications, and how often you use applications per category (e.g. Entertainment, Finance, Game, Office, Social, Travel, … );
  - browser history by domain (e.g. www.flickr.com);
  - how often you visit and return to a certain location.
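All of these data sources reduce to the same shape: counting how a day’s activity distributes over a small personal vocabulary (top contacts, app categories, frequent domains, frequent locations). A minimal sketch of that idea, with invented names and data:

```python
from collections import Counter

# Illustrative sketch (function and variable names are assumptions, not
# the paper's code): turn one day's usage events into a normalized
# frequency vector over a fixed per-user vocabulary.

def frequency_features(events, vocabulary):
    """Fraction of the day's in-vocabulary events attributed to each item."""
    counts = Counter(e for e in events if e in vocabulary)
    total = sum(counts.values()) or 1  # avoid division by zero on empty days
    return [counts[item] / total for item in vocabulary]

top_contacts = ["alice", "bob", "carol"]
calls_today = ["alice", "alice", "bob", "dave"]  # "dave" is not a top contact
print(frequency_features(calls_today, top_contacts))  # alice 2/3, bob 1/3, carol 0
```

Concatenating such vectors for calls, SMS, email, app categories, domains, and locations gives the usage-pattern features a mood model can be trained on.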
A Mood Inference Engine
MoodScope’s ‘Mood Inference Engine’ consists of two components: one resides on the smartphone to collect usage logs and mood labels, and the second lives in the cloud, where it adjusts the mood model based on the smartphone data. This way, the phone can ‘record’ a mood for the user without having to talk to the cloud.
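The division of labour can be sketched as follows. This is a hedged sketch under the assumption of a simple linear scoring model; the class and method names are invented, and MoodScope’s actual model may differ.

```python
# Minimal sketch of the phone-side half of the split (names invented):
# the phone holds a small set of model parameters and scores moods
# locally; the cloud periodically retrains and pushes fresh parameters.

class OnDeviceMoodModel:
    def __init__(self, weights, bias):
        self.weights = weights  # one weight per usage-pattern feature
        self.bias = bias

    def infer(self, features):
        """Local linear score, e.g. along the 'pleasure' dimension."""
        return self.bias + sum(w * x for w, x in zip(self.weights, features))

    def update(self, weights, bias):
        """Apply retrained parameters pushed down from the cloud."""
        self.weights, self.bias = weights, bias

model = OnDeviceMoodModel([0.5, -0.2], 0.1)
print(model.infer([1.0, 1.0]))  # ≈ 0.4
```

Because inference is just a cheap dot product, it can run on the phone at any time; only the heavyweight retraining needs the cloud.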
Processes for ‘mood detection’ run every three hours, so the background work causes only minimal battery drain. “Our measurements of the power consumption on the Android service reveal that we consume less than 400 mW during the logging and data processing. In total, over a day, the engine consumes approximately 3.5 milliWatt-hours, or less than 0.5% of a phone’s battery capacity,” the researchers write. A daily log is about 1MB, and is preferably transferred to the cloud service at night over wifi.
Possible Applications for User Mood State
MoodScope’s Inference Engine also exposes a simple API for developers, so they don’t need to ‘worry’ about machine learning or mood psychology. It is possible to query a user’s current and past moods, with scores for pleasure and activeness returned for the requested state. Possible uses of this ‘mood’ data?
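To make the idea concrete, here is what such an API could look like. To be clear, this is entirely hypothetical – the article does not document MoodScope’s real developer interface, so every name below is invented:

```python
# Hypothetical client sketch: query current and past pleasure/activeness
# scores. All names (MoodSample, MoodScopeClient, etc.) are invented
# for illustration and are not MoodScope's published API.

from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class MoodSample:
    timestamp: datetime
    pleasure: float    # -1.0 .. 1.0
    activeness: float  # -1.0 .. 1.0

@dataclass
class MoodScopeClient:
    _history: list = field(default_factory=list)

    def record(self, sample: MoodSample) -> None:
        self._history.append(sample)

    def current_mood(self):
        """Most recently inferred mood, or None if nothing is logged yet."""
        return self._history[-1] if self._history else None

    def history(self, since: datetime):
        """Past mood samples from `since` onward."""
        return [s for s in self._history if s.timestamp >= since]
```

An app would simply call something like `client.current_mood()` and branch on the two scores, leaving the modelling entirely to the engine.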
Mood logging & browsing – For those of you who love Quantified Self, a history of moods allows you to become more aware of mood changes, and to reflect on what might be causing them. As opposed to manual mood input, which is likely to be triggered by extremes, you’ll get an overview of all your moods. No doubt you’ll gain some new insights that allow you to improve your mental health.
Mood sharing – “Ann’s feeling a tad upset right now, are you sure you want to mail her this design critique?” But then again, I might not want the entire office, my mom or – even – my friends to be aware of my mood all the time, be it joyous or sad. So, I’ll pass for the automated mood sharing to social networks, or my mood being constantly visible to all through Glass. A coloured mood indicator during Hangouts, I might just be willing to give that a try.
Mood-enhanced applications – SoLoMo is so last year, and no system can be truly ‘context-aware’ if it does not take its user’s mood into account. Auto-adjusting playlists, the car that switches itself from ‘Speed’ to ‘Comfort’ driving preferences, and – ultimate fantasy – a freezer that checks whether there’s still ice cream when I’m feeling a bit blue.
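The auto-adjusting playlist is the simplest of these to imagine in code: branch on the two mood scores and pick accordingly. A toy sketch, with playlist names invented for illustration:

```python
# Toy mood-enhanced application: choose a playlist from the two
# circumplex scores. Playlist names are invented examples.

def pick_playlist(pleasure: float, activeness: float) -> str:
    if activeness >= 0:
        return "Upbeat Hits" if pleasure >= 0 else "Focus & Decompress"
    return "Sunday Acoustic" if pleasure >= 0 else "Comfort Classics"

print(pick_playlist(0.8, 0.7))  # Upbeat Hits
```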
Of the data sources listed above, making and receiving phone calls is the strongest indicator for determining mood, with application usage by category as runner-up. Emails and text messages come in third. Phone calls, categorized applications, and locations are often significant indicators of pleasure, appearing as positive indicators 55% of the time. Phone calls also tend to indicate an active mood (for example, tense or excited), whereas many incoming and outgoing emails are more likely to translate to negative activeness – a bored or calm mood.
Of course, there are a number of other variables that might affect a user’s mood and cannot be directly inferred from their smartphone usage – face-to-face arguments, stressful traffic conditions, the music your coworkers are playing in the office, and the weather, to name a few. There is still a long way to go before computers will be fully and flawlessly able to detect how we’re feeling, but MoodScope’s research shows once again how much opportunity there already is in the technology each and every one of us owns, with more of it emerging every day.
Especially if, on top of the ‘user behaviour measurements’, we add in some more context – weather APIs, traffic APIs, a simple ‘walking’ or ‘in a car’ user state, acoustic fingerprinting (a tad less battery friendly though), Facebook updates – you start to realize the future is not half as far away as the average science fiction writer would like you to believe.
If you had all the building blocks for constructing a ‘context-aware’ and ‘mood-enabled’ app, what new technology would you invent? Join the discussion at summerofcontext.com, and tell the Argus Labs context captains what you need.