Affective computing, emotion-aware machines, ambient sensing… all terms that get thrown around a lot these days, but what do they really mean? Three Argus Labs employees set out to build a clear and fun example of what they could look like in a one-evening coding session. Check out our lights that listen for the office mood.
Place: Argus Labs HQ, Antwerp, Belgium
Team: Ben, Roel and David
Gear: A Raspberry Pi, a Wolfson Audio Card and a Philips Hue starter pack (containing a bridge and two lightstrips)
Goal: Lights that listen to your mood
Philips Hue was a breeze to set up: connect the bridge to our LAN, then power it and the strips up. We then wrote a small Ruby wrapper around the Hue REST API with functions to alter the strips’ colors, complete with fade-in/out transitions before the actual change.
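A minimal sketch of what such a wrapper can look like, assuming a bridge on the LAN and an API username already registered with it (the IP, username and light id below are placeholders, and `HueStrip` is our illustrative name, not necessarily the wrapper's real one). The bridge exposes each light's state as a REST resource you `PUT` JSON to, with `transitiontime` (in 100 ms steps) providing the fade:

```ruby
require 'net/http'
require 'json'
require 'uri'

# Thin client for one Hue light/strip, talking to the bridge's REST API.
class HueStrip
  def initialize(bridge_ip, username, light_id)
    @uri = URI("http://#{bridge_ip}/api/#{username}/lights/#{light_id}/state")
  end

  # Build the JSON state body: hue is 0..65535, sat/bri maxed out,
  # transitiontime counted in 100 ms steps (so 1 s fade == 10).
  def state_payload(hue, fade_seconds: 1.0)
    { hue: hue, sat: 254, bri: 254,
      transitiontime: (fade_seconds * 10).round }
  end

  # PUT the new state to the bridge.
  def set_hue(hue, fade_seconds: 1.0)
    req = Net::HTTP::Put.new(@uri)
    req.body = state_payload(hue, fade_seconds: fade_seconds).to_json
    Net::HTTP.start(@uri.hostname, @uri.port) { |http| http.request(req) }
  end
end
```

With that in place, `HueStrip.new('192.168.1.2', 'our-api-user', 1).set_hue(40000)` fades the first strip to a new colour over one second.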
OpenEAR, the open source emotion and affect recognition toolkit, was our affect recognition software of choice for the evening. We set it up to analyse what we’re talking about in the office, and mapped each resulting affect to a color. That color is then sent to the Philips Hue using our wrapper.
Our current affect-to-hue mapping:
class AffectMapper
  # Map an affect label from openEAR to a Philips Hue hue value (0..65535).
  # The label spellings (including 'aggressiv') are openEAR's own class names.
  def self.color(affect)
    case affect
    when 'aggressiv'   then 1
    when 'cheerful'    then 40000
    when 'intoxicated' then 10000
    when 'nervous'     then 35000
    when 'neutral'     then 65535
    when 'tired'       then 20000
    end
  end
end
Two shiny LED strips that immerse the room in a colourful glow matching what we’re saying, and how we’re saying it:
The next step is to get this running on the Raspberry Pi using the microphone on the Wolfson Audio Card, so we can install it in our lounge for permanent ambient lighting without requiring a laptop.
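On the Pi, capture could be as simple as shelling out to ALSA's `arecord` in short bursts and feeding each clip to openEAR. A sketch under those assumptions (the device name and clip length are placeholders, and `record_clip` is our hypothetical helper):

```ruby
# Build the arecord invocation:
# -D: ALSA device, -f S16_LE: 16-bit little-endian samples,
# -r 16000: 16 kHz sample rate, -d: clip length in seconds.
def arecord_command(path, seconds: 5, device: 'default')
  ['arecord', '-D', device, '-f', 'S16_LE', '-r', '16000',
   '-d', seconds.to_s, path]
end

# Record one short clip from the Wolfson card's microphone input.
def record_clip(path, seconds: 5, device: 'default')
  system(*arecord_command(path, seconds: seconds, device: device))
end
```

A loop calling `record_clip` and then the analysis step would keep the lounge lighting fresh every few seconds.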
Once we have this persistent setup in place, we can extend the hack: throw a few extra lightstrips into the mix, influence their colours based on what we’re saying on HipChat or what music we’re listening to, and much more when we’re feeling adventurous.