This week the possibilities offered by wearables, facial detection & emotion recognition, the Internet of Things, and the replacement of NFC by low-energy Bluetooth piqued our interest. As usual, we were also intensively working on our context platform and the Summer of Context. If we’ve missed some important ‘contextualization’ news, bring us up to speed on Google+ or Twitter, or here in the comments.
- No, you can’t recreate the Moto X with Apps
Geek.com’s Russell Holly explains why third-party apps alone can’t recreate all of the Moto X’s features. Apps like Tasker, Dynamic Notifications, Open Mic+ and Utter are neat, but not a full replacement for the Moto X’s ‘context-awareness’. Although these mobile apps consult the same basic sensors (light, proximity, microphone, …), the Moto X has sensors that are optimized to detect more than just ‘what’s the light like, so I can adjust my screen brightness’, and that are designed especially for ubiquitous listening without too much battery drain. (via Marco)
- How Apple, Broadcom and Y Combinator will put a bullet in NFC’s head on September 10th
Can Low-Energy Bluetooth start a new phase in the context revolution? It certainly has several characteristics (low battery drain, low cost, sensor capabilities, …) that could drive novel user experiences. (David)
- Qualcomm Ventures Managing Director Quin Li is Excited About Wearables
In an interview at the TechCrunch Meetup + Pitch Up, Qualcomm Ventures’ Quin Li explains that he sees wearable devices as another extension of ‘data crowdsourcing’. Currently, this is mainly done through smartphones loaded with apps such as Waze and OpenSignal’s WeatherSignal, but if the folks creating these wearable devices can appeal to the mainstream, there’s no reason data won’t soon come from our socks.
- Google Glass app that reads human emotions could be used to improve the lives of people with autism
This article is a good example of how ‘privacy-invasive technologies’ definitely have their pros too. Of course, there’s an alternative ‘scary’ side to this: a world where every Glass you encounter automatically identifies you, logs your mood and submits that to a central database. Yet in general, I believe the world would become a better place if we were confronted with our own and other people’s feelings more often, and if machines were taught to take ours into account. Interesting quote from Beyond Verbal’s Dan Emodi on leveling up Siri’s EQ to make her more context- or emotion-aware: ‘If Siri understands not just what I say, but how I feel, it will come back with an answer that matches my mood. It’s adding a totally new dimension. It really could change the relationship we have between us and machines.’
We’ve also received some great feedback on our ‘Putting Music in Context’ whitepaper. If you want to find out how context-aware discovery can significantly improve the listening experience, download ‘Putting Music in Context’ for free.