This Week in Context – 20140307

Your Car on iOS. Or Android. Or Linux.

On Monday, Apple announced its new in-vehicle infotainment system ‘CarPlay’. One of the IVI’s key features? Advanced, context-aware intelligence. Apple’s CarPlay will use information already on your other iOS devices – email, text messages, contacts and calendars – to predict where you most likely want to go: it recommends addresses rather than making you input your destination yourself. And of course, with CarPlay, Siri is there to help you place a call or dictate a text, so you can keep your eyes on the road and your hands on the steering wheel. For starters, you’ll be able to get CarPlay if you buy a Ferrari, Honda, Hyundai, Mercedes-Benz or Volvo vehicle.
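
To make the idea concrete, here is a minimal sketch of that kind of destination prediction. This is not Apple’s implementation – the data structures, function names and addresses below are hypothetical – just an illustration of how calendar entries and past trips could be turned into suggestions:

```python
from collections import Counter
from datetime import datetime

# Hypothetical inputs: upcoming calendar events that carry a location,
# plus a log of destinations the driver has visited before.
calendar_events = [
    {"title": "Dentist", "location": "12 Main St", "start": datetime(2014, 3, 7, 14, 0)},
]
past_trips = ["Office, 1 Infinite Loop", "Office, 1 Infinite Loop", "Gym, 5 Oak Ave"]

def suggest_destinations(now, events, trips, limit=3):
    """Rank likely destinations: imminent calendar locations first,
    then the places the driver visits most often."""
    suggestions = []
    for event in sorted(events, key=lambda e: e["start"]):
        if event["start"] > now and event.get("location"):
            suggestions.append(event["location"])
    for place, _count in Counter(trips).most_common():
        if place not in suggestions:
            suggestions.append(place)
    return suggestions[:limit]

print(suggest_destinations(datetime(2014, 3, 7, 9, 0), calendar_events, past_trips))
# ['12 Main St', 'Office, 1 Infinite Loop', 'Gym, 5 Oak Ave']
```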

[Image: Apple iOS CarPlay]

The Android equivalent? That’s in the works too, and I imagine it will be a Google Now of sorts. As we mentioned in January, Google has teamed up with Audi, GM, Honda, Hyundai and NVIDIA to form the Open Automotive Alliance, and Cadillac, Toyota and Tesla are experimenting with Linux-based IVIs.

CarPlay probably means the end of the automotive industry’s manufacturer-branded IVIs, and another lock-in for users (if your car’s system runs on iOS, you’ll want a matching phone), but that might just be outweighed by the benefits: a fluent and uniform experience across devices (from phone to car), no more constant syncing of data, and – most importantly – no need to learn yet another interface.

(Fascinated by high-tech cars and the Internet of Things? Make sure you read about how Tesla “fixed its cars over the air”, saving its customers the hassle of having to take their cars to the dealership for a much-needed software fix.)

[Image: Emotion detection]

In the Mood for Emotions

Being aware of a person’s context also includes being aware of his or her mood, because knowing my mood means knowing how to interact with me. It means being aware of what I’d like to listen to, read, or even eat. Knowing – or sensing – my mood and emotions means you can deliver exactly what I want. Which is a pretty neat superpower to have, for both the music streaming industry and retail:

Music data intelligence platform The Echo Nest has been acquired by Spotify. The Echo Nest is known for powering a multitude of music discovery applications, matching music to moods of sorts (dreamy, bouncy, gloomy), and audio fingerprinting. Spotify will look to leverage The Echo Nest’s musical understanding and curation tools, not only to drive better music discovery for users, but also to let brands and partners build better music experiences for their audiences, The Next Web’s Emil Protalinski writes.
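
As a rough illustration of what ‘matching music to moods’ can look like – a toy sketch, not The Echo Nest’s or Spotify’s actual model, with invented attribute names and numbers – each track can be described by a few audio attributes and scored against a mood target in the same space:

```python
# Toy mood matcher: tracks and moods live in the same attribute space
# (all values below are invented for illustration).
tracks = {
    "Track A": {"energy": 0.9, "valence": 0.8, "tempo": 0.7},
    "Track B": {"energy": 0.2, "valence": 0.3, "tempo": 0.2},
    "Track C": {"energy": 0.4, "valence": 0.9, "tempo": 0.5},
}
moods = {
    "bouncy": {"energy": 0.8, "valence": 0.8, "tempo": 0.7},
    "gloomy": {"energy": 0.2, "valence": 0.2, "tempo": 0.3},
    "dreamy": {"energy": 0.3, "valence": 0.7, "tempo": 0.4},
}

def closest_tracks(mood, n=2):
    """Return the n tracks whose attributes sit closest to the mood target."""
    target = moods[mood]
    def distance(attrs):
        return sum((attrs[k] - target[k]) ** 2 for k in target)
    return sorted(tracks, key=lambda name: distance(tracks[name]))[:n]

print(closest_tracks("bouncy"))  # ['Track A', 'Track C']
print(closest_tracks("gloomy"))  # ['Track B', 'Track C']
```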

Meanwhile, Emotient has closed a $6 million Series B round and is opening up a private beta for Google Glass that detects users’ emotions in real time. It will first be tested in retail, to measure the success of campaigns or promotions. The software works by detecting subtle changes in facial expression patterns, and can measure sentiment (positive or negative) as well as emotions such as joy, sadness, surprise, anger, fear, disgust and contempt, for individuals or a crowd. More on Emotient on fastcompany.com. (Via Tine)
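
To illustrate the ‘individuals or a crowd’ part: assuming a face analysis step already yields per-face emotion scores (the labels and numbers below are invented, not Emotient’s output format), rolling them up into a crowd-level sentiment is a small exercise in averaging:

```python
# Hypothetical per-face emotion scores, as some classifier might emit them.
faces = [
    {"joy": 0.7, "surprise": 0.2, "anger": 0.05, "sadness": 0.05},
    {"joy": 0.1, "surprise": 0.1, "anger": 0.6, "sadness": 0.2},
    {"joy": 0.5, "surprise": 0.3, "anger": 0.1, "sadness": 0.1},
]
POSITIVE = {"joy", "surprise"}  # crude split into positive vs. negative emotions

def crowd_sentiment(faces):
    """Average positive-minus-negative score over all detected faces."""
    scores = []
    for face in faces:
        pos = sum(v for k, v in face.items() if k in POSITIVE)
        neg = sum(v for k, v in face.items() if k not in POSITIVE)
        scores.append(pos - neg)
    return sum(scores) / len(scores)

def dominant_emotion(face):
    """The single strongest emotion for one face."""
    return max(face, key=face.get)

print(round(crowd_sentiment(faces), 2))      # 0.27 -> mildly positive crowd
print([dominant_emotion(f) for f in faces])  # ['joy', 'anger', 'joy']
```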

Further readings:

  • The ‘Creation of a Habit Model from GPS Data and Algorithms for Providing Awareness Services’ by Nobuo Matsuo and Kazumasa Takami, on how to best use ‘big but personal data’ to serve context-aware reminders to users. The paper is part of the World Comp proceedings. 
  • “For the next wave of wearables, context will be king”, IBM’s Paul Brody writes on Gigaom. We fully agree that for wearables – and technology in general – to be smart, it should know about your day without having to ask. Yet such intimacy between technology and a user does not necessarily need to come solely from more independently intelligent wearables. It should also come from smarter use of the human-to-computer interaction data already out there. The way you handle your phone – the apps you use, your typing pattern, .. – can tell a lot about how you feel, and what you need your device to (not) do. (Submitted by Marco)
  • Marco also suggested we listen to Gigaom’s Internet of Things podcast with Jason Jacobs, CEO of Runkeeper. In “Will the smartphone eat the fitness tracker market?”, Jacobs discusses the wearables market, the future of fitness trackers, and how to build a business using data.
  • Jawbone released two new apps. The revamped Up app now draws correlations between your activity patterns and restfulness, and proactively suggests what you could do to get more quality sleep. And since you can’t think about sleep (or the lack of it) without thinking about caffeine, Jawbone is also rolling out Up Coffee. The app lets you – manually – track your caffeine intake and see when it’s out of your system (a rough sketch of that decay arithmetic follows after this list). The Up Coffee app is free, and currently exists only for iOS. (Maarten spotted this one.)
  • On the O’Reilly Radar, Andy Oram looks at the hurdles standing in the way of the Internet of Things promise, which prove to be more social than technical. In this overview, Oram asks the right questions – and brings up standardisation, personal autonomy, data collection, employment, and trust.
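
On that Up Coffee item: estimating when caffeine is ‘out of your system’ boils down to exponential decay. A back-of-the-envelope sketch – the five-hour half-life is a commonly cited average, not Jawbone’s actual model, and the 25 mg ‘sleep-ready’ threshold is made up:

```python
import math

HALF_LIFE_HOURS = 5.0  # commonly cited average caffeine half-life; varies per person

def caffeine_remaining(dose_mg, hours_elapsed, half_life=HALF_LIFE_HOURS):
    """Milligrams of caffeine still in the body after hours_elapsed."""
    return dose_mg * 0.5 ** (hours_elapsed / half_life)

def hours_until_below(dose_mg, threshold_mg=25.0, half_life=HALF_LIFE_HOURS):
    """Hours until a single dose decays below a (hypothetical) threshold."""
    return half_life * math.log2(dose_mg / threshold_mg)

print(round(caffeine_remaining(200, 6), 1))  # ~87.1 mg left six hours after a 200 mg coffee
print(round(hours_until_below(200), 1))      # ~15.0 hours until below 25 mg
```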
