Viv, the future of Intelligent Agents and Empathic Data Systems (#WiC)
This week, Viv and the future of intelligent agents, as well as an empathic data system that adjusts its visualisation of large data sets to what it feels – senses – ‘we can handle’. There’s also computers as storytellers, Google’s Fit SDK release and the SexFit wearable. Talks and papers on context fusion behaviour analysis ontologies, automotive typography design and security concerns in Android mHealth apps.
This Week in Context
Your Weekly Update on All Things Context, August 15 2014
“The Rift succeeds in creating presence because it brings multiple important sensory cues into “sync”, and this tips our sensorimotor system into a new posterior mode. That is, the most likely explanation that our brain is able to find for the sensory input it is receiving corresponds to the one provided by the simulation – and once the brain “buys in”, it does so enthusiastically.”
– Beau Cronin
‘The Oculus Rift Is An Applied Neuroscience Powerhouse’
Featured: Viv & the Future of Intelligent Agents
Viv Labs is working on an advanced form of AI that will be able to teach itself, giving it almost limitless capabilities. Viv – their ‘Samantha’ – will be able to use your personal preferences and a near-infinite web of connections to answer almost any query and perform almost any function. Their aim is an artificial intelligence that is not only blindingly smart and infinitely flexible but omnipresent. Viv should become a utility that ‘lives’ in your laptop, your phone, your car, … .
Unlike Siri and Google Now – whose functionality is limited to what they are programmed for – Viv can generate its own code on the fly. Wired explains:
Take a complicated command like “Give me a flight to Dallas with a seat that Shaq could fit in.” Viv will parse the sentence and then it will perform its best trick: automatically generating a quick, efficient program to link third-party sources of information together—say, Kayak, SeatGuru, and the NBA media guide—so it can identify available flights with lots of legroom. And it can do all of this in a fraction of a second.
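Wired’s example can be pictured as automatically composing a small pipeline over third-party sources. The toy Python sketch below is purely illustrative – the data, source names and legroom heuristic are all invented, and it says nothing about how Viv actually works internally:

```python
# Toy illustration of chaining data sources to answer a composite query.
# All data below are mock stand-ins, not real APIs or real figures.

FLIGHTS = [  # pretend Kayak results: (flight number, aircraft type)
    ("DAL123", "B737"), ("DAL456", "A321"), ("DAL789", "E175"),
]
LEGROOM_IN = {"B737": 31, "A321": 33, "E175": 29}  # pretend SeatGuru seat pitch
PLAYER_HEIGHT_IN = {"Shaq": 85}  # pretend NBA media guide

def required_legroom(person):
    # Invented heuristic: taller passengers need more seat pitch.
    return 30 + (PLAYER_HEIGHT_IN[person] - 72) // 4

def flights_fitting(person):
    # The "generated program": join flights -> aircraft -> legroom -> person.
    need = required_legroom(person)
    return [f for f, plane in FLIGHTS if LEGROOM_IN[plane] >= need]

print(flights_fitting("Shaq"))
```

The point of the sketch is the composition step: each lookup on its own is trivial, and the hard part Viv claims to automate is wiring them together from a natural-language request.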
The Collective Experience of Empathetic Data Systems
CEEDS – the Collective Experience of Empathetic Data Systems, a consortium of 16 different research partners across nine European countries – are developing an interactive system which not only presents data the way you like it, but also changes the presentation constantly in order to prevent brain overload.
The BrainX3 system uses virtual reality to let a user ‘step inside’ large datasets. A panoply of sensors allows the ‘immersive multi-modal environment’ to tailor how the information is presented, constantly adapting to the user’s reactions – such as gestures, eye movements or heart rate – as they examine the data.
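The adaptation loop CEEDS describes boils down to a feedback rule: estimate cognitive load from physiological signals and dial back detail when it rises. A minimal Python sketch, with invented signals and thresholds (BrainX3’s actual models are far more sophisticated):

```python
def cognitive_load(heart_rate_bpm, gaze_shifts_per_s):
    """Crude load estimate in [0, 1] from two mock signals.
    The 60-120 bpm and 5 shifts/s ranges are invented for illustration."""
    hr = min(max((heart_rate_bpm - 60) / 60, 0), 1)   # 60-120 bpm -> 0..1
    gaze = min(gaze_shifts_per_s / 5, 1)              # 5 shifts/s -> 1
    return 0.6 * hr + 0.4 * gaze

def detail_level(load, max_points=10_000):
    """Show fewer data points as estimated load rises (never below 20%)."""
    return round(max_points * (1 - 0.8 * load))

# A moderately stressed user gets a thinned-out view of the dataset.
print(detail_level(cognitive_load(90, 1)))
```

In the real system this loop would run continuously, re-rendering the visualisation as the sensed signals change.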
Computers Are Not Good Storytellers
Whilst Netflix crunching data about what makes a series we love to watch might have had a lot to do with House of Cards being an instant hit, telling a good story is not a machine’s forte, as these AI-generated narratives show. (This reminds me of Julie Steele’s talk about data science and storytelling: “Data science without telling a story is like building bridges in the ocean.” Her O’Reilly talk on Storytelling in the Age of Big Data is now available on YouTube.)
Google releases Fit SDK
Google has released a preview Fit software development kit (SDK) that developers can use to start building apps. Devs get access to three sets of APIs: Sensor APIs for access to biometric sensors, a Recording API that lets an app collect user fitness data in the background and sync it to the cloud, and a History API that allows apps to read, insert, or delete user health data in the Google Fit cloud.
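For a feel of how those three roles divide the work, here is a conceptual in-memory mock in Python – the class, method names and data layout are invented for illustration only; the real SDK is a Java API on Android:

```python
# Conceptual mock of the Google Fit API roles (NOT the real SDK):
# live sensing/recording feeds a store; history queries read it back.
import time

class MockFitStore:
    def __init__(self):
        self.points = []  # (timestamp, data_type, value)

    # Recording-style role: push a sensed sample into the store.
    def record(self, data_type, value, timestamp=None):
        self.points.append((timestamp or time.time(), data_type, value))

    # History-style role: read samples of one type over a time range.
    def read(self, data_type, start, end):
        return [(t, v) for t, d, v in self.points
                if d == data_type and start <= t <= end]

    # History-style role: delete samples of one type over a time range.
    def delete(self, data_type, start, end):
        self.points = [p for p in self.points
                       if not (p[1] == data_type and start <= p[0] <= end)]

store = MockFitStore()
store.record("step_count", 120, timestamp=1000)
store.record("step_count", 80, timestamp=2000)
store.record("heart_rate", 72, timestamp=1500)
print(store.read("step_count", 0, 3000))   # both step samples
store.delete("step_count", 0, 1500)
print(store.read("step_count", 0, 3000))   # only the later sample remains
```

The split matters for battery life: recording can run as a low-power background subscription, while history reads happen only when an app actually needs the data.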
SexFit: Quantified Penis
The SexFit, on the other hand, is a rather personal wearable. It will come with built-in wifi & bluetooth connectivity, a vibration motor and, of course, an accelerometer. It syncs with your smartphone so you can check reports and calories burned, and for more real-time reporting, PCMag writes, there is a series of performance light indicators: “Yes, it comes with a series of ‘performance light indicators,’ in case you need to look down and get a visual reminder of just how you’re doing in the sack. Maintain a good rhythm, and all five lights will glow nice and bright.” The SexFit should hit the shelves sometime in 2015.
Papers, Talks & Research
- Ontology Based Context Fusion For Behaviour Analysis and Prediction (activity recognition, context-awareness, semantics, paper)
- Feel Effects: Enriching Storytelling with Haptic Feedback (haptics, output, paper)
- A User-Study on Context-aware Group Recommendation for Concerts (recommendation engines, context-awareness, paper)
- An Extraction Method of Acoustic Features for Music Emotion Classification (emotion detection, music, paper)
- Security Concerns in Android mHealth Apps (security, android, mHealth, paper)
- Are We There Yet? Feasibility of Continuous Stress Assessment via Wireless Physiological Sensors (emotion detection, sensors, ubiquitous computing, paper)
- From Talking and Listening Robots to Intelligent Communicative Machines (intelligent computing, AI, excerpt)
- Balancing the design and science of typography for the automotive space (design, typography, automotive, webinar recording)
Enjoy the weekend, see you next week!
(Well, if you subscribe to the Week in Context here.)