PSA Group is using Sentiance technology to personalize the in-car experience


BARCELONA (SPAIN) – From 27 February to 2 March 2017, Sentiance technology can be seen in action at the Mobile World Congress (MWC). The PEUGEOT booth offers visitors the world’s first demonstration of its autonomous and connected vehicle, the PEUGEOT INSTINCT CONCEPT, which uses Sentiance technology. PEUGEOT will showcase how they use artificial intelligence…

A journey of a thousand miles begins with a single step…

Discover the capabilities of the Sentiance contextualization platform by downloading Journeys, our training, data-gathering and demo application. The app automatically creates your personal behavioral profile based on your activity patterns throughout the day, all by analyzing your phone’s sensor data. Download our improved Journeys app and discover the capabilities of sensor…

Why it’s time to put the real U in UBI

UBI should really stand for User-Based Insurance, not Usage-Based Insurance. This small change in definition would immediately expand the very narrow view of what UBI should mean for car insurers. UBI as defined today should actually be called VBI – Vehicle-Based Insurance. Most insurers only take into account how people handle their vehicle…

The semantics of time: why my morning isn’t necessarily yours

Targeting your users in the right context also means engaging them at the right moment. While some interactions make sense during a user’s morning routine, others are more relevant over lunch or in the evening. Time defines context, as different parts of the day trigger different needs and wants. However, time is…

A day in the life of …

Last year, FlowingData came up with a dynamic visualization of the average day in the life of Americans. The visualization was based on data from the American Time Use Survey, a set of questions measuring how much time people spend on various activities throughout their day. We at Sentiance thought this was a cool idea, so…

How AI drives the mobile contextual revolution


In “Predictions 2016: The Mobile Revolution Accelerates”, Forrester forecast that by the end of 2016 more than 25% of companies would consider mobile not as a channel, but as a fully integrated part of their overall strategy. Moreover, the research firm expects customer-focused companies to take personalization to the next level by extracting relevant…

Sentiance Raises $5.2 million in Series B Financing Round Led by Samsung Catalyst Fund

ANTWERP (BELGIUM), November 17, 2015 – Belgian startup Sentiance today announced that it has closed $5.2 million in its most recent round of funding. The new investment enables Sentiance to strengthen its focus on the Internet of Things. Sentiance provides an intelligent software layer that converts mobile data into “smart life” opportunities for end-users…

Sentiance announces Smart Cities Alliance with the acquisition of i-KNOW (UGent) MOVE IP and Platform

Toon Vanparys, Frank Verbist and Sidharta Gautama

Sentiance is excited to announce that it is entering a research and operational alliance with the Innovation Centre i-KNOW of Ghent University. The alliance will combine their expertise in mobile sensing, machine learning and intelligent mobility to create a partnership that is an innovation leader in next-generation technology solutions in the mobility, health, retail,…

Advertising Technology Expert Frank Verbist on Emotion-Aware Mobile Marketing

Digital advertising technology enthusiast Frank Verbist has twenty years of experience in technology development and startups. Six of those years he spent at IgnitionOne, leading the team that built one of the first user-profiling and real-time bidding platforms. Frank is a co-founder of the early-stage technology investment fund Strike4 and a member of the board for mobile…

Mood for Music: Emotion Recognition on Acoustic Features

Our emotional response to a music fragment depends on a large set of external factors, such as gender, age, culture, and context. With these external variables set aside, however, humans consistently categorise songs as happy, sad, enthusiastic or relaxed. We’ve developed an algorithm that predicts how a song will be emotionally perceived.
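
As a rough sketch of the general idea (not the Sentiance algorithm itself; librosa, scikit-learn and all file names below are assumptions chosen purely for illustration), such a classifier could extract a handful of standard acoustic descriptors from labelled clips and learn to map them onto the four emotion categories above:

# Hypothetical sketch: emotion classification from acoustic features.
# NOT the Sentiance algorithm; library choices and file names are illustrative.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

EMOTIONS = ["happy", "sad", "enthusiastic", "relaxed"]  # categories from the post

def acoustic_features(path: str) -> np.ndarray:
    """Summarize a clip with a few standard acoustic descriptors."""
    y, sr = librosa.load(path, duration=30)  # first 30 s is enough for a sketch
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)
    rms = librosa.feature.rms(y=y)
    zcr = librosa.feature.zero_crossing_rate(y)
    # Mean and std of each descriptor over time -> fixed-length feature vector.
    return np.concatenate(
        [np.r_[f.mean(axis=1), f.std(axis=1)] for f in (mfcc, centroid, rms, zcr)]
    )

# Placeholder training data: (audio file, labelled emotion) pairs you would supply.
labelled_clips = [("clip_happy.wav", "happy"), ("clip_sad.wav", "sad")]

X = np.array([acoustic_features(path) for path, _ in labelled_clips])
y = np.array([EMOTIONS.index(label) for _, label in labelled_clips])

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Predict the perceived emotion of a new song.
print(EMOTIONS[clf.predict([acoustic_features("new_song.wav")])[0]])

With a real labelled dataset in place of the placeholders, the same pipeline would give a baseline, purely acoustic emotion classifier against which richer models can be compared.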