Apple just bought Emotient, a startup that uses Artificial Intelligence to read people’s emotions by analyzing their facial expressions. In October 2015, it also bought VocalIQ, another startup, which uses speech technology to teach machines to understand the way people speak. As usual, Apple did not disclose the reasons behind the acquisitions.
Why is Apple interested in such technologies? To me, the reasons are clear. If you have interacted with Siri, Apple’s virtual assistant for mobile devices, you have probably discovered how limited it is. My guess is that the company is trying to inject Siri with Artificial Emotional Intelligence (or some sort of Artificial Emotional Brain), in an attempt to make interactions with the system much more natural. The missing piece of the puzzle? Technology that understands not just our faces and tones of voice, but the implicit emotions hidden in the meaning of our words. If you haven’t yet, please have a look at my demo of Emotion Detection from Text.
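To give a feel for what detecting emotion from text can look like in its simplest form, here is a minimal lexicon-based sketch. The word lists and category names below are illustrative placeholders, not the actual method behind my demo or any system Apple may be building:

```python
from collections import Counter

# Tiny illustrative emotion lexicon (placeholder words, not a real resource).
EMOTION_LEXICON = {
    "joy": {"happy", "great", "love", "wonderful", "delighted"},
    "anger": {"angry", "furious", "hate", "annoyed", "outraged"},
    "sadness": {"sad", "unhappy", "miserable", "sorry", "disappointed"},
}

def detect_emotion(text: str) -> str:
    """Return the emotion whose lexicon words appear most often in the text."""
    words = text.lower().split()
    scores = Counter()
    for emotion, vocabulary in EMOTION_LEXICON.items():
        scores[emotion] = sum(1 for word in words if word in vocabulary)
    emotion, score = scores.most_common(1)[0]
    return emotion if score > 0 else "neutral"

print(detect_emotion("I am so happy with my new phone, I love it"))  # -> joy
```

Real systems go far beyond keyword matching, of course; capturing implicit emotion requires modeling context and meaning, which is exactly why this is still a hard problem.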
You can read more about Apple’s acquisitions here.