The field of Emotional Data is becoming increasingly important for both the tech and creative industries. Driven by recent developments in Machine Learning and AI techniques, computers can now identify and process human feelings and emotions with greater precision than ever before. Companies and markets have an obvious interest in understanding and managing how we feel online about different things and situations.
But what challenges do we face when computers can identify emotional content in real-time gestures and expressions, in images, and in sound? What happens, for instance, when the digital assistants on our smart devices, like Apple's Siri or Amazon Echo's Alexa, can predict our information and buying needs based on the tone of our voice? And what does it mean for basic human communication when our social relationships are guided by algorithms that determine which friends make us feel best?
These are some of the topics that will be introduced and discussed in this talk. It will present a range of experimental, artistic, and leading-edge projects on Emotional Data, including recent field observations from art and tech labs in Montreal and Boston.