In the future, when machines understand our emotions!

With the democratization of artificial intelligence, the evolution of human-machine relations raises many questions. What might such a future look like?

Artificial intelligence is one of the hottest topics of the moment, widely discussed at the South by Southwest (SXSW) conference and festival taking place in Austin, Texas (March 10-19, 2017). Its best-known consumer applications are connected assistants and chatbots, which are gradually becoming part of our daily lives.

Josselin Moreau, innovation expert at Lab SQLI, notes in an article published in Les Échos on March 14, 2017 that AI aimed at the general public "may well change the way we interact with machines." He believes this is still not very noticeable in France, and backs up his point by observing that in the United States nearly 20% of households are already equipped with Echo connected speakers.

The creator of the Siri assistant, Adam Cheyer, believes that after the era of the computer and then that of the mobile phone, we are entering "a new mode of human/machine interaction" that could lead to a "symbiosis between human intelligence and artificial intelligence." We are only at the beginning of such a shift, however: human voice interactions are complex and riddled with subtleties. It will therefore take quite a while to reach such a symbiosis since, unlike humans, the machine cannot (yet) express itself intuitively.

Substantial progress has nevertheless been made in language recognition, in other words the ability of machines to make sense of the words spoken by humans. For Sophie Kleber, executive director at Huge, a self-described digital agency, this is not enough. With the emergence of affective computing, however, we will enter another dimension.

Sophie Kleber explained this during her talk Designing emotionally intelligent machines: to establish strong relationships with humans, machines will have to understand human emotions, interpret them, use them, and even simulate them. Among the subtleties that words alone cannot convey are tone of voice, modulation, body language, and micro-expressions. There are three ways to capture them: visual recognition, voice recognition, and biometrics. While visual recognition is already at an advanced stage, biometrics is the area where the most progress remains to be made.
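To give a concrete flavour of what the "visual recognition" step involves at its simplest, here is a minimal sketch in Python. It uses OpenCV's stock Haar cascade to locate faces in a photo and then hands each cropped face to an emotion classifier. The classify_emotion function and the file name photo.jpg are hypothetical placeholders, not part of any system described above; in practice one would plug in a model trained on labelled facial expressions. Only the OpenCV calls reflect a real library.

# Minimal sketch of the "visual recognition" step in affective computing:
# locate faces with OpenCV, then pass each crop to an emotion classifier.
# classify_emotion() is a hypothetical placeholder, not a real library call.
import cv2

def classify_emotion(face_img):
    # Placeholder: a real system would run a trained model over the face
    # crop and return a label such as joy, anger, surprise or neutral.
    return "neutral"

def detect_emotions(image_path):
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

    # Stock frontal-face detector shipped with OpenCV.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    results = []
    for (x, y, w, h) in faces:
        face = img[y:y + h, x:x + w]
        results.append(classify_emotion(face))
    return results

if __name__ == "__main__":
    print(detect_emotions("photo.jpg"))

Voice recognition and biometrics would add further signals (intonation, heart rate, skin conductance) to the same kind of pipeline, which is precisely where the most progress remains to be made.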

Sources: Les Échos – Le Monde
