Empathy is the ability to put ourselves in someone else’s shoes. As Artificial Intelligence rises and smarter products take on more complex tasks, tackling empathy will become crucial. As the complexity of a problem increases, so does the range of needs different humans bring to it. Artificial Intelligence services will need to understand that humans are multidimensional, or they will never transform and differentiate themselves from the products and services of today.
If you own an Alexa, I am sure at some point you have muttered something completely unrelated, only for Alexa to chime in and play Coldplay as a suggestion. We’re now surrounded by hyper-connected smart devices that are autonomous, conversational and relational, yet completely devoid of any ability to tell how annoyed, happy or depressed we are. The problem is that such services do not understand your emotions, so they cannot react to your shouting, no matter how much you hate Coldplay.
Artificial Empathy could be the answer to this. What if systems and products sensed nonverbal behaviour in real time? Your car might notice that you’re tired and take the wheel. Your fridge might work with you on a healthier diet. Your wearable fitness tracker and TV might team up to get you off the sofa, and other products could start to sense changes in your mental health.
The basis of these technologies is already here. Facial tracking can detect whether you are smiling or frowning. Image recognition can estimate your body mass index. That extra understanding of what these signals mean to people, and of what will suit each individual best, is the final piece of the puzzle.