Can Artificial Intelligence understand emotions?

Olivia Brookhouse, 17 April 2020

When John McCarthy and Marvin Minsky founded the field of Artificial Intelligence in 1956, they were amazed that a machine could solve incredibly difficult puzzles faster than humans could. However, it turns out that teaching Artificial Intelligence to win a chess match is actually quite easy. What presents a real challenge is teaching a machine what emotions are and how to replicate them.

"We have now accepted after 60 years of AI that the things we originally thought were easy are actually very hard, and what we thought was hard, like playing chess, is very easy."
Alan Winfield, Professor of Robotics at UWE, Bristol

Social and emotional intelligence come almost automatically to humans; we react on instinct. Whilst some of us are more perceptive than others, we can easily interpret the emotions and feelings of those around us. This base-level intelligence, partly innate and partly learnt, tells us how to behave in certain scenarios. So, can this automatic understanding be taught to a machine?

Emotion AI

Although the name may throw you off, Emotion AI does not refer to a weeping computer that has had a bad week. Emotion AI, also known as Affective Computing, dates back to 1995 and refers to the branch of Artificial Intelligence that aims to process, understand and even replicate human emotions. The technology aims to improve natural communication between man and machine, creating an AI that communicates in a more authentic way. If AI can gain emotional intelligence, perhaps it can also replicate those emotions.
"How can [a machine] effectively communicate information if it doesn't know your emotional state, if it doesn't know how you're feeling, it doesn't know how you're going to respond to specific content?"
Javier Hernandez, research scientist with the Affective Computing Group at the MIT Media Lab

In 2009, Rana el Kaliouby and Rosalind Picard founded Affectiva, an emotion AI company based in Boston which specialises in automotive AI and advertising research. With the customer's consent, the user's camera captures their reactions while they watch an advertisement. Using "multimodal emotion AI", which analyses facial expression, speech and body language, the company can gain a complete insight into an individual's mood. Its 90% accuracy levels are thanks to diverse test sets of 6 million faces from 87 different countries, used to train deep learning algorithms. From a diverse data set, the AI learns which metrics of body language and speech patterns coincide with different emotions and thoughts. As with humans, machines can produce more accurate insights into our emotions from video and speech than from text alone.

Sentiment analysis, or opinion mining, is a subfield of Natural Language Processing: the process of algorithmically identifying and categorising the opinions expressed in a text to determine the writer's attitude towards a subject. This use case can be applied in many sectors, such as think tanks, call centres, telemedicine, sales and advertising, to take communication to the next level.

Whilst AI might be able to sort what we say into positive or negative boxes, does it truly understand how we feel, or the subtext beneath? Even as humans we miss cultural references, sarcasm and nuance in language, all of which can completely alter the meaning and therefore the emotions conveyed. Sometimes it is the things we leave out and don't say that reveal how we are feeling. AI is not sophisticated enough to understand this subtext, and many doubt it ever will be.

Can AI show emotion?
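One way to picture how a machine might both detect and mimic emotion is a toy lexicon-based sketch: classify a user's message as positive or negative, then choose a reply in a matching tone. This is a deliberate simplification with hand-written word lists and canned replies (all of them illustrative assumptions); real emotion AI systems learn these associations from large labelled datasets.

```python
# Toy sketch: score the sentiment of a message with a tiny word lexicon,
# then pick a reply whose emotional tone matches the user's mood.
# The word lists and replies below are illustrative, not from any real system.

POSITIVE = {"great", "love", "happy", "excellent", "thanks", "helpful"}
NEGATIVE = {"bad", "terrible", "hate", "angry", "broken", "frustrated"}

def detect_sentiment(message: str) -> str:
    """Crude polarity: count positive words minus negative words."""
    words = [w.strip(".,!?").lower() for w in message.split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

def empathetic_reply(message: str) -> str:
    """Mimic emotion by matching the reply's tone to the detected mood."""
    replies = {
        "positive": "Glad to hear it! Is there anything else I can help with?",
        "negative": "I'm sorry you're having trouble. Let's fix this together.",
        "neutral": "Thanks for your message. How can I help?",
    }
    return replies[detect_sentiment(message)]

print(empathetic_reply("My router is broken and I am frustrated"))
```

Notice what the sketch also demonstrates about the debate above: the programme produces a "caring" response without any understanding of what a router is or why the user is upset.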
In use cases such as telemedicine chatbots and call-centre virtual assistants, companies are investigating Emotion AI not only to understand customers' emotions but also to improve how these platforms respond to them. Being able to simulate human-like emotions gives these platforms and services more authenticity. But is this a true display of emotion?

AI and neuroscience researchers agree that current forms of AI cannot have emotions of their own, but they can mimic emotion, such as empathy. Synthetic speech also helps reduce the robotic tone many of these services operate with and convey more realistic emotion; Tacotron 2, by Google, is transforming the field with its simulation of human-like artificial voices.

So, if machines can, in many cases, understand how we feel and produce a helpful, even 'caring', response, are they emotionally intelligent? There is much debate within the field over whether a simulation of emotion demonstrates true understanding or remains artificial. Functionalism argues that if we simulate emotional intelligence then, by definition, the AI is emotionally intelligent. But sceptics question whether the machine truly "understands" the message it is delivering, in which case the simulation is no proof that the machine is actually emotionally intelligent.

Artificial General Intelligence

Developing an Artificial General Intelligence, one which possesses a deeper level of understanding, is how many experts believe machines may one day come to experience emotions as we do. Artificial General Intelligence (AGI), as opposed to narrow intelligence, refers to the ability of computers to carry out many different kinds of activity, as humans do. Artificial Narrow Intelligence, as the name suggests, aims to complete individual tasks, but with a high degree of efficiency and accuracy. Emotional and social intelligence, forms of intelligence not necessarily tied to a set task or goal, fall under Artificial General Intelligence.
AGI aims to replicate those human qualities which, to us, seem automatic: they are not tied to an end goal; we do them simply because we do.

Conclusions

We are still many years away from an Artificial General Intelligence capable of replicating every action we can perform, especially those qualities we consider most human, such as emotions. Emotions are inherently difficult to read, and there is often a disconnect between what people say they feel and what they actually feel. A machine may never reach this level of understanding, but who is to say that the way we process emotions is the only way? How we interpret each other's emotions is full of bias and opinion, so maybe AI can help us get straight to the point when it comes to our emotions.

To stay up to date with LUCA, visit our webpage, subscribe to LUCA Data Speaks and follow us on Twitter, LinkedIn or YouTube.