As companies continue to explore the potential of artificial intelligence (AI) across their operations, one emerging trend that has drawn attention is the use of AI to help bots understand human emotions. This field, known as “emotion AI,” is predicted to gain momentum in the coming years, according to PitchBook’s Enterprise SaaS Emerging Tech Research report.
The logic behind this trend is quite simple: if businesses are deploying AI assistants and chatbots as front-line representatives for sales and customer service, those AI systems need to understand and respond appropriately to human emotions. Traditional sentiment analysis, a technique that predates the current wave of AI, attempted to decipher human emotion from text, particularly in social media posts. Emotion AI aims to go a step further, employing a multimodal approach that combines visual, audio, and other sensor inputs with machine learning and psychology to detect human emotions during an interaction.
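In its simplest form, the older text-only approach scores words against a polarity lexicon. The sketch below is a toy illustration of that idea, not any vendor's method; the word lists and the scoring rule are invented for the example.

```python
# Toy lexicon-based sentiment scorer -- a minimal sketch of the
# text-only approach that emotion AI aims to move beyond.
# These word lists are illustrative, not a real sentiment lexicon.
POSITIVE = {"great", "love", "helpful", "fast", "happy"}
NEGATIVE = {"slow", "broken", "hate", "awful", "frustrated"}

def sentiment_score(text: str) -> float:
    """Return a score in [-1, 1]: below 0 is negative, above 0 positive."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    hits = [1 for w in words if w in POSITIVE] + \
           [-1 for w in words if w in NEGATIVE]
    return sum(hits) / len(hits) if hits else 0.0

print(sentiment_score("The support was great and fast"))    # 1.0
print(sentiment_score("I hate how slow and broken it is"))  # -1.0
```

The obvious limitation, and the motivation for multimodal emotion AI, is that a lexicon sees none of the tone, facial expression, or body language that accompanies the words.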
Major AI cloud providers, such as Microsoft Azure and Amazon Web Services, offer services that provide developers with access to emotion AI capabilities. For example, Microsoft Azure’s cognitive services include an Emotion API, while Amazon Web Services offers the Rekognition service. Although emotion AI as a cloud service is not new, its proliferation in the business world has been driven by the increasing presence of AI-powered bots in the workforce.
According to PitchBook’s report, emotion AI promises to enable more human-like interpretations and responses in AI assistants and automated human-machine interactions. The hardware side of emotion AI relies on cameras and microphones, which can be integrated into laptops, phones, or physical spaces. Moreover, wearable hardware is likely to provide additional avenues for applying emotion AI beyond these devices. This suggests that future customer service chatbots may request camera access to enhance their ability to analyze human emotions.
Recognizing the potential of emotion AI, several startups have entered the scene. These include Uniphore, MorphCast, Voicesense, Superceed, Siena AI, audEERING, and Opsis, each of which has raised modest sums from various venture capital firms. Their aim is to develop solutions that enable AI systems to understand and respond to human emotions more effectively.
However, despite the growing interest in emotion AI, the technology faces real challenges and limitations. Researchers have long raised doubts about its ability to determine human emotion from facial movements: a 2019 meta-review of studies concluded that a person’s emotional state cannot be reliably inferred from facial movements alone. This undercuts the assumption that AI systems can detect emotions by mimicking how humans read faces, body language, and tone of voice.
Furthermore, the regulatory landscape surrounding AI may pose obstacles for emotion AI. For instance, the European Union’s AI Act prohibits the use of computer-vision emotion detection systems for certain purposes, such as education. Some state laws, like Illinois’ Biometric Information Privacy Act (BIPA), also prohibit the collection of biometric readings without explicit permission. These regulations may limit the scope of emotion AI applications, particularly in sensitive areas like education.
Considering these challenges, the future of emotion AI remains uncertain. While some AI bots may attempt automated empathy for tasks such as customer service, sales, and HR, it is unclear whether that approach will work; AI systems may simply lack the capacity for tasks that require genuine emotional understanding. In that case, offices could end up populated by bots no more emotionally capable than 2023-era Siri, and the open question becomes which is worse: a bot that guesses at everyone’s feelings during meetings, or one that makes no attempt at emotional intelligence at all.
In conclusion, emotion AI is gaining traction as businesses look for ways to enhance the capabilities of their AI systems. The ability to understand and respond to human emotions is seen as a crucial aspect of AI’s integration into various roles within organizations. However, there are challenges associated with emotion AI, including the limitations of facial expression analysis and regulatory restrictions. These factors raise questions about the effectiveness and feasibility of emotion AI in practice. Nonetheless, the ongoing development of this technology offers an intriguing glimpse into the future of AI and its potential impact on human-machine interactions.