Emotional Intelligence in Artificial Agents: Leveraging Deep Multimodal Big Data for Contextual Social Interaction and Adaptive Behavioral Modelling

Authors

  • Jeevani Singireddy, Sr. Software Engineer, Intuit Inc.
  • Botlagunta Preethish Nandan, SAP Delivery Analytics, ASML
  • Phanish Lakkarasu, Sr. Enterprise Application Developer
  • Venkata Narasareddy Annapareddy, Sr. Enterprise Application Developer
  • Jai Kiran Reddy Burugulla, Senior Engineer

DOI:

https://doi.org/10.63278/1489

Keywords:

Human-Centered Intelligent Systems, User-State Modeling, Socially Adaptive Agents, Socio-Emotional Intelligence, Multimodal Interaction Analysis, Artificial Social Agents, Autonomous Agent Behavior, Context-Aware Intelligence, Human-Agent Co-Performance, Virtual Avatars, Social Robots, Emotional Expression Recognition, Interaction Context Awareness, Affective Computing, Deep Behavioral Modeling, Adaptive Virtual Assistants, Online Social Environments, Gaming AI, Customer Service Bots, Mental State Inference.

Abstract

As artificial agents develop beyond mere tools and begin to perform roles traditionally associated with humans, expectations of their performance are evolving in step. Agents must not only accomplish their tasks but also do so in a manner that observers consider socially or contextually appropriate. For social interactions in which the agent and human are co-performers, adherence to social cues that signal emergent aspects of a relationship, such as intimacy or status, is paramount to the experience of the interacting humans. For autonomous agents that function alone, adaptive behavioral modeling and user-state awareness are critical to the impact of the agent's actions on humans. Such contextual social behavior is a requirement for complex applications, including physically embodied social robots, virtual avatars in gaming, online social environments, and customer service interactions, and proactive virtual assistants. Humans have sophisticated socio-emotional capacities that enable them to behaviorally coordinate their interactions with others, inferring mental states that may lie far beyond explicitly observable cues. Furthermore, emotional expressions are multimodal and result from a complex interaction between inherent affective states and the interaction context. The Human-Centered Intelligent Systems conceptual framework describes a pathway by which artificial agents may achieve aspects of this intelligence through rich user-state modeling based on deep multimodal analysis of big data that captures social behavior and interaction context. In this chapter, we describe this "user-state" modeling approach and exemplify its applicability to a spectrum of agent applications.
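To make the user-state modeling idea concrete, the sketch below shows one common way such multimodal estimates can be combined: confidence-weighted late fusion of per-modality affect readings (face, voice, text) into a single user-state score. This is a minimal illustrative sketch, not the chapter's method; the `ModalityReading` type, the valence scale, and the fusion rule are all assumptions introduced here for clarity.

```python
from dataclasses import dataclass


@dataclass
class ModalityReading:
    """Hypothetical affect estimate from one input channel."""
    name: str
    valence: float      # estimated affective valence in [-1, 1]
    confidence: float   # model confidence in [0, 1]


def fuse_user_state(readings):
    """Confidence-weighted late fusion of modality-level valence estimates.

    Modalities the upstream models trust more contribute more to the
    fused user-state; with no evidence we fall back to a neutral prior.
    """
    total = sum(r.confidence for r in readings)
    if total == 0:
        return 0.0
    return sum(r.valence * r.confidence for r in readings) / total


# Example: a smiling face, mildly positive prosody, slightly negative text.
readings = [
    ModalityReading("face", valence=0.6, confidence=0.9),
    ModalityReading("voice", valence=0.2, confidence=0.5),
    ModalityReading("text", valence=-0.4, confidence=0.3),
]
state = fuse_user_state(readings)
```

In a real agent, the fused score would feed the adaptive behavior policy; richer schemes (e.g., learned fusion networks or context-conditioned weights) follow the same interface.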

How to Cite

Jeevani Singireddy, Botlagunta Preethish Nandan, Phanish Lakkarasu, Venkata Narasareddy Annapareddy, and Jai Kiran Reddy Burugulla. 2025. “Emotional Intelligence in Artificial Agents: Leveraging Deep Multimodal Big Data for Contextual Social Interaction and Adaptive Behavioral Modelling”. Metallurgical and Materials Engineering 31 (4):599-615. https://doi.org/10.63278/1489.

Section

Research