The streets of Thanjavur, a city steeped in South Indian culture, come alive with vibrant displays of Thalaiyatti Bommai – the head-shaking dolls. More than mere toys, these handcrafted figures embody a fascinating aspect of Indian nonverbal communication: the head nod. While many cultures interpret a head nod as a simple "yes" or acknowledgment, the Indian head wobble goes beyond that. This seemingly simple gesture carries layers of meaning and cultural significance depending on context, speed, and intensity.
Owing to the distinct power relations and hierarchies in Indian society, open disagreement is often considered disrespectful. The head nod thus became a way to navigate "grey areas," acknowledging someone's point without committing to complete agreement. The ability to both deliver and interpret the nod's subtleties is practised across India. While the head wobble varies by region, Indians keenly understand the unspoken language the gesture conveys.
Across the globe, distinct cultural groups have developed intricate methods for navigating conversation and expressing emotions. These methods encompass a spectrum of nuances influenced by age, gender, socioeconomic status, personality traits, and even regional variations. As advancements in Artificial Intelligence (AI) accelerate, companies increasingly promote tools that claim to detect user emotions by analysing a combination of linguistic cues (the content of the message), paralinguistic cues (vocal delivery), and nonverbal gestures.
Emotion AI, a subfield of AI, has found applications in various industries. It relies primarily on facial expression recognition, which rests on the underlying assumption that all humans express similar emotions in similar circumstances. The Indian head nod demonstrates how culturally variable nonverbal communication actually is. While AI can achieve some accuracy in recognising these patterns, it cannot fully grasp their subtleties or reciprocate in a truly human-like manner, because it lacks the cognitive and emotional complexity that defines human experience.
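To make the problem concrete, consider the following minimal, purely illustrative Python sketch. Every mapping and function name here is hypothetical, not drawn from any real emotion-recognition system; it only illustrates why a culture-blind gesture-to-emotion lookup loses exactly the nuance the head wobble carries:

```python
# Illustrative sketch only: why a one-size-fits-all gesture lookup fails.
# All mappings below are hypothetical simplifications, not real model output.

GESTURE_MEANINGS = {
    # A system trained on one culture's conventions might hardcode:
    "head_nod": "yes",
    "head_shake": "no",
}

def interpret_indian_head_wobble(speed: str, context: str) -> str:
    """The same wobble can signal different things depending on speed
    and context -- nuance a flat lookup table cannot capture."""
    if speed == "quick" and context == "conversation":
        return "enthusiastic agreement"
    if speed == "slow" and context == "negotiation":
        return "polite acknowledgement, not commitment"
    return "ambiguous -- depends on relationship and setting"

# A culture-blind model collapses every wobble into a single label:
print(GESTURE_MEANINGS.get("head_nod"))                     # "yes"
print(interpret_indian_head_wobble("slow", "negotiation"))  # the nuance lost above
```

The point of the sketch is that context, speed, and intensity are inputs a flat classifier never receives, so the gesture's meaning is decided before the cultural signal is even read.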
Despite the innate human qualities that AI lacks, people tend to desire human-like reciprocity in their interactions with it. Text-based interactions often feel cold and impersonal, lacking warmth and personality. Attempts at voice interaction can result in stilted, emotionless deliveries, further highlighting the gap between expectation and experience. This desire for human-like responses from AI stems, in part, from the pervasive narrative that AI can replicate or even surpass human capabilities. It aligns with the concept of anthropomorphism, the tendency to attribute human qualities to non-human entities.
Developmental psychologist Jean Piaget identified a similar phenomenon in children. In their early stages of development, children naturally ascribe life and sentience to inanimate objects. While this perspective usually fades with cognitive development, the allure of anthropomorphism persists in our interactions with AI. Advances in artificial intelligence often trigger the unconscious assumption that these machines possess human-like emotions and thought processes. This tendency can lead to unrealistic expectations of AI's ability to understand and respond to the complexities of human communication, as exemplified by the nuances and cultural variations of the head nod.
With the growth of AI, the focus often shifts towards developing tools that closely replicate human behaviour. This emphasis on human-like interaction can inadvertently fuel the tendency towards anthropomorphism. A case in point is Hume AI, which is set to launch the world's first "emotionally intelligent voice AI" in April 2024 – the Empathic Voice Interface (EVI). This technology is positioned as a solution to user complaints regarding the robotic nature of AI voices. EVI utilises a WebSocket connection to facilitate real-time, two-way dialogue. Users can speak naturally, with EVI analysing their voice and expressions to generate responses that are supposedly imbued with emotional intelligence. Hume AI positions itself as a developer of AI models that foster empathetic communication and contribute to user well-being.
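For readers curious what such a real-time, two-way exchange looks like in code, here is a generic Python sketch of a WebSocket client loop. The endpoint URL, handshake message, and JSON schema are assumptions for illustration only, not Hume AI's documented EVI API:

```python
# Generic sketch of a real-time voice-AI exchange over WebSocket.
# The URL, auth handshake, and JSON fields are hypothetical placeholders,
# not Hume AI's actual EVI interface.
import asyncio
import json

import websockets  # pip install websockets


async def chat(api_key: str) -> None:
    url = "wss://example-voice-ai.com/v1/chat"  # hypothetical endpoint
    async with websockets.connect(url) as ws:
        # Hypothetical auth handshake sent as the first message.
        await ws.send(json.dumps({"type": "auth", "api_key": api_key}))
        # Send a user utterance; a real client would stream audio chunks.
        await ws.send(json.dumps({"type": "user_message",
                                  "text": "Hello, how are you?"}))
        # Receive the reply, which such systems typically annotate with
        # inferred emotion scores alongside the generated text.
        reply = json.loads(await ws.recv())
        print(reply.get("text"), reply.get("emotions"))


asyncio.run(chat("YOUR_API_KEY"))
```

A persistent WebSocket suits this use case because both sides can send messages at any time, which is what allows the interface to respond mid-conversation rather than waiting on discrete request/response cycles.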
A recent development in Kerala further exemplifies this tendency towards anthropomorphism. A school in Thiruvananthapuram district unveiled India's first AI teacher, Iris. Designed to resemble a human teacher, complete with a saree and jewellery, Iris can move around the classroom and interact with students in three languages. While this technology holds promise for enhancing education, it's crucial to acknowledge the potential pitfalls. Presenting such a human-like figure to children can further blur the lines between humans and AI, potentially leading to unrealistic expectations about AI's capabilities and emotional intelligence.
Research on human/AI relationships suggests that humans tend to mistake AI personalities for independent, individual entities. This is an illusion: the personality an AI model projects is carefully crafted according to the design principles and goals set by its creator companies. This tendency towards anthropomorphisation can lead to various unexpected relationships with AI, ranging from dependence and companionship to even romantic attachments. With rapid developments in AI, models may eventually become adept at identifying culturally embedded patterns of communication, like head nods. However, they will still lack the genuine capacity for communication, a reality often masked by the profit motives of many developing companies. The result can be humans building a "real" relationship even when the other party is not a sentient being but a complex algorithm; people perceive a connection that isn't truly there. As the debate on one-sided emotional investment in AI continues, the world contemplates producing a Klara – the Artificial Friend of Josie in Ishiguro's novel "Klara and the Sun" – who truly loved her friend and was willing to sacrifice her own safety and wellbeing to make Josie better.