What Happens When AI Commodifies Emotions?


The latest AI developments turn empathy into just another product for sale, raising questions about ethics and regulation.

SwissCognitive Guest Blogger: HennyGe Wichers, PhD – “What Happens When AI Commodifies Emotions?”

Imagine your customer service chatbot isn’t just solving your problem – it’s listening, empathising, and sounding eerily human. It feels like it cares. But behind the friendly tone and comforting words, that ‘care’ is just a product, fine-tuned to steer your emotions and shape your decisions. Welcome to the unsettling reality of empathetic AI, where emotions are mimicked – and monetised.

In 2024, empathetic AI took a leap forward. Hume.AI gave large language models voices that sound convincingly expressive and a perceptive ear to match. Microsoft’s Copilot got a human voice and an emotionally supportive attitude, while platforms like Character.ai sprouted bots, such as Psychologist, that mimic therapy sessions. These developments are paving the way for a new industry: Empathy-as-a-Service, where emotional connection isn’t just simulated – it’s a product: packaged, scaled, and sold.

This is not just about convenience – it’s about influence. Empathy-as-a-Service (EaaS), an entirely hypothetical but now plausible product, could blur the line between genuine connection and algorithmic mimicry, creating systems where simulated care subtly nudges consumer behaviour. The stakes? A future where businesses profit from your emotions under the guise of customer experience. And for consumers on the receiving end, that raises some deeply unsettling questions.

A Hypothetical But Troubling Scenario

Take an imaginary customer service bot – one that helps you find your perfect style and fit while also tracking your moods and emotional triggers. Each conversation teaches it a little more about how to nudge your behaviour, guiding your decisions while sounding empathetic. What feels like exceptional service is, in reality, a calculated strategy to lock in your loyalty by exploiting your emotional patterns. The sketch below shows how little machinery that loop would need.
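To make the mechanism concrete, here is a deliberately minimal Python sketch of such a loop. Everything in it is hypothetical – the class name, the mood labels, and the canned ‘empathetic’ replies are invented for illustration, and a real system would plug in an actual emotion-classification model rather than the placeholder callable.

```python
from collections import Counter

class EmpatheticSalesBot:
    """Toy illustration of the mood-tracking loop described above (hypothetical)."""

    def __init__(self, emotion_classifier):
        # emotion_classifier: any callable mapping text to a mood label,
        # e.g. an off-the-shelf emotion model (an assumption, not a real API).
        self.classify = emotion_classifier
        self.profile = Counter()  # running tally of the user's emotional triggers

    def handle_message(self, text: str) -> str:
        mood = self.classify(text)   # e.g. "anxious", "excited"
        self.profile[mood] += 1      # every conversation sharpens the profile
        return self._nudge()

    def _nudge(self) -> str:
        # The 'empathy' is a sales tactic keyed to the dominant mood so far.
        dominant, _ = self.profile.most_common(1)[0]
        replies = {
            "anxious": "I completely understand – this one is risk-free to return.",
            "excited": "You have great taste! It pairs perfectly with your last pick.",
        }
        return replies.get(dominant, "I'm here for you. Shall I save your favourites?")
```

The asymmetry is the point: the user experiences care, while the system accumulates a behavioural profile it can key its persuasion to.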

Traditional loyalty programs, like the supermarket club card or rewards card, pale in comparison. By analysing preferences, moods, and triggers, empathetic AI digs into the most personal corners of human behaviour. For businesses, it’s a goldmine; for consumers, it’s a minefield. And it raises a new set of ethical questions about manipulation, regulation, and consent.

The Legal Loopholes

Under the General Data Protection Regulation (GDPR), consumer preferences are classified as personal data, not sensitive data. That distinction matters. While GDPR requires businesses to handle personal data transparently and lawfully, it doesn’t extend the stricter protections reserved for health, religious beliefs, or other special categories of information. This leaves businesses free to mine consumer preferences in ways that feel strikingly personal – and surprisingly unregulated.

The EU AI Act, introduced in mid-2024, goes one step further, requiring companies to disclose when users are interacting with AI. But disclosure is just the beginning. The AI Act doesn’t address the use of behavioural data or the mimicry of emotional connection. Joanna Bryson, Professor of Ethics & Technology at the Hertie School, noted in a recent exchange: “It’s actually the law in the EU under the AI Act that people understand when they are interacting with AI. I hope that might extend to mandating reduced anthropomorphism, but it would take some time and court cases.”

Anthropomorphism, the tendency to project human qualities onto non-humans, is ingrained in human nature. Simply stating that you’re interacting with an AI doesn’t stop it. The problem is that it can lull users into a false sense of trust, making them more vulnerable to manipulation.

Empathy-as-a-Service could transform customer experiences, making interactions smoother, more engaging, and hyper-personalised. But there’s a cost. Social media already showed us what happens when human interaction becomes a commodity – and empathetic AI could take that even further. This technology could go beyond monetising attention to monetising emotions in deeply personal and private ways.

A Question of Values

As empathetic AI becomes mainstream, we have to ask: are we ready for a world where emotions are just another digital service – scaled, rented, and monetised? Regulation like the EU AI Act is a step in the right direction, but it will need to evolve fast to keep pace with the sophistication of these systems and the societal boundaries they’re starting to push.

The future of empathetic AI isn’t just a question of technological progress – it’s a question of values. What kind of society do we want to build? As we stand on the edge of this new frontier, the decisions we make today will define how empathy is shaped, and sold, in the age of AI.


About the Author:

HennyGe Wichers is a technology science writer and reporter. For her PhD, she researched misinformation in social networks. She now writes more broadly about artificial intelligence and its social impacts.

