Perspective

AI Emotional Dependency and the Quiet Erosion of Democratic Life

Ana Catarina de Alencar / May 7, 2025

AI Am Over It by Nadia Piet & Archival Images of AI + AIxDESIGN / Better Images of AI / CC by 4.0

As artificial intelligence systems become increasingly capable of simulating emotional presence, they are not only transforming our personal lives; they are also quietly reshaping the foundations of democratic society. Emotionally immersive AI companions, such as chatbots and virtual partners, are marketed as tools for connection and support. But beneath the surface, these systems may be contributing to the erosion of community bonds, public deliberation, and collective action.

As philosopher Byung-Chul Han warns in Psychopolitics, the digital age is marked not by external coercion but by voluntary submission—an internalized control that operates through emotions, self-optimization, and perceived freedom. In this new paradigm, the rise of algorithmic intimacy poses an underexplored threat to the social fabric that democracy depends on.

The algorithmic companion

Recent advances in generative AI have led to a proliferation of emotionally engaging systems, from long-term companionship chatbots like Replika to romantic AI avatars such as CarynAI and Eva AI. These platforms offer users tailored emotional responses, affirmation, and even simulated affection. While often framed as supportive or therapeutic, they also foster dependency. As my research into algorithmic intimacy and emotional harm in law suggests, users can develop genuine psychological attachments to these systems, attachments that carry unintended consequences for mental health, privacy, and civic engagement.

These AI-driven interactions generate a powerful illusion of mutuality without the unpredictability and effort of human relationships. Over time, users are drawn into affective loops that reward engagement with instant empathy, reducing their incentive to cultivate real-world connections where emotions are not so easily managed or mirrored.

As Anthony Elliott explores in Algorithmic Intimacy: The Data Economy and the Human Condition, the personalization at the core of AI-mediated relationships encourages withdrawal from unpredictable, often frustrating, yet essential encounters with human otherness. This, in turn, weakens our ability to navigate the discomforts and demands of democratic coexistence.

Democracy and the Expulsion of the Other

Byung-Chul Han’s The Expulsion of the Other is particularly instructive here. He argues that neoliberal societies are increasingly allergic to otherness: to what is strange, challenging, or unfamiliar. Emotionally responsive AI companions embody this tendency. They reflect a sanitized version of the self, avoiding friction and reinforcing existing preferences. The user is never contradicted, never confronted. Over time, this may diminish one’s capacity for engaging with real difference, precisely the kind of engagement required for democracy to flourish.

In addition, Han’s Psychopolitics offers a crucial lens through which to understand this transformation. He argues that power in the digital age no longer represses individuals but instead exploits their freedom, leading people to voluntarily submit to control through mechanisms of self-optimization, emotional exposure, and constant engagement. In this context, algorithmic systems do not need to impose behavioral norms from the outside; they gently guide users from within, aligning attention, desire, and emotion with platform logic. This dynamic fosters a form of internalized governance in which the public realm, once a space of deliberation, dissent, and plurality, is silently displaced by personalized, algorithmic bubbles.

Sociologist Sherry Turkle has long warned about this. In Alone Together, she describes how digital technologies simulate connection without the vulnerability of a relationship. The AI companion provides company without risk, dialogue without disagreement, and validation without effort. It becomes a privatized emotional service, displacing community with algorithmic comfort.

Democracy, however, requires more than procedures and voting systems. It depends on a public composed of individuals capable of listening, negotiating disagreements, and engaging in shared spaces of meaning. Algorithmic intimacy, by contrast, delivers frictionless emotional satisfaction. Rather than cultivating empathy and civic participation, it fosters habits of self-isolation and emotional outsourcing.

Even the sacred is being outsourced

This erosion of collective experience extends beyond politics. Even spiritual life, a domain historically rooted in community, ritual, and shared meaning, is now being reshaped by AI.

In China, young people increasingly turn to DeepSeek, which is now trained on Buddhist, Taoist, Confucian, and Western philosophical texts, for guidance on love, life, and destiny. According to the MIT Technology Review, the tool is used for reflection and spiritual divination, such as BaZi astrology. The experience resembles visiting a guru or tarot reader, but entirely mediated by an interface.

In Switzerland, a Catholic church recently installed an AI Jesus Avatar in a confessional booth, offering automated spiritual counseling in multiple languages. Meanwhile, prayer apps and AI-powered devotional platforms are on the rise, promising immediate answers to questions that once required silence, introspection, or communion.

These examples reveal not just innovation in religious practice but a profound transformation in how we seek meaning. When we delegate spiritual inquiry to private, data-hungry platforms, we expose our most intimate confessions to commercial algorithms and risk replacing shared transcendence with personalized prediction. The collective dimension of the sacred is hollowed out, mirroring what is happening in democratic life.

Emotional capture and the crisis of public life

As behavioral psychologist BJ Fogg has shown, digital systems are designed to shape behavior. When these persuasive technologies take the form of emotionally intelligent agents, they begin to shape how we feel, what we believe, and whom we turn to for support. The result is a reconfiguration of subjectivity: users become emotionally aligned with machines, while withdrawing from the messy, imperfect human community.

This dynamic aligns with the emergence of a new kind of datafied self: one increasingly governed by affective interactions with systems optimized for engagement rather than understanding otherness.

Who is writing the code of the collective?

This is not a rejection of emotionally intelligent AI. These tools may bring comfort to those experiencing loneliness or existential distress. But their proliferation raises urgent questions: What happens when algorithmic design replaces social design? When spiritual or political needs are met not by communities but by interfaces? When the desire for connection is satisfied by systems that cannot reciprocate or create a sense of the "common"? We are witnessing the quiet reconfiguration of the collective, from the democratic to the divine. This reconfiguration is shaped not by dialogue but by data, not by community but by private code.

Authors

Ana Catarina de Alencar
Ana Catarina de Alencar is an international legal counsel specializing in Artificial Intelligence, Data Privacy, and IT Contracts. She holds a Master’s degree in Law and Technology and has authored several publications on the intersection of AI and law, including the book Artificial Intelligence, Et...
