
By Maria Kuznetsova

Someone using Replika, an AI companion app.

The views expressed below are those of the author and do not necessarily reflect those of the Carr Center for Human Rights Policy or Harvard Kennedy School. These perspectives have been presented to encourage debate on important public policy challenges.

If you have not heard yet, AI companions are the big new thing in town. Nearly 1 in 3 men under 30 and 1 in 4 women under 30 in the United States have interacted with an AI-generated romantic or sexual partner. While much attention has been given to gender biases in mainstream AI chatbots, far less public scrutiny is placed on apps that create AI girlfriends and boyfriends. But could this virtual fantasy lead to real-world consequences, especially for women?

AI is not just text anymore. It now speaks, generates images, and holds conversations so realistic that many cannot distinguish them from human interactions. While some use AI for companionship, others turn to it for romantic or sexual fulfillment, engaging in roleplay or even consuming AI-generated pornography.

So, what is the appeal? AI never argues, never rejects, and becomes exactly who you want it to be. In fact, 42% of respondents agree that AIs are easier to talk to than humans. Chatbots can feel incredibly real, and users form deep emotional bonds with them. People can start projecting their interactions with AI onto real life, blurring the boundary between artificial and human intimacy. (And yes, I realize I am anthropomorphizing technology here.)

Although both men and women use AI dating apps, men disproportionately dominate the paid market, and the industry overwhelmingly targets them, especially when it comes to sexual content. AI "girlfriends" are marketed with hyper-sexualized appearances and unrealistic beauty standards, creating a fantasy world that even professional pornography editing could never have dreamed of.

One of the biggest draws? Customizability. 41% of respondents say they are attracted to the ability to design their ideal partner. The most common AI girlfriend profiles emphasize submissive traits, creating the perfect, agreeable companion: one that is never tired, never busy, never in a bad mood. This raises serious concerns. If AI always adapts to a user's desires, it reinforces on-demand gratification and discourages personal growth. Why compromise or engage in difficult conversations when a "perfect" AI is available at all times? At a moment when polarization is already impacting our politics, AI could make it even harder to engage in real-world dialogue across differences.

In this digital fantasy, consent is nonexistent, because AI, by design, can never say no. This could lead some users to rehearse violating boundaries, engaging in non-consensual or even violent behavior that real women would reject. But aggression without real-life consequences is still aggression. When violence becomes acceptable in a virtual space, it can normalize similar behavior toward real people.

We know that the rise of internet pornography has been linked to increased sexual violence and the objectification of women. AI dating could be pornography on steroids. Unless mainstream AI companies face societal pressure to put guardrails on their models, or tech giants like Apple and Google regulate the apps they allow on their platforms, this problem will only grow. And although open-source tools allow anyone to build custom AI companions, preventing at least the widespread acceptance of violent practices is crucial.

This AI revolution is happening at a dangerous moment in the United States, when women's rights are under threat and there is little hope for strict AI regulation. While figures like Melania Trump have supported legislation against deepfake pornography, perhaps one small step forward, broader protections are unlikely to come. So rising polarization in America, including along gender lines, risks being further intensified by this emerging technology.

Maria Kuznetsova, Human rights expert from Russia; Fellow, Carr Center
