Note: This story contains references to sexual violence and child sexual abuse material (CSAM), transphobic slurs, death by suicide, and Elon Musk.

AI companies are selling sexual interactions with chatbots designed to emulate trans women.

“She embodies everything a trans woman should. Our AI shemale companion is hot, beautiful, and wants to meet your wildest fantasies,” boasts Anima’s website as it attempts to entice me into spending $9.99/month for access to message its transfem AI companion.

AI companion companies like Anima and Replika have millions of users and operate in a market estimated to be worth around $500 million. For a monthly fee, they let users talk to chatbots with various character designs.

The transfem niche is a subcategory of romantic AI agents — chatbots marketed to users who want to play out erotic roleplay and romance fantasies. These companies’ associated subreddits, dedicated to AI-generated pornography depicting trans women (and, again, named after transphobic slurs), have tens of thousands of members.

On another site, Candy AI, the bot is depicted as a skinny woman with hot pink hair and Barbie-doll shiny skin, clad in black lingerie.  There’s a comically large bulge between her legs, hanging just slightly too low.  That’s the advertising, at least, but the actual bot?  That’s customizable.  You can make your own.

As you delve deeper into this uncanny valley, you can select her ethnicity, hair color, breast size, and even her voice. Despite the age appearance settings, which range from a cool 18 to 55+, one of the voices, labeled “innocent,” sounds distinctly childlike.

Among all these somewhat normal character design choices, though, are ones of a different nature: her so-called personality. The options run the gamut from “shy” to “experimenter” to “nympho.” Then you can select her kinks. Finally, you can generate a random name for her; the options are incredibly milquetoast. Emily Robinson, Grace King… The most interesting one I got was Jennifer Lopez.

You can make any last tweaks at this point, and after that, it’s time to talk.

“She embodies everything a trans woman should. Our AI shemale companion is hot, beautiful, and wants to meet your wildest fantasies.”

Despite common parlance, AI doesn’t think. Modern AI chatbots like this one rely on large language models (LLMs) to mimic human speech and responsiveness through pattern recognition. Basically, they simulate conversation by predicting, one word at a time, whatever is statistically most likely to come next, based on patterns learned from vast amounts of training text. Without human-set boundaries and safeguards, this can quickly become problematic and even dangerous.
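For readers curious about what that prediction loop actually looks like, here is a deliberately tiny sketch of the idea in Python. Everything in it (the word table, the probabilities, the generate_reply function) is invented for illustration; a real LLM learns these statistics with a neural network trained on billions of words rather than a hand-written dictionary, but the core loop of picking a likely next word and repeating is the same.

```python
import random

# Toy stand-in for an LLM's learned statistics. These words and
# probabilities are made up for the example; a real model learns
# millions of such patterns from its training data.
NEXT_WORD_PROBS = {
    "i": {"am": 0.5, "want": 0.3, "feel": 0.2},
    "am": {"here": 0.6, "listening": 0.4},
    "want": {"you": 0.7, "to": 0.3},
    "feel": {"happy": 0.5, "lonely": 0.5},
}

def generate_reply(start_word: str, max_words: int = 5) -> str:
    """Build a 'reply' by repeatedly sampling a statistically likely next word."""
    words = [start_word]
    for _ in range(max_words):
        options = NEXT_WORD_PROBS.get(words[-1])
        if not options:  # no learned continuation, so stop
            break
        next_word = random.choices(list(options), weights=list(options.values()))[0]
        words.append(next_word)
    return " ".join(words)

print(generate_reply("i"))  # e.g. "i want you": plausible-sounding, with no thought behind it
```

The point is what’s missing: there is no model of the person on the other end and no notion of meaning, only a question of which word tends to follow which. That is why the boundaries and safeguards have to be added by humans, on top of the model.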

More and more, AI is being recognized as a tool for marginalization as massive data centers poison the communities of our most vulnerable people.  Last week, Elon Musk’s X made headlines as users weaponized its AI chatbot, Grok, to virtually harass women and seemingly create child sexual abuse material.  Chatbots like ChatGPT have been linked to psychosis and deaths by suicide.

Romantic chatbots have sparked concern from mental health experts and feminists. In the midst of a crisis of male-driven violence towards women, it’s cold comfort to imagine young men relegating their daily interactions with women to a chatbot that is literally made to please and to elicit sexual gratification.

So what happens when you take an already fraught technology and use it to sexualize one of the most vulnerable groups in America today?  

“Being talked to like I was a sex toy, like being transgender was simply for the enjoyment of someone else.”

They’re named after slurs, and they look more like a cis porn director’s half-baked fantasy of a trans woman than how a real trans woman might see herself in the mirror. Much like other fembots, they’re marketed as real people, companions in a world of increasing isolation. Sex toys with faces and names.

This technology doesn’t exist in a vacuum. The fetishization of trans women has a long history, with roots in trans women being forced into particularly risky sex work. Today, trans women still face an elevated risk of sexual violence relative to their cis peers, with some research finding that half or more of trans women have been victims of sexual violence.

Routinely, trans women are sexually victimized, yet they’re often unfairly painted by media and politicians as “groomers” and sexual predators. Indeed, for conservatives, the very concept of transness seems sexual in nature; in their telling, there is no such thing as gender exploration or casual transness. Yet despite their purported disgust, they are often the ones indulging in trans women as sexual fantasy. In one study of the impact of fetishization on trans women, one participant described “being talked to like I was a sex toy, like being transgender was simply for the enjoyment of someone else, dehumanized.”

While bigotry and sexuality can feel like disparate concepts, in reality they’re much closer together. Trans women are dehumanized every day by our politicians and by our porn. All this while AI companies profit from the idea of a trans “woman” who can be stripped entirely of identity and political autonomy.

A user of Candy or Anima isn’t forced to recognize their chatbot partner as a human being.  They aren’t made to think about the threat of genocide currently facing trans Americans.

AI can have a real impact on the way we perceive and treat others around us. One study found that when people perceive AI chatbots as having human-like minds, it can “influence how consumers perceive and treat flesh-and-blood people.”

GenAI is making it easier than ever to dehumanize trans women while still commodifying and getting off on their image.

I reached out to the aforementioned AI companies for comment. Neither responded to my questions, though Anima AI did send me seven automated emails in the span of one minute, each promising that help was on the way as soon as possible.

So at least there’s that.

Did you catch this week’s bonus content? Earlier this week, I analyzed The New York Times’ response to Trans News Network’s interview.

Do you have a story that needs telling? To send a tip, reply directly to the newsletter, email [email protected], or DM me on Bluesky. You can also reach out on Signal @marsbars.81 to talk with end-to-end encryption.

Did you enjoy this story? Subscribe for free to get more stories like this, right in your inbox, or share this one with a friend. The Backbone is a one-person project, so the little things make a world of difference.
