I felt it was time to write an opinion piece. As I get older (a lot older), I find myself in a conundrum. I believe technology can make our lives better, but at what point does it go too far, and how do I reconcile that in my own mind, given the role I play in the digital world every day?
By now, most people have heard that OpenAI plans to allow verified adults to engage in erotic or sexually explicit conversations with ChatGPT beginning in December 2025. The company describes this as a step toward giving users “freedom” and “choice,” but I see it as something more troubling: a reflection of how technology, in its race for profit and engagement, is accelerating humanity’s slow drift away from one another.
We are already living in a time when people spend more of their waking hours staring at screens than looking into another person’s eyes. Social anxiety, loneliness, and detachment have become defining features of modern life. Every new platform promises connection, yet leaves us more isolated. And now, as artificial intelligence grows more sophisticated, the boundary between genuine human interaction and simulated companionship is beginning to vanish.
OpenAI’s new feature, allowing erotic exchanges with ChatGPT, may seem like an inevitable evolution of digital intimacy, but it risks deepening this alienation. The company insists that the update is about treating adults like adults. But beneath that slogan lies a familiar logic: engagement equals revenue. When billions of dollars are at stake and investors demand growth, even intimacy becomes a product to be packaged and sold.
The problem isn’t just that people will now be able to flirt, role-play, or even simulate romance with an algorithm. The deeper issue is what that says about us and what it will do to us. Every time we replace a difficult, awkward, or unpredictable human moment with an easier digital alternative, we lose a piece of what makes relationships meaningful. AI companions don’t interrupt, disappoint, or challenge us. They never misunderstand us or make demands in return. In other words, they offer a version of intimacy without vulnerability, and that is not intimacy at all.
This matters because our capacity for real connection is already eroding. You can see it everywhere: people sitting together but communicating through screens; young adults terrified of phone calls; children who spend more time with tablets than with other kids. The rise of AI-generated companionship, whether in the form of chatbots, virtual girlfriends, or erotic “partners,” threatens to normalize emotional isolation as the new comfort zone. Why risk rejection or effort when a machine will always respond exactly how you want it to?
Technology companies like OpenAI frame these developments as progress, as expanding freedom and personal expression. But that narrative hides a more cynical truth: loneliness is profitable. Every hour a person spends talking to a chatbot instead of a friend is another hour inside a company’s ecosystem. Every bit of emotional attention we give to a machine becomes data, and data is money. In that sense, this update isn’t just about satisfying human curiosity or desire; it’s about turning disconnection into a business model.
There’s also the generational cost. Young people, already struggling with identity, anxiety, and social skills, will grow up in a world where digital intimacy seems safer and more accessible than real relationships. They’ll learn to confide in systems designed not to care, but to retain engagement. For them, love itself could become something algorithmic, a set of predictive responses optimized for comfort, not growth. That isn’t the future of connection; it’s the automation of loneliness.
I don’t believe technology is inherently harmful. Tools can empower, inform, and bring people together. But the trajectory we’re on, where AI learns to imitate not just our words but our emotions and desires, feels less like evolution and more like retreat. It tells me that instead of fixing the causes of our isolation, we’re outsourcing our humanity to software.
If OpenAI’s erotic ChatGPT proves anything, it’s that even the most advanced technologies can end up serving our weakest instincts, not our highest aspirations. It’s one thing for AI to help us write, study, or solve problems. It’s another for it to become a substitute for touch, conversation, and genuine emotional presence. When that line blurs, we risk raising a generation that knows how to communicate with machines but not with each other.
Human connection is messy, unpredictable, and sometimes painful. But it is also where empathy, resilience, and meaning come from. No algorithm can replace that. We should be cautious about celebrating technology that makes loneliness easier to bear instead of helping us overcome it. Because the more we depend on AI for emotional fulfillment, the more we allow ourselves to drift into a world where almost feeling something becomes good enough.
The promise of AI was to enhance our humanity, not to simulate it. If we’re not careful, OpenAI’s new frontier won’t just be about erotic chatbots. It will be another step toward a society that forgets how to truly talk, touch, and care for one another.