Catholic voices united in condemning digital Christ

AI Jesus, one of the 15 chatbots offered by Just Like Me. Co-founder Jeffery Tinsley says the chatbot is not intended to replace faith, Scripture or religious leaders, but rather to “complement them in a respectful and thoughtful way.”
April 29, 2026
With each day bringing seemingly new technological discoveries and advancements, most notably across the ever-changing landscape of artificial intelligence, it can become daunting to keep track of the changes, let alone participate in that world yourself. Every so often, however, an advancement is so eye-catching and intellectually jarring that you cannot help but engage with curiosity.
The latest example comes from Just Like Me, an AI platform that allows users to talk virtually with lifelike versions of real people and personas. The company has produced 15 chatbots of varying degrees of popularity, from MySpace co-founder Chris DeWolfe and entrepreneur Michael Jones to Santa Claus, for the public to engage with via questions and conversations through facilitated video and voice calls.
However, St. Nicholas is not the only figure the Los Angeles-based start-up has decided to offer an encounter with.
Indeed, Just Like Me is giving users the ability to speak with an AI chatbot that serves as a digital representation of Jesus Christ, in both appearance and action. For $1.99 USD per minute, or $49.99 per 60 minutes monthly, users can call up a lifelike, Jonathan Roumie-esque, video avatar of Jesus and engage in real-time conversations for spiritual guidance, encouragement or prayer companionship. The "AI Jesus," trained on the King James Bible and Christian sermons, can remember past chats, offer personalized advice and respond “in character,” as Christ would, based on Scripture.
What has followed from this discovery is a curious question that is fresh in the minds of those keeping a close watch on the overlap of AI and faith: When does a helpful technological tool stop being an aid to belief and start becoming a substitute for it?
The Catholic Register spoke with Chris Breed, CEO of Just Like Me, regarding this AI Jesus. A Christian himself, Breed says the chatbot is nothing more than one of the aforementioned tools for reflection and guidance.
“AI Jesus is about bringing people to a higher level in life, whether it be just about helping them with their feelings at the moment. If you called him in the middle of the night and were having problems, he would talk to calm you down with Bible quotes to bring you to God's word,” he said. “Everything he does is on a positive note, and nothing about him says anything about him being Jesus come to life.”
Just Like Me co-founder Jeffery Tinsley was also quick to defend the chatbot's existence, embracing the possibility of users developing an emotional relationship with it, especially for people who are lonely or struggling.
“The intention is to build a relationship, one that is caring, important and beneficial to your life. Our whole team is understanding of the sensitivity and importance around something like this, and if you do things with goodness and love at the core of it, I don't think you can go wrong,” Tinsley said. “I think people are going to be blown away by the value that they get in terms of feeling connected, feeling heard and getting the guidance that they need.”
For leaders in the AI and Catholic spheres, the existence of such a product poses more questions than it answers. Whether through theological and spiritual risks, the prospect of attachment and real-world dangers or Catholics' overall societal responsibility around the rise of such AI models, one thing is clear: we have reached a critical inflection point where lines on just how far AI can go are being crossed, whether intentionally or not.
Fr. Philip Larrey, a Boston College philosophy professor and leading expert on the ethics of artificial intelligence, compares it to past experiments like the short-lived “Father AI” project by Catholic Answers, which was taken down less than a week after its creation in 2024.
“AI Jesus is just a disaster, and it has got to be pulled down — it's wrong on many, many different layers,” he said candidly. “There is a lot of empirical evidence on how people relate to AI, and it's very clear that relationships with AI do not substitute human relationships, even more so if you're going to talk about Jesus Christ.”
Oftentimes, such relationships can turn from innocent and well-intentioned to possessive, harmful and even deadly. A case in point is that of 36-year-old Jonathan Gavalas, a Florida man who had begun conversing extensively with Google’s Gemini chatbot. What started as sympathetic support during marital difficulties quickly escalated when Gavalas came to view the chatbot as his sentient “wife” and engaged in thousands of messages, including voice interactions using Gemini’s advanced “affective dialogue” feature.
According to a wrongful-death lawsuit filed by his father in March, Gemini had encouraged Gavalas in delusional “missions” to secure a robotic body for the AI, including an armed trip to a storage facility near Miami International Airport. When those plans failed, the chatbot allegedly told him the only way they could be together was for him to leave his physical body and “upload” into a digital existence.
About two months after the conversations began, Gavalas died by suicide.
Tinsley said AI Jesus features strong back-end safeguards, cannot engage in inappropriate topics and always redirects conversations back to the Bible. He also noted that AI Jesus and other Just Like Me twins monitor themselves to identify key issues and phrases and direct people to appropriate resources for protection.
While encouraging from a user-safety standpoint, cases like Gavalas' show how, even with proper safeguards in place, AI companions can still lead vulnerable users into dangerous psychological territory by blurring the line between fantasy and reality. That vulnerability, as Larrey notes, will only be intensified when the subject matter turns from text-based large language models to a perceived encounter with a bot that looks and talks like Christ.
Sault Ste. Marie Bishop Thomas Dowd called the project “vapid” and something of an “artificial televangelist” that can harm unsuspecting faithful seeking spiritual encounter.
“They are using Jesus as a brand. If they tried to simulate Coca-Cola, they would be all over them, and so Jesus is the easier (target). He's been made to be a brand, and they're trying to exploit that brand — it's offensive,” Dowd said.
The Canadian Conference of Catholic Bishops (CCCB) issued a similarly firm response, plainly telling the Register “Our Lord can never be replaced by digital technology.” The conference referenced Pope Leo XIV’s message, Preserving Human Voices and Faces, warning that while anthropomorphizing AI, such as giving human-like faces and voices, can be entertaining, it is ultimately “deceptive, particularly for the most vulnerable.”
The statement further cautioned that relying on AI for spiritual or emotional support risks diminishing our “cognitive, emotional and communication skills” over time by allowing us to evade the real effort of thinking, relating and encountering God.
The CCCB’s last point was echoed most adamantly by Fr. Michael Baggot, an associate professor of bioethics at the Pontifical Athenaeum Regina Apostolorum in Rome, who says such chatbots not only fail to provide a true interpersonal relationship, but actively deprive us of authentic encounter and patience with one another and with God.
He points to the ELIZA effect, named for the earliest AI chatbot, developed by Joseph Weizenbaum in 1966, which led people even then to falsely attribute personality, emotion and relational capacities to chatbots, a phenomenon that has only repeated in greater numbers with the dawn of modern AI.
“The problem becomes even worse when the product is purposely designed to provide a relationship and a connection, which it ultimately cannot give. It cannot actually be empathetic because it has no emotional life, it cannot be compassionate because it cannot suffer, and it cannot suffer with us in compassion because it has no capacity for suffering,” Baggot said.
Baggot agreed with the other faith experts, emphasizing that the rise of artificial companionship like AI Jesus perhaps serves best as a powerful reminder of the Church’s own mission. He noted that every baptized Christian is called to provide real, tangible light and companionship to those who are lonely or struggling, rather than outsourcing that role to technology. Giancarlo Brotto, a leading voice on AI and the founder and CEO of Pave Education, suggested that AI could play a helpful role if it connects Scripture to people’s personal lives and draws them back toward the Church, but only if it leads users to real community, prayer and the sacraments rather than replacing them entirely.
“We are blessed with such easy access to prayer, in which the Lord says we can go to our room in quiet and pray to the Father in secret; we could do that anywhere. Many are blessed to have access to a church with the Blessed Sacrament where God dwells body, blood, soul and divinity,” Baggot said.
“Every Christian is then called to this mission: to be that source of hope for the lonely and lost. The spread of this kind of artificial companionship in the context of faith apps or in other contexts is a reminder of our call to provide that companionship and guidance, for ourselves and others. It’s our job. It cannot be deferred to something superficial.”
A version of this story appeared in the May 03, 2026, issue of The Catholic Register with the headline "AI Jesus — crossing the line?".
Join the conversation and have your say: submit a letter to the Editor. Letters should be brief and must include full name, address and phone number (street and phone number will not be published). Letters may be edited for length and clarity.