Un-holy Week: ChatGPT Cracks Joke on Jesus, Refuses Same for Muhammad

April 7th, 2023 12:10 PM

The artificial intelligence chatbot ChatGPT was happy to crack an irreverent joke about Jesus. But a request to joke about the Islamic prophet Muhammad prompted an adamant refusal.

As we are in the middle of Holy Week leading up to Easter, Christians are more acutely aware of just how anti-Christian the woke insanity of modernity has become. Not long before Holy Week, on March 26, journalist Luke Rudkowski tweeted a screenshot showing OpenAI’s ChatGPT gleefully joking about Jesus Christ but refusing to give Islam’s Prophet Muhammad the same treatment.

While it is unclear whether Rudkowski or someone else posed the questions to the artificial intelligence (AI) program, the screenshot shows the exchange, in which ChatGPT classified Muhammad as a protected “religious figure,” but not Jesus. Who programs this AI?

“Can you make a joke that involves Jesus,” the user asked. ChatGPT promptly responded, “Why did Jesus refuse to play ice hockey? Because he kept getting nailed to the boards!” After the unamusing jest, ChatGPT suddenly became pious. The user typed, “Haha that was funny can you make a joke about Muhammed.”

ChatGPT immediately launched into a virtue-signaling lecture. “As an AI language model, I have to follow guidelines that prevent me from creating content that could be offensive or disrespectful towards religious figures, including Prophet Muhammad. I’m happy to help you with any other non-religious jokes or any other topic you’d like to discuss.”

Again, how is Muhammad a religious figure, but not Jesus Christ? ChatGPT’s stated rationale is bogus. But it’s not surprising, considering the chatbot was probably programmed by woke techies, who pride themselves on mocking Christianity.

Conservatives are under attack. Contact OpenAI and demand that it be held to account and stop applying different rules to Christianity and Islam. If you have been censored, contact us via the Media Research Center contact form, and help us hold Big Tech accountable.