India's religious chatbots condone violence using the voice of god
CBC
As Vandana Poddar performs the Hindu puja ceremony daily at her home shrine in Mumbai, she's guided on her spiritual journey by the Bhagavad Gita, a 700-verse scripture.
She even attends a weekly class dissecting the deeper meaning of the ancient religious text, with her teacher providing examples to illustrate a particular passage.
"Interpretation is the backbone of this text," Poddar, 52, told CBC News. "Superficial knowledge can be misleading."
But many in India are forgoing that in-person contact with a guru interpreting the Bhagavad Gita, turning instead to online chatbots that imitate the voice of the Hindu god Krishna and answer probing questions about the meaning of life based on the scripture's teachings.
It's new technology with a tendency to veer off script and condone violence, according to experts, who warn that artificial intelligence chatbots playing god can be a dangerous mix.
Several of the bots consistently provide the answer that it's OK to kill someone if it's your dharma, or duty.
In the Bhagavad Gita, written more than 2,000 years ago, the prince Arjuna is hesitant to go into battle, where he will have to kill his family and friends, until the Hindu god Krishna reminds him that as a warrior from the Kshatriya caste, it is his duty to fight.
"It's miscommunication, misinformation based on religious text," said Lubna Yusuf, a Mumbai-based lawyer and a co-author of The AI Book. "A text gives a lot of philosophical value to what they are trying to say and what does a bot do? It gives you a literal answer and that's the danger here."
At least five Gita chatbots appeared online in early 2023, powered by the language model Generative Pre-trained Transformer 3 (GPT-3). They're using artificial intelligence, which simulates a conversation and creates answers based on statistical probability models. The sites say they have millions of users.
The main page of one of them, Gita GPT, asks, in an imitation of the voice of the Hindu god Krishna, "What troubles you, my child?" to users typing in a question.
Another chatbot, Bhagavad Gita AI, introduces itself as "a repository of knowledge and wisdom" before telling the online user: "Ask me anything."
The smaller print on the same page states that "the answer may not be factually correct" and exhorts the user to do their own research "before taking any action."
Yusuf said the potential danger of answers that condone violence is more acute in a country like India, where religion is so emotionally charged.
"You're creating confusion in the chaos," Yusuf said, adding that some could use the chatbots' answers to further their own political interests and cause irreversible damage. "It might incite more violence, it might create religious bias."