As experts consider halting development of artificial intelligence to address concerns about how the technology could negatively impact humans, a new advocacy group is warning that AI could be weaponized to persecute Christians around the world.
“Christians and religious minorities are typically among the most vulnerable communities in many countries around the world, and the exploitation of new AI technologies could make things even worse for them,” David Curry, CEO of Global Christian Relief (GCR), said in a press release. “There is tremendous upside with AI, but also tremendous risk, especially at the speed AI is evolving. We need to slow down and think through how AI is being implemented. Otherwise, the consequences for persecuted Christians and others could be disastrous.”
GCR notes the prevalence of AI technology deployed in China, where the Chinese Communist Party uses more than 500 million street cameras and facial recognition technology to surveil its citizens, pairing that data with its social credit score system.
“The misuse of AI could certainly spell the end of freedom for Christians and religious minorities around the world,” GCR states in a report detailing five ways in which AI could help fuel global persecution:
Surveillance and facial recognition
AI-powered surveillance and facial recognition cameras could be used to monitor individuals and groups attending church, according to GCR.
The group reports that Chinese-made facial recognition software is being used to track and arrest protestors in Myanmar, while in Iran, police will soon begin using smart cameras to identify and punish women who violate laws requiring them to wear a hijab.
Censorship and content filtering
Because they must be coded and trained, GCR says that AI-powered services like ChatGPT can be susceptible to censorship by governments that want to target religious groups.
Malevolent government actors could alter search results and manipulate responses so that a chatbot discourages church attendance, withholds church addresses, or warns an individual that going to church would lower their social credit score.
Deepfakes
As AI-generated video improves, fake videos of pastors or other faith leaders could be produced to make it seem as though they said something “blasphemous or insulting, giving enemies a pretext for harassment, arrests, and violence,” GCR says in the report.
Additionally, “fictional churchgoers could be created to coax ‘fellow’ Christians into revealing personal information that can be used against them or divulge locations of secret underground churches,” GCR warns.
Predictive policing
Police departments have used AI platforms to develop predictive policing algorithms to anticipate where crimes are likely to occur. But, as the report says, those algorithms are “often influenced by arrest rates, which can disproportionately impact minority communities. Police departments then double down on these communities, leading to over-policing.”
Hostile governments could weaponize AI technology to predict where Christians and religious minorities may meet for worship services, allowing police officers, government agents, or terrorists to target them for arrest, attacks, or death.
Lethal autonomous weapons
“Lethal Autonomous Weapon Systems use artificial intelligence to locate and destroy targets on their own while abiding by few regulations,” GCR’s report states. “These weapons are already dangerous enough, but pose an even greater threat when they fall into the wrong hands.”
Christians and religious minorities could face the possibility of improvised explosive devices (IEDs) being remotely delivered by aerial vehicles using AI technology, the group cautions.
GCR is calling on governments to pass legislation to regulate how AI technology is developed and deployed.