The Italian Data Protection Authority (IDPA) confirmed the ban on ChatGPT late last month, but clarified that the prohibition is temporary. Once OpenAI complies with the European Union's General Data Protection Regulation (GDPR), the ban will be lifted. The GDPR is a privacy law in the EU that protects individuals' fundamental rights to data protection.
IDPA said on March 20 that ChatGPT had experienced a data breach involving user conversations and payment information. Following the breach, an investigation into OpenAI was launched "with immediate effect."
According to the regulator, there was no legal basis for OpenAI to justify "the mass collection and storage of personal data for the purpose of 'training' the algorithms underlying the operation of the platform."
It also accused ChatGPT of failing to check the age of its users. Only people above the age of 13 are supposed to be allowed to access the AI platform, but it has no way to verify how old its users actually are. Given this, the IDPA pointed out that ChatGPT "exposes minors to absolutely unsuitable answers compared to their degree of development and awareness."
The regulator gave OpenAI 20 days to respond with how it would address the concerns regarding ChatGPT. Failure to respond would subject the San Francisco-based OpenAI to a fine of up to €20 million ($21.68 million) or four percent of its annual revenues.
Italy's move to ban ChatGPT makes it the first Western country to do so, according to the BBC. Other countries that have blocked the AI platform include China, Iran, North Korea and Russia. (Related: CCP blocks ChatGPT: Party officials fear chatbot will spread American propaganda online.)
OpenAI did not respond to requests for comment sent by the BBC and the Semafor news website.
IDPA is not the only data protection authority that has moved against ChatGPT. The Irish Data Protection Commission told the BBC that it is following up with its Italian counterpart to understand the basis for its action. The regulator added that it will "coordinate with all EU data protection authorities" in connection with the ban.
The Information Commissioner's Office (ICO) said that while it supports developments in AI, it is also ready to "challenge non-compliance" with data protection laws. The ICO handles data privacy regulations in the United Kingdom.
Dan Morgan from cybersecurity rating provider SecurityScorecard said the ban on ChatGPT shows the importance of regulatory compliance for companies operating in Europe.
"Compliance with regulations is not an optional extra," said Morgan, who is the senior director for European government affairs at the company. "Businesses must prioritize the protection of personal data and comply with the stringent data protection regulations set by the European Union."
Consumer advocacy group BEUC also called on European authorities to investigate ChatGPT and similar chatbots. The group's deputy director-general Ursula Pachl warned that society was "currently not protected enough" from the harm that AI can cause.
"There are serious concerns growing about how ChatGPT and similar chatbots might deceive and manipulate people," she warned. "These AI systems need greater public scrutiny, and public authorities must reassert control over them."
Since its launch in November 2022, millions of people have used ChatGPT for various purposes. It can answer questions in human-like language and mimic other writing styles, drawing on internet data gathered up to 2021 as its knowledge base. Technology giant Microsoft has invested in the AI chatbot, incorporating it into its Bing search engine and announcing plans to include a version of it in Microsoft Office.
However, concerns have arisen about ChatGPT – including its threat to jobs and its potential to spread misinformation and bias. True enough, Europol – the EU's law enforcement agency – has expressed concern that the chatbot could spread disinformation when the data it processes is inaccurate.