Will data privacy concerns kill ChatGPT?

A data privacy expert weighs in on the concerns raised by ChatGPT’s extensive data collection, and explains what safeguards should be implemented on the platform.

Adv. Gafnit Lagziel Shababo, partner and head of the Data Protection and Privacy practice at Firon law firm. (photo credit: STUDIOTHOMAS)

Concerns continue to mount about the AI-powered language model ChatGPT and its impact on data privacy. While many of those concerns are valid, some experts argue that the risks can be mitigated and the technology used responsibly.

ChatGPT is a powerful tool that uses natural language processing to generate human-like responses. It has been used in a variety of settings, including customer service and content creation, and has become increasingly popular in recent years. However, the technology's ability to generate personalized responses has raised concerns about the potential for misuse, particularly in relation to data privacy.

In one notable case, ChatGPT was temporarily banned in Italy over privacy concerns, with the country’s data protection authority saying it would reconsider the ruling only if the platform’s developer, OpenAI, implemented a list of fixes to address the issues it had identified.

A keyboard is seen reflected on a computer screen displaying the website of ChatGPT, an AI chatbot from OpenAI, in this illustration picture taken Feb. 8, 2023. (credit: REUTERS/FLORENCE LO/ILLUSTRATION/FILE PHOTO)

Should we fear ChatGPT?

Despite these concerns, some experts have suggested that there is nothing to fear from ChatGPT so long as the proper safeguards are put into place.

“What happened in Italy illustrates that there is no need for hysteria. They told OpenAI that if they fix a few things about the privacy policy and the terms of use that they could continue to use this technology in Italy — this is proof that the law that exists today is good enough to handle the privacy concerns,” said Gafnit Lagziel Shababo, partner and head of the Data Protection and Privacy practice at Firon law firm.

In an interview with The Jerusalem Post, she elaborated on the various safeguards that could be implemented in order to ensure that ChatGPT usage is as safe and private as possible.

One of the most important safeguards is ensuring that users are aware of how their data is being collected and used. This means providing clear and transparent information about the types of data that ChatGPT collects, how it is used and who has access to it. “In the privacy policy, you need to tell them what you're doing with the information — first whether you are collecting his information at all, and for what purpose,” Shababo said.

Of similar importance, she continued, is the implementation of privacy at the foundational level of development. So-called “privacy by design” is a concept that emphasizes the importance of considering privacy issues from the very beginning of the design process for any new product, service or system that deals with personal data. The idea is to build privacy protections into the design of a product, rather than simply adding them as an afterthought.

While there is no denying that ChatGPT poses some risks to data privacy, experts argue that these risks can be mitigated through proper safeguards and responsible practices. With the right approach, ChatGPT can be a powerful tool for businesses and individuals alike, offering a range of benefits while respecting the privacy rights of users. And it would behoove us to make it as safe as possible, Shababo said, because consumers are likely to use the platform whether it is secure or not.


“People will use it anyway,” she said. “It really makes our lives easier, like every prominent technology that has come until now. I believe it's good for us, but I do think that users should be more aware about privacy. They don’t do that enough. As long as they are reading and understanding what the companies are doing with the information, it's okay, and it will be good for everybody.”