The problems, solutions and tax aspects of AI smart assistants

Perhaps the AI underpinning these machines should be trained to recognize the data it is recording and to avoid recording unfamiliar voices.

Mike George, VP Alexa, Echo and Appstore for Amazon, speaks during the LG press conference at CES in Las Vegas (photo credit: REUTERS)
Tech giants Apple, Amazon and Google have changed the world with their creation of “smart assistants,” summoned by phrases such as “Hey Siri,” “Alexa” and “OK Google.” But we have also read horror stories of Alexa activating and recording conversations without being summoned. We don’t like the idea of robots listening to our private conversations, but what if it turned out that human analysts were doing just that?
Problems
A few months ago, it emerged that these companies, as well as Facebook, have had humans listening to (and leaking) snippets of our voice recordings and messages.
The purpose, they have said, is to improve their artificial-intelligence voice-recognition systems and enhance the customer experience.
Equally frightening is that the horror story referenced above turned out to be true: some of the recordings analyzed by humans had been made by smart assistants on occasions when the customer had not summoned the assistant using its “wake-up” phrase.
Under the General Data Protection Regulation (EU) 2016/679 (GDPR), a person’s voice is considered personal data because individuals can be identified by the sound of their voice.
For example, Scarlett Johansson’s voice was easily recognizable in the film Her. What’s more, random recordings mean that at some point these machines may be recording customers discussing other sensitive and personal information, such as their bank accounts or their health.
The tech giants hurriedly published statements confirming that these recordings were anonymized before being handed over to analysts.
While this is possible for data related to location or a device’s serial number, it may be difficult (if not impossible) to anonymize the sound of a voice, or potentially sensitive personal data discussed by the individuals in the recordings.
Solutions
Having established the problem, we can begin to identify solutions. At a minimum, voices should be distorted so that speakers cannot be identified. Alternatively, analysts could be given transcripts instead of audio recordings.
Another threat lies in devices recording the audio of external third parties.
For example, although the owner of a smart assistant may have consented to the recording of his or her data, a bystander who happens to have a conversation with the owner would not have consented to the recording of his or her data.
Perhaps the AI underpinning these machines should be trained to recognize the data it is recording and to avoid recording unfamiliar voices.
Or perhaps we could require the machine to ask the user for explicit consent to record information on each occasion, as is required under GDPR when processing sensitive data.
The Hamburg Data Protection Authority, a German privacy watchdog, ordered Google to cease the human review of these voice recordings in order to “protect the rights and freedoms of data subjects” (Article 66 GDPR), a first in GDPR history. Note that Google had already suspended this activity in the EU.
This processing also caught the eye of the Irish Data Protection Commission and the National Commission for Data Protection in Luxembourg, which are looking more closely into the issue. Nonetheless, it seems that no official complaints or sanctions have been issued by the member states of the European Union.
With respect to the tech giants’ responses, Apple has issued an apology, and Amazon said it will add an option to auto-delete voice recordings to its new smart clocks, smart ovens, high-end speakers, ear buds and glasses.
What is certain for now is that all four companies have hit the brakes on their legally questionable use of users’ personal data.
Tax aspects
On the tax side, data privacy is all-important, or is it? Section 231 of the Israel Income Tax Ordinance requires tax officials to keep taxpayers’ information private. But the Israel Tax Authority uses detailed VAT reporting and big data to compare all reported purchases with all reported sales in Israel, with a view to detecting undeclared (“black”) income.
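The cross-matching described above can be illustrated with a toy sketch. This is a hypothetical illustration only, not the Tax Authority’s actual system: the function name, data layout and figures are all invented for the example. The idea is simply that what one business reports as a sale should reappear as a purchase in the other business’s VAT return.

```python
# Toy illustration (hypothetical data): cross-matching VAT reports.
# A seller's reported sales to a given buyer should match that buyer's
# reported purchases from the seller; discrepancies may indicate
# unreported ("black") income.

def flag_discrepancies(sales, purchases, tolerance=0.0):
    """sales/purchases: dicts mapping (seller_id, buyer_id) -> total amount.

    Returns a sorted list of (pair, reported_sale, reported_purchase)
    for every pair where the two sides disagree by more than tolerance.
    """
    flags = []
    for pair in set(sales) | set(purchases):
        reported_sale = sales.get(pair, 0.0)       # seller's side of the story
        reported_purchase = purchases.get(pair, 0.0)  # buyer's side
        if abs(reported_sale - reported_purchase) > tolerance:
            flags.append((pair, reported_sale, reported_purchase))
    return sorted(flags)

# Buyer B claims a 10,000 purchase from seller A, but A reports no sale,
# and B also claims a 500 purchase from C that C never reported.
sales = {("A", "B"): 0.0}
purchases = {("A", "B"): 10000.0, ("C", "B"): 500.0}
print(flag_discrepancies(sales, purchases))
# → [(('A', 'B'), 0.0, 10000.0), (('C', 'B'), 0.0, 500.0)]
```

In practice such matching runs over millions of invoice-level records, but the principle is the same: two independent reports of the same transaction are reconciled, and mismatches are flagged for review.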
Tax treaties contain information-exchange provisions. The US FATCA and OECD common reporting standard (CRS) require banks and financial institutions to report customer balances and details to the tax authorities in the customers’ home countries. In the UK, where the CRS is well implemented, all that can “go wrong” is that the wrong Mr. Smith might face questions about an undisclosed foreign bank account.
In short, privacy is fast becoming a luxury, Alexa.
As always, consult experienced legal and tax advisers in each country at an early stage in specific cases.
Deborah Tastiel (Deborah.Tastiel@asserson.co.uk) is an intellectual property lawyer at Asserson, an English law firm in the heart of Tel Aviv.
Leon Harris (leon@h2cat.com) is a certified public accountant and tax specialist at Harris Horoviz Consulting & Tax Ltd.