In 2019, a major privacy scandal surrounding Siri, the voice assistant from Apple (headquartered in California), came to public attention. At the time, employees of Apple subcontractors reported that Siri audio recordings were being reviewed by human analysts and that data protection was disregarded in the process. The recordings not infrequently contained very intimate, personal, or even compromising details that made it possible to identify the speaker.

As a result, a class action lawsuit was filed accusing Apple not only of violating California privacy law, but also of breach of contract, of violating the Federal Wiretap Act, which sets privacy requirements for communications services, and of unfair competition.

What had happened?

The accusation against Apple is that the voice assistant Siri listened in too often. Normally, the system is activated by so-called wake words such as "Hey Siri" before it makes audio recordings and evaluates them.

In this case, however, employees of Apple subcontractors discovered that Siri was recording and evaluating audio even in situations in which it was not supposed to be listening, and was passing the corresponding information on to third parties. Those third parties used the data primarily for advertising purposes.

For example, some class action plaintiffs reported receiving advertisements for a medical treatment after discussing it with their doctors, or advertisements for a restaurant after private conversations about that very restaurant.

In these cases, Apple speaks of an "unintentional activation" of the voice assistant. Immediately after the accusations surfaced in 2019, Apple suspended the human evaluation of Siri audio recordings and announced data protection improvements. By the end of the same year, however, human review had resumed for users who gave one-time consent to further processing. Apple says it no longer evaluates recordings that were made unintentionally, but deletes them immediately. In addition, only the company's own employees are supposed to have access to Siri recordings.

Users still cannot view their own Siri queries, and deleting these queries and the associated data appears to be possible only to a very limited extent.

Development in court

The presiding judge has now allowed the class action to proceed in large part; only the unfair competition claim was dismissed. Apple must now answer the data protection allegations in court.

Similar complaints have already been filed against the voice assistants of competing providers such as Google and Amazon, and comparable proceedings are being pursued in the USA, led by the same plaintiffs' attorneys.

The outcomes of those cases are still pending.

Investigations also required in the EU

A whistleblower who worked for one of Apple's subcontractors is now also calling for data protection consequences for Apple in the EU. In his view, exposing these practices and holding the company accountable is so important that he broke his confidentiality agreement with the subcontractor to do so.

He has written an open letter addressed to the European data protection authorities and the European Data Protection Supervisor. In it, he calls for an investigation into the company's past and current practices and a review of all data collected.

He does not limit his claims and concerns to Apple's Siri, but extends them to all voice assistants.

Whether such an investigation, judicial or regulatory, will actually take place in the EU remains to be seen.
