IBM Prohibits Siri Due to Privacy Breach Concerns

IBM bans more than just Siri: Dropbox and iCloud are also unwelcome at the company.

Privacy was a big concern for Siri's engineers from the start, according to the lead developer of the original Siri iPhone app, which Apple later acquired.

Alongside Dropbox and iCloud, IBM Has Officially Banned Siri

This week, International Business Machines (IBM) CIO Jeanette Horan told MIT's Technology Review that her company has officially banned Siri out of concern that spoken commands may be recorded and stored outside the company.

Horan is right to be concerned, as it turns out. Apple states as much in its iPhone Software License Agreement: "When you use Siri or Dictation, the things you say will be captured and forwarded to Apple to be converted into text."

To improve its performance, Siri also gathers a great deal of additional data, including names from your address book and other unspecified user information.

Along these lines, the American Civil Liberties Union has issued a warning about Siri, noting that some of the data it collects can be quite sensitive.

According to Edward Wrenbeck, lead developer of the original Siri iPhone app, which Apple subsequently purchased, privacy was a major issue for Siri's developers from the beginning. For business users, the dangers are considerably greater: he says that merely revealing that you are at a particular customer's site could breach a non-disclosure agreement.

However, he acknowledges that many of the concerns raised about Apple's handling of Siri data apply equally to other online services.

Siri is not the only helpful tool to be banned at IBM. Dropbox is unwelcome, as is Apple's iCloud. Some IBM employees even face restrictions on how they may access internal IBM apps and files.

In the past, it seems, IBM was rather fond of the security BlackBerry offered. But now, a mere 40,000 BlackBerrys circulate among its 400,000 employees.

What is odd is that Horan seems to believe many employees don't understand the importance of security. She told Technology Review that many were "blissfully unaware" that certain apps might not be terribly secure.

In 2019, it was revealed that outside contractors were reviewing conversations recorded by Apple's Siri. Worse yet, the voice assistant is easy to trigger accidentally and sometimes picks up private conversations, including people speaking with their doctors, conducting drug deals, and having sex.

The Apple contractor told The Guardian that the assistant often 'wakes' to the sound of a user's zipper, and that devices like the Apple Watch are frequently triggered by mistake when a person raises their wrist. Apple's consumer-facing privacy material does not mention that humans listen to these recordings.

The whistleblower told The Guardian that they had encountered numerous recordings capturing sexual encounters, financial negotiations, apparently illegal activity, and intimate conversations between doctors and patients.

Concerningly, user data like location, contact information, and app data is included with the recordings.

In response to the allegations, Apple said that only about 1% of recordings are used to measure how often Siri is triggered accidentally and to improve its responses to user requests. Even 1% is hardly negligible, however: by some estimates, 500 million Siri-capable devices are in use.
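As a rough illustration of why 1% is not negligible, here is a minimal back-of-the-envelope sketch. The 500 million device figure comes from the article; the assumed number of Siri requests per device per day is purely hypothetical, chosen only to show the scale involved.

```python
# Back-of-the-envelope estimate of how many Siri clips a 1% sample could cover.
# The device count is the article's figure; the per-device request rate is an
# illustrative assumption, not a reported statistic.
devices = 500_000_000              # Siri-capable devices in use (article figure)
requests_per_device_per_day = 2    # assumed average, purely illustrative
sample_rate = 0.01                 # Apple's stated ~1% review sample

daily_requests = devices * requests_per_device_per_day
sampled_clips_per_day = daily_requests * sample_rate
print(f"Roughly {sampled_clips_per_day:,.0f} clips could fall into a 1% sample each day")
# With these assumptions, that is about 10,000,000 clips per day.
```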

Apple maintains that it does not store data that could identify a person, such as Apple ID numbers or names, and that only a small sample of requests is reviewed to improve Siri and Dictation. Requests are not linked to the user's Apple ID, the company says, Siri responses are reviewed in secure conditions, and all reviewers must comply with Apple's strict confidentiality requirements.

