By John Cox, Network World | July 22nd, 2014
Apple has “inadvertently admitted” to creating a “backdoor” in iOS, according to a new post by a forensics scientist, iOS author and former hacker, who this week created a stir when he posted a presentation laying out his case.
Apple has created “several services and mechanisms” that let Apple — and, potentially, government agencies or malicious third parties — extract large amounts of personal data from iOS devices, says Jonathan Zdziarski. There is, he says, no way to shut off this data leakage, and end users never grant explicit consent to it.
He made his case in a talk, “Identifying back doors, attack points, and surveillance mechanisms in iOS devices,” [available in PDF] at the annual HOPE X hackers conference last week in New York City. The talk was based on a paper published in the March issue of “Digital Investigation,” which can be ordered online.
Essentially, Zdziarski says that Apple has, over time, deliberately added several “undocumented high-value forensic services” in iOS, along with “suspicious design omissions…that make collection easier.” The result is that these services can copy a wide range of a user’s personal data and bypass Apple’s backup encryption. That gives Apple — and potentially government agencies such as the National Security Agency, or simply attackers intent on exploiting these services — the ability to extract personal data without the user knowing it is happening.
In the past two years, Apple has become much more open about the iOS security architecture, and how and why it’s making changes to it, according to security professionals and IT consultants who are praising both the company’s transparency and its approach to protecting iOS devices, Internet security and users’ data. [see “Apple reveals unprecedented details in iOS security”] The latest Apple-authored iOS Security whitepaper is available as a PDF.
The Zdziarski presentation slides were the basis of a round of summarizing news and blog postings about his claims, such as this one at ZDNet by Jason O’Grady. But Apple responded officially to a query by Rene Ritchie, editor of the Apple-focused iMore website, saying it had never worked with “any government agency…to create a backdoor in any of our products.” Here’s the text of the Apple response, as posted by Ritchie:
“We have designed iOS so that its diagnostic functions do not compromise user privacy and security, but still provides needed information to enterprise IT departments, developers and Apple for troubleshooting technical issues. A user must have unlocked their device and agreed to trust another computer before that computer is able to access this limited diagnostic data. The user must agree to share this information, and data is never transferred without their consent. As we have said before, Apple has never worked with any government agency from any country to create a backdoor in any of our products or services.”
The last sentence, which is the one Ritchie emphasized in his headline and at the top of his blogpost, is very carefully worded. Zdziarski doesn’t allege that Apple worked with the NSA, or any other agency, to create a backdoor. He alleges that Apple itself created undocumented services, which can be used by Apple, and potentially by someone like the NSA, to extract personal data.
Zdziarski did his own parsing of the Apple statement, in a new post on his blog: “[I]t looks like Apple might have inadvertently admitted that, in the classic sense of the word, they do indeed have back doors in iOS, however [they] claim that the purpose is for ‘diagnostics’ and ‘enterprise.’”
Ritchie succinctly summarizes the mechanism identified by Zdziarski, which involves two explicit decisions by the end user: “When you connect your iPhone or iPad to iTunes on Mac or Windows — and choose to trust that computer — a pairing record is created that maintains that trust for future connections. Zdziarski claimed that if someone takes physical possession of that computer, they can steal those pairing records, connect to your device, and retrieve your personal information and/or enable remote logging. If they don’t have your computer, Zdziarski claimed they can try and generate a pairing record by tricking you into connecting to a compromised accessory, like a dock (juice jacking), and/or by using mobile device management (MDM) tools intended for [the] enterprise to get around safeguards like Apple’s Trusted Device requestor.”
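To make the pairing-record mechanism concrete: on a trusted computer, those records are stored as property-list files, conventionally under /var/db/lockdown on macOS (and C:\ProgramData\Apple\Lockdown on Windows), one file per device UDID. The sketch below, which assumes that conventional location and is not drawn from Zdziarski's paper, simply enumerates whatever records are present; reading the directory typically requires root privileges.

```python
import os
import plistlib

# Conventional macOS location of lockdownd/iTunes pairing records
# (an assumption; the path is not specified in the article itself).
LOCKDOWN_DIR = "/var/db/lockdown"

def list_pairing_records(directory=LOCKDOWN_DIR):
    """Return (device_udid, record_keys) pairs for each pairing-record
    plist found in `directory`. Each record holds the certificates and
    keys that let a previously trusted computer reconnect to a device
    without the user being prompted again."""
    records = []
    if not os.path.isdir(directory):
        return records
    for name in sorted(os.listdir(directory)):
        if not name.endswith(".plist"):
            continue
        path = os.path.join(directory, name)
        try:
            with open(path, "rb") as f:
                data = plistlib.load(f)
        except (OSError, plistlib.InvalidFileException):
            continue  # unreadable without privileges, or not a valid plist
        records.append((name[: -len(".plist")], sorted(data.keys())))
    return records

if __name__ == "__main__":
    for udid, keys in list_pairing_records():
        print(udid, keys)
```

The point of the sketch is only to show why physical access to the computer matters in Zdziarski's scenario: the trust relationship lives in a handful of ordinary files, so copying them is equivalent to stealing the pairing.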
In his own post, Zdziarski repeats his contention that pairing records can be stolen in all kinds of ways, and acknowledges that every operating system has legitimate diagnostic capabilities. But he’s not convinced by the Apple statement.
“I don’t buy for a minute that these services are intended solely for diagnostics,” he writes. “The data they leak is of an extreme personal nature. There is no notification to the user. A real diagnostic tool would have been engineered to respect the user, prompt them like applications do for access to data, and respect backup encryption. Tell me, what is the point in promising the user encryption if there is a back door to bypass it?”
In a separate post, Zdziarski says he is not accusing Apple of working with the NSA, nor is he “suggesting some grand conspiracy.”
“[T]here are, however, some services running in iOS that shouldn’t be there, that were intentionally added by Apple as part of the firmware, and that bypass backup encryption while copying more of your personal data than ever should come off the phone for the average consumer,” he writes. “I think at the very least, this warrants an explanation and disclosure to the some 600 million customers out there running iOS devices….My hope is that Apple will correct the problem. Nothing less, nothing more. I want these services off my phone. They don’t belong there.”
John Cox covers wireless networking and mobile computing for “Network World.”