Apple Dropped Plan for Encrypting iCloud Backups →

January 21, 2020 · 15:11

Joseph Menn, reporting for Reuters:

More than two years ago, Apple told the FBI that it planned to offer users end-to-end encryption when storing their phone data on iCloud, according to one current and three former FBI officials and one current and one former Apple employee.

Under that plan, primarily designed to thwart hackers, Apple would no longer have a key to unlock the encrypted data, meaning it would not be able to turn material over to authorities in a readable form even under court order.

In private talks with Apple soon after, representatives of the FBI’s cyber crime agents and its operational technology division objected to the plan, arguing it would deny them the most effective means for gaining evidence against iPhone-using suspects, the government sources said.

When Apple spoke privately to the FBI about its work on phone security the following year, the end-to-end encryption plan had been dropped, according to the six sources. Reuters could not determine why exactly Apple dropped the plan.

“Legal killed it, for reasons you can imagine,” another former Apple employee said he was told, without any specific mention of why the plan was dropped or if the FBI was a factor in the decision.

That person told Reuters the company did not want to risk being attacked by public officials for protecting criminals, sued for moving previously accessible data out of reach of government agencies or used as an excuse for new legislation against encryption.

If this is true, then Apple’s pro-privacy stance only holds if you refrain from using iCloud. Unfortunately, iCloud Backup is the only automatic backup system iOS supports, although you can switch back to making local, encrypted iTunes backups instead. We of course have no real way of knowing whether our particular backups were accessed, but I assume nobody is searching the data of people who stay out of legal trouble.

That said, Apple should definitely introduce end-to-end encryption for iCloud backups, or at the very least educate its users about the risks of using iCloud Backup.


FBI Hacker Says Apple Are ‘Jerks’ and ‘Evil Geniuses’ for Encrypting iPhones →

January 12, 2018 · 10:29

Lorenzo Franceschi-Bicchierai, writing for Motherboard:

On Wednesday, at the International Conference on Cyber Security in Manhattan, FBI forensic expert Stephen Flatley lashed out at Apple, calling the company “jerks,” and “evil geniuses” for making his and his colleagues’ investigative work harder. For example, Flatley complained that Apple recently made password guesses slower, changing the hash iterations from 10,000 to 10,000,000.

I’m glad his work is made harder, and I can’t help but wonder which smartphone he uses privately — and whether he’d want it to be unencrypted.
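The change Flatley is complaining about is classic key stretching: deriving the key with many hash iterations makes every password guess proportionally more expensive for an attacker. A minimal sketch using PBKDF2 (illustrative parameters only; this is not Apple’s actual key-derivation scheme):

```python
import hashlib
import time

def derive_key(password: str, salt: bytes, iterations: int) -> bytes:
    # PBKDF2-HMAC-SHA256: the iteration count linearly scales the work
    # an attacker must spend on every single password guess.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)

salt = b"fixed-demo-salt"  # in practice: random and per-user

# Far smaller counts than 10,000,000 so the demo runs quickly;
# the point is that cost grows linearly with the iteration count.
for iterations in (10_000, 100_000):
    start = time.perf_counter()
    key = derive_key("hunter2", salt, iterations)
    elapsed = time.perf_counter() - start
    print(f"{iterations:>7} iterations: {elapsed:.4f}s")
```

Going from 10,000 to 10,000,000 iterations makes each guess a thousand times slower for attacker and legitimate user alike; the count is tuned so a single unlock stays imperceptible while a brute-force run becomes impractical.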


About Face ID advanced technology →

October 19, 2017 · 12:21

Apple published a support document, detailing some interesting features and functions of Face ID.

Face ID automatically adapts to changes in your appearance, such as wearing cosmetic makeup or growing facial hair. If there is a more significant change in your appearance, like shaving a full beard, Face ID confirms your identity by using your passcode before it updates your face data. Face ID is designed to work with hats, scarves, glasses, contact lenses, and many sunglasses. Furthermore, it’s designed to work indoors, outdoors, and even in total darkness.

Face ID will be a problem for people who wear anti-smog masks, which are common across much of Asia. This could potentially be solved by enrolling two faces — with and without a mask on — but as far as I understand, it is currently only possible to enroll one face per device. This could change in the future.

Face ID data – including mathematical representations of your face – is encrypted and protected with a key available only to the Secure Enclave.

The probability that a random person in the population could look at your iPhone X and unlock it using Face ID is approximately 1 in 1,000,000 (versus 1 in 50,000 for Touch ID). As an additional protection, Face ID allows only five unsuccessful match attempts before a passcode is required. The statistical probability is different for twins and siblings that look like you and among children under the age of 13, because their distinct facial features may not have fully developed. If you’re concerned about this, we recommend using a passcode to authenticate.
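Those two rates compound over the five allowed attempts, which is worth spelling out: even with five tries, a random person’s overall chance stays around five in a million. A quick back-of-the-envelope check:

```python
# Quoted false-accept rates: Face ID 1 in 1,000,000; Touch ID 1 in 50,000.
# Face ID allows five attempts before forcing a passcode; treating the
# attempts as independent draws (a simplification), the chance a random
# person gets in at least once is:
p_face = 1 - (1 - 1 / 1_000_000) ** 5
print(f"{p_face:.2e}")   # 5.00e-06, about five in a million

# The same five-attempt calculation for Touch ID's quoted rate:
p_touch = 1 - (1 - 1 / 50_000) ** 5
print(f"{p_touch:.2e}")  # 1.00e-04, about one in ten thousand
```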

I would be extremely interested in seeing Face ID tested on twins. Luckily, I’m sure someone will attempt it.

Face ID matches against depth information, which isn’t found in print or 2D digital photographs. It’s designed to protect against spoofing by masks or other techniques through the use of sophisticated anti-spoofing neural networks. Face ID is even attention-aware. It recognizes if your eyes are open and looking towards the device. This makes it more difficult for someone to unlock your iPhone without your knowledge (such as when you are sleeping).

I won’t even try spoofing it with a photo, the way I successfully spoofed my Galaxy S8 review unit — I’m pretty sure they’ve got this covered.

Face ID data – including mathematical representations of your face – is encrypted and protected by the Secure Enclave. This data will be refined and updated as you use Face ID to improve your experience, including when you successfully authenticate. Face ID will also update this data when it detects a close match but a passcode is subsequently entered to unlock the device.

Face ID data doesn’t leave your device and is never backed up to iCloud or anywhere else.

Peace of mind.

Even if you don’t enroll in Face ID, the TrueDepth camera intelligently activates to support attention aware features, like dimming the display if you aren’t looking at your iPhone or lowering the volume of alerts if you’re looking at your device. For example, when using Safari, your device will check to determine if you’re looking at your device and turns the screen off if you aren’t. If you don’t want to use these features, you can open Settings > General > Accessibility, and disable Attention Aware Features.

Others have done this before, but it appears that Apple’s approach to implementing this feature is superior — at least it won’t pause video playback when a person looks away.

Within supported apps, you can enable Face ID for authentication. Apps are only notified as to whether the authentication is successful. Apps can’t access Face ID data associated with the enrolled face.

Craig Federighi has already mentioned that apps which support Touch ID but haven’t been updated for Face ID will work “out of the box”.

The system will not cause any harm to eyes or skin, due to its low output. It’s important to know that the infrared emitters could be damaged during repair or disassembly, so your iPhone should always be serviced by Apple or an authorized service provider. The TrueDepth camera system incorporates tamper-detection features. If tampering is detected, the system may be disabled for safety reasons.

I’m sure some people will complain about their TrueDepth camera being deactivated after an unauthorised screen replacement or other service work, but I prefer to have peace of mind in this regard.


While I’m still not sold on Face ID — it could turn out to be a hassle — I’m very curious about the attention-aware features. Those could be a really nice perk.


iPhone Secure Enclave Firmware Key Found →

August 21, 2017 · 08:49

David Schuetz:

Earlier today, it was reported that a hacker/researcher called “xerub” had released the encryption key, and tools to use it, for the firmware that runs the Secure Enclave Processor (SEP) on iPhone 5S. Reporting was…breathless. Stories suggested that this move was “destroying key piece of iOS mobile security,” and that we should “be on the lookout for Touch ID hacks” and “password harvesting scams.”

Is it really that bad? No, not really (…)

What was released today was the key to decrypt that firmware, but not a key to decrypt the region of disk used by the SE to store data. So now we can actually reverse-engineer the SE system, and hopefully gain a much better understanding of how it works. But we can’t decrypt the data it processes.


Leaked NSA Malware Threatens Windows Users Around the World →

April 20, 2017 · 13:57

Sam Biddle:

The ShadowBrokers, an entity previously confirmed by The Intercept to have leaked authentic malware used by the NSA to attack computers around the world, today released another cache of what appears to be extremely potent (and previously unknown) software capable of breaking into systems running Windows. The software could give nearly anyone with sufficient technical knowledge the ability to wreak havoc on millions of Microsoft users.

Keep your system up-to-date!


UK Government Renews Calls for WhatsApp Backdoor After London Attack →

March 27, 2017 · 12:13

James Vincent:

Following last week’s terrorist attack in London, the UK government has renewed a familiar campaign against digital encryption. Echoing criticisms made in 2015 by then prime minister David Cameron after the Charlie Hebdo attacks in Paris, UK home secretary Amber Rudd this weekend described the government’s inability to read messages on end-to-end encrypted messaging apps as “completely unacceptable.”

And so it begins… again. (Sigh.)


Announcing the First SHA1 Collision →

February 23, 2017 · 20:22

Google Security Blog:

Today, 10 years after SHA-1 was first introduced, we are announcing the first practical technique for generating a collision. This represents the culmination of two years of research that sprung from a collaboration between the CWI Institute in Amsterdam and Google. We’ve summarized how we went about generating a collision below. As a proof of the attack, we are releasing two PDFs that have identical SHA-1 hashes but different content.
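A collision means two different inputs that hash to the same digest — something a sound hash function should make computationally infeasible to find. A minimal sketch of the property being broken (with trivially different inputs, which of course do not collide; the crafted PDFs are the achievement here):

```python
import hashlib

a = b"first document"
b = b"second document"

digest_a = hashlib.sha1(a).hexdigest()
digest_b = hashlib.sha1(b).hexdigest()

# Different inputs normally produce different 160-bit digests:
assert a != b and digest_a != digest_b
print(digest_a)
print(digest_b)

# The announced attack constructs two *crafted* inputs x != y with
# sha1(x) == sha1(y), breaking exactly this assumption and, with it,
# SHA-1's fitness for signatures and integrity checking.
```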


Quincy Larson: “I’ll Never Bring My Phone on an International Flight Again” →

February 18, 2017 · 14:33

Quincy Larson:

It’s only a matter of time before downloading the contents of people’s phones becomes a standard procedure for entering every country. This already happens in Canada. And you can bet that countries like China and Russia aren’t far behind (…)

When you travel internationally, you should leave your mobile phone and laptop at home (…)

Is all this inconvenient? Absolutely. But it’s the only sane course of action when you consider the gravity of your data falling into the wrong hands.

If you bother locking your doors at night, you should bother securing your phone’s data during international travel.

At this point in our history, seeing what’s happening in the US, I would definitely not bring my iPhone with me. A cheap, secondary smartphone, which I can configure with what I need after I’ve been let in, would be more than sufficient.


Confide: A Quick Look →

February 16, 2017 · 13:05

Jonathan Zdziarski:

My inbox has been lighting up with questions about Confide, after it was allegedly found to have been used by staffers at the White House. I wish I had all of the free time that reporters think I have (I’d be so happy, living life as a broke beach bum). I did spend a little bit of time, however, reverse engineering the binary and doing a simple forensic examination of it. Here’s my “literature in a rush” version.


Scotland Yard Accuses Man of Terrorism; One Count for Using HTTPS on His Blog →

October 10, 2016 · 18:25

Metropolitan Police:

Count 3: Preparation for terrorism. Between 31 December 2015 and 22 September 2016 Samata Ullah, with the intention of assisting another or others to commit acts of terrorism, engaged in conduct in preparation for giving effect to his intention namely, by researching an encryption programme, developing an encrypted version of his blog site and publishing the instructions around the use of programme on his blog site. Contrary to section 5 Terrorism Act 2006.

I can understand the other charges, but how is using HTTPS a criminal offence?

Rick Falkvinge has a few interesting comments on the subject:

(…) four years ago, I predicted that the UK won’t just jail you for encryption, but for carrying astronomical noise, too. It’s already a crime to not give up keys to an encrypted document in the UK (effectively making encryption illegal), but it’s worse than that – it’s a five-years-in-prison offense to not give up the keys to something that appears encrypted to law enforcement, but may not actually be. In other words, carrying astronomical noise is a jailable offense, because it is indistinguishable from something encrypted, unless you can pull the documents the police claim are hidden in the radio noise from a magic hat. This case takes the UK significantly closer to such a reality, with charging a person for terrorism (!) merely for following privacy best practices.
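Falkvinge’s premise is technically sound: a good cipher’s output is statistically indistinguishable from random noise, so law enforcement genuinely cannot tell “astronomical noise” from an encrypted file. A rough sketch of why, comparing byte-level Shannon entropy (with `os.urandom` standing in for ciphertext, since a modern cipher’s output looks the same to this measure):

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte (maximum: 8.0)."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Random bytes stand in for ciphertext; to an observer without the key,
# well-encrypted data is statistically indistinguishable from this.
noise = os.urandom(100_000)
print(f"noise: {shannon_entropy(noise):.3f} bits/byte")  # close to the 8.0 maximum

# Structured plaintext, by contrast, has far lower entropy:
text = b"the quick brown fox jumps over the lazy dog " * 2000
print(f"text:  {shannon_entropy(text):.3f} bits/byte")
```

Nothing in the noise proves a hidden document exists, which is what makes “hand over the keys” demands for apparent ciphertext so troubling.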


How to Crack Android Full Disk Encryption on Qualcomm Devices →

July 25, 2016 · 09:39

Mohit Kumar:

Android users are at severe risk when it comes to encryption of their personal and sensitive data.

Android’s full-disk encryption can be cracked much more easily than expected with brute force attack and some patience, affecting potentially hundreds of millions of mobile devices.

And the worst part: There may not be a full fix available for current Android handsets in the market.


How iMessage Distributes Security to Block “Phantom Devices” →

April 22, 2016 · 12:39

Securosis:

Overall it’s a solid balance of convenience and security. Especially when you consider there are a billion Apple devices out there. iMessage doesn’t eliminate the need for true zero-knowledge messaging systems, but it is extremely secure, especially when you consider that it’s basically a transparent replacement for text messaging.

This is a good read if you’re interested in the security of iMessage. It’s very secure as it stands, and I’m sure Apple will continue to improve it.


New Bill Would Require Companies to Decrypt Data on Demand →

April 10, 2016 · 13:00

Russell Brandom:

If the bill becomes law, Apple and other companies will have a much harder time resisting similar legal demands. Essentially any hard encryption — that is, encryption that cannot be broken by the company providing it — would be in violation of the proposed measures, presenting a massive problem for a broad range of tech companies.

I did not expect to see a bill this quickly. Quite frankly, I expected people to be intelligent and not even try to pass this sort of garbage.

My bad.


WhatsApp Just Switched on Encryption →

April 6, 2016 · 21:24

Cade Metz:

This means that if any group of people uses the latest version of WhatsApp—whether that group spans two people or ten—the service will encrypt all messages, phone calls, photos, and videos moving among them. And that’s true on any phone that runs the app, from iPhones to Android phones to Windows phones to old school Nokia flip phones. With end-to-end encryption in place, not even WhatsApp’s employees can read the data that’s sent across its network. In other words, WhatsApp has no way of complying with a court order demanding access to the content of any message, phone call, photo, or video traveling through its service. Like Apple, WhatsApp is, in practice, stonewalling the federal government, but it’s doing so on a larger front—one that spans roughly a billion devices.

I can’t help but wonder if/when encryption will be illegal in the United States, UK, and France — these three countries seem to be the ones who want it gone most. It should of course never come to that. And I truly hope it doesn’t.

Also: Wired’s title is completely baffling. We should never forget about the Apple vs. FBI kerfuffle.


Apple’s Statement on Closing of the San Bernardino Case →

March 29, 2016 · 07:20

Rene Ritchie posted Apple’s statement on iMore:

From the beginning, we objected to the FBI’s demand that Apple build a backdoor into the iPhone because we believed it was wrong and would set a dangerous precedent. As a result of the government’s dismissal, neither of these occurred. This case should never have been brought.

We will continue to help law enforcement with their investigations, as we have done all along, and we will continue to increase the security of our products as the threats and attacks on our data become more frequent and more sophisticated.

Apple believes deeply that people in the United States and around the world deserve data protection, security and privacy. Sacrificing one for the other only puts people and countries at greater risk.

This case raised issues which deserve a national conversation about our civil liberties, and our collective security and privacy. Apple remains committed to participating in that discussion.

Though this particular case is over, the war goes on, and I’m certain this issue will appear in the news sooner or later.


Apple’s San Bernardino Fight Is Over as FBI Gains Access to iPhone →

March 29, 2016 · 05:52

Russell Brandom:

After months of work, the FBI finally has a way into the San Bernardino iPhone. In a court filing today, prosecutors told the court the new method for breaking into the phone is sound, and Apple’s assistance is no longer required. “The government has now successfully accessed the data stored on Farook’s iPhone,” the filing reads, “and therefore no longer requires assistance from Apple.” The filing provides no further details on the nature of the new method. Still, the result effectively finishes the court fight that has consumed Apple since February.

The question is: will they now push Congress to ban encryption, or try to weaken it by law?


Apple to Hand iCloud Encryption Keys to Users →

March 18, 2016 · 19:12

Wayne Rash:

According to a number of press reports, Apple is in the process of revamping its iCloud storage service to increase security by divesting itself of the task of keeping users’ encryption keys.

Currently Apple keeps the keys to access iCloud accounts, which means, among other things, that Apple can provide information to authorities when presented with a warrant. The company provided such information from the iCloud account of Sayed Farook, the terrorist who killed 14 county employees late last year in San Bernardino, Calif. Apparently that’s now about to change. If the reports are correct, Apple is planning to offload the storage of encryption keys so that users control their keys, and they’re accessible only through a password.

This way, even Apple cannot gain access to your encrypted data, no matter how much it may want to and no matter how many government subpoenas it receives. It can’t honor court orders to provide the data because the company has no way to decrypt it.

This is to be expected. I’d like to think that Apple would have gone down this route without the current FBI fiasco taking place, but perhaps the latest events have just accelerated their plans.


Apple Encryption Engineers, if Ordered to Unlock iPhone, Might Resist →

March 18, 2016 · 19:07

John Markoff, Katie Benner & Brian X. Chen:

Apple employees are already discussing what they will do if ordered to help law enforcement authorities. Some say they may balk at the work, while others may even quit their high-paying jobs rather than undermine the security of the software they have already created, according to more than a half-dozen current and former Apple employees.

Among those interviewed were Apple engineers who are involved in the development of mobile products and security, as well as former security engineers and executives.

I can’t help but wonder how far this will go.


Facebook, Google and WhatsApp Plan to Increase Encryption of User Data →

March 14, 2016 · 20:38

Danny Yadron:

Silicon Valley’s leading companies – including Facebook, Google and Snapchat – are working on their own increased privacy technology as Apple fights the US government over encryption, the Guardian has learned.

The projects could antagonize authorities just as much as Apple’s more secure iPhones, which are currently at the center of the San Bernardino shooting investigation. They also indicate the industry may be willing to back up their public support for Apple with concrete action.

Within weeks, Facebook’s messaging service WhatsApp plans to expand its secure messaging service so that voice calls are also encrypted, in addition to its existing privacy features. The service has some one billion monthly users. Facebook is also considering beefing up security of its own Messenger tool.

Snapchat, the popular ephemeral messaging service, is also working on a secure messaging system and Google is exploring extra uses for the technology behind a long-in-the-works encrypted email project.

At this point in time I would like to see more action from the other tech companies — this is obviously a delicate situation, but too much is at stake.


The Sequel to the Crypto Wars →

March 14, 2016 · 20:13

Steven Levy:

As with the first round of the crypto wars, the stakes could not be higher. Once again, the government is seeking to control that genie first released by Diffie and Hellman. But the physics of computer security have not changed. Last July, a panel of fifteen eminent security specialists and cryptographers — many of whom are veterans of the first crypto war — released a report confirming there was no way for the government to demand a means of bypassing encryption without a dire compromise of security. It just doesn’t work.

There is no middle ground.


Barack Obama: ‘Smartphones Can’t Be Allowed to Be Black Boxes’ →

March 13, 2016 · 10:38

Justin Sink:

President Barack Obama said Friday that smartphones — like the iPhone the FBI is trying to force Apple Inc. to help it hack — can’t be allowed to be “black boxes,” inaccessible to the government. The technology industry, he said, should work with the government instead of leaving the issue to Congress.

“You cannot take an absolutist view on this,” Obama said at the South by Southwest festival in Austin, Texas. “If your argument is strong encryption no matter what, and we can and should create black boxes, that I think does not strike the kind of balance we have lived with for 200, 300 years, and it’s fetishizing our phones above every other value.”

I’m disappointed in Obama. I also don’t think he knows exactly what he’s talking about.


WhatsApp Encryption Targeted by DOJ →

March 13, 2016 · 10:35

Matt Apuzzo:

But in late 2014, the company said that it would begin adding sophisticated encoding, known as end-to-end encryption, to its systems. Only the intended recipients would be able to read the messages.

“WhatsApp cannot provide information we do not have,” the company said this month when Brazilian police arrested a Facebook executive after the company failed to turn over information about a customer who was the subject of a drug trafficking investigation.

The iPhone case, which revolves around whether Apple can be forced to help the F.B.I. unlock a phone used by one of the killers in last year’s San Bernardino, Calif., massacre, has received worldwide attention for the precedent it might set. But to many in law enforcement, disputes like the one with WhatsApp are of far greater concern.

For more than a half-century, the Justice Department has relied on wiretaps as a fundamental crime-fighting tool. To some in law enforcement, if companies like WhatsApp, Signal and Telegram can design unbreakable encryption, then the future of wiretapping is in doubt.


Warrant-Proof Places →

March 13, 2016 · 10:13

Jonathan Zdziarski:

We, as everyday Americans, should also encourage the idea of warrant proof places. The DOJ believes, quite erroneously, that the Fourth Amendment gives them the right to any evidence or information they desire with a warrant. The Bill of Rights did not grant rights to the government; it protected the rights of Americans from the overreach that was expected to come from government. Our most intimate thoughts, our private conversations, our ideas, our -intent- are all things our phone tracks. These are concepts that must remain private (if we choose to protect them) for any functioning free society. In today’s technological landscape, we are no longer giving up just our current or future activity under warrant, but for the first time in history, making potentially years of our life retroactively searchable by law enforcement. Things are recorded in ways today that no one would have imagined, even when CALEA was passed. The capability that DOJ is asserting is that our very lives and identities – going back across years – are subject to search. The Constitution never permitted this.


Craig Federighi on iOS Security for the Washington Post →

March 7, 2016 · 09:57

Craig Federighi:

Security is an endless race — one that you can lead but never decisively win. Yesterday’s best defenses cannot fend off the attacks of today or tomorrow. Software innovations of the future will depend on the foundation of strong device security. We cannot afford to fall behind those who would exploit technology in order to cause chaos. To slow our pace, or reverse our progress, puts everyone at risk.

This is not just about protecting the data on our phones. It is about keeping our entire lives private, because we now store them on miniature computers in our pockets.


FBI & DA Misleading Courts and Public for their Own Agenda →

March 6, 2016 · 10:53

Brandon Bailey:

But the idea that Farook might have used the phone to transmit a “lying-dormant cyber pathogen” into county data systems is a new one. Ramos’ office, however, cited it in a court filing Thursday among several other reasons to support the government’s position.

“This was a county employee that murdered 14 people and injured 22,” Ramos said. “Did he use the county’s infrastructure? Did he hack into that infrastructure? I don’t know. In order for me to really put that issue to rest, there is one piece of evidence that would absolutely let us know that, and that would be the iPhone.”

The argument drew condemnation from one software expert who has signed a brief in support of Apple’s position.

“Ramos’s statements are not only misleading to the court, but amount to blatant fear mongering,” independent software researcher Jonathan Zdziarski wrote in a post on his personal blog.

Other security experts who haven’t taken sides also discounted the scenario. “It’s definitely possible, technically, but it doesn’t seem to me at first glance to be likely,” said David Meltzer, a computer security expert and chief research officer at Tripwire, a commercial IT security firm. He said Apple’s iPhone operating system is a relatively closed environment that’s designed so users can’t easily introduce their own programs.

Ramos, meanwhile, said he’d heard about social media posts that mocked the term “cyber pathogen,” which is not generally used by tech experts. “When they do that,” he said, “they’re mocking the victims of this crime, of this horrible terrorist attack.”

Using the victims of a terrorist attack to further your own agenda, however, is much worse.