IOHIDeous — Mac 0day →
Siguza:
This is the tale of a macOS-only vulnerability in IOHIDFamily that yields kernel r/w and can be exploited by any unprivileged user.
Physical access not required. Apple is supposedly aware of it.
Keith Collins:
Since the beginning of 2017, Android phones have been collecting the addresses of nearby cellular towers—even when location services are disabled—and sending that data back to Google. The result is that Google, the unit of Alphabet behind Android, has access to data about individuals’ locations and their movements that go far beyond a reasonable consumer expectation of privacy.
Quartz observed the data collection occur and contacted Google, which confirmed the practice.
I wonder what would have happened had they not been caught, and I mean that with all the sarcasm in the world.
What scares me most is that people stopped caring about companies doing things like this. Sure, I care. Maybe even you care. But most people don’t.
Eric Newcomer:
Hackers stole the personal data of 57 million customers and drivers from Uber Technologies Inc., a massive breach that the company concealed for more than a year. This week, the ride-hailing firm ousted its chief security officer and one of his deputies for their roles in keeping the hack under wraps, which included a $100,000 payment to the attackers.
Compromised data from the October 2016 attack included names, email addresses and phone numbers of 50 million Uber riders around the world, the company told Bloomberg on Tuesday. The personal information of about 7 million drivers was accessed as well, including some 600,000 U.S. driver’s license numbers. No Social Security numbers, credit card information, trip location details or other data were taken, Uber said.
I deleted my account a year ago or so — maybe more — and have not looked back. I refuse to do business with a company this evil, which tries to sweep all of its failures under the rug.
Oleg Afonin:
The passcode. This is all that’s left of iOS security in iOS 11. If the attacker has your iPhone and your passcode is compromised, you lose your data; your passwords to third-party online accounts; your Apple ID password (and obviously the second authentication factor is not a problem). Finally, you lose access to all other Apple devices that are registered with your Apple ID; they can be wiped or locked remotely. All that, and more, just because of one passcode and stripped-down security in iOS 11.
This has been a very bad week or two for Apple.
Apple pushed a security update for the huge High Sierra vulnerability yesterday, introducing a bug while they were at it. You should install the update as soon as possible and then do this, if File Sharing isn’t working:
- Open the Terminal app, which is in the Utilities folder of your Applications folder.
- Type sudo /usr/libexec/configureLocalKDC and press Return.
- Enter your administrator password and press Return.
- Quit the Terminal app.
John Gruber summarized the problem, which seems to have been around for a few months now:
So the exploit was floating around, under the radar, for weeks at least, but it seems as though no widespread harm came of it.
Personally, I’d call this much too optimistic — people could have been hacked without them even realizing it.
iClarified:
Hacker xerub has posted the decryption key for Apple’s Secure Enclave Processor (SEP) firmware.
The security coprocessor was introduced alongside the iPhone 5s and Touch ID. It performs secure services for the rest of the SOC and prevents the main processor from getting direct access to sensitive data. It runs its own operating system (SEPOS) which includes a kernel, drivers, services, and applications […]
Decryption of the SEP Firmware will make it easier for hackers and security researchers to comb through the SEP for vulnerabilities.
Thuy Ong, writing for The Verge:
Now security researchers have found that the camera can be disabled and frozen from a program run from any computer within Wi-Fi range, reports Wired. That means a customer watching a delivery will only see a closed door, even if someone opens the door and goes inside — a vulnerability that may allow rogue couriers to rob customers’ homes.
This is exactly why I wouldn’t want to sign up for Amazon Key. While I understand that Amazon will try to make everything as secure as possible, everything can be hacked.
Amazon’s team clarified how they verify their drivers:
Every delivery driver passes a comprehensive background check that is verified by Amazon before they can make in-home deliveries, every delivery is connected to a specific driver, and before we unlock the door for a delivery, Amazon verifies that the correct driver is at the right address, at the intended time.
We have had multiple examples of insufficient background checks in law enforcement circles over the past few years and I seriously doubt Amazon can do better. Someone will always slip through the cracks. And that’s just the people behind the whole operation — the system can still be hacked.
From Amazon’s press release:
Amazon Key allows customers to have their packages securely delivered inside their home without having to be there. Using the Amazon Key app, customers stay in control and can track their delivery with real-time notifications, watch the delivery happening live or review a video of the delivery after it is complete.
No. Way. Why would anyone want to compromise the sanctity of their own home?1
Apple published a support document, detailing some interesting features and functions of Face ID.
Face ID automatically adapts to changes in your appearance, such as wearing cosmetic makeup or growing facial hair. If there is a more significant change in your appearance, like shaving a full beard, Face ID confirms your identity by using your passcode before it updates your face data. Face ID is designed to work with hats, scarves, glasses, contact lenses, and many sunglasses. Furthermore, it’s designed to work indoors, outdoors, and even in total darkness.
Face ID will be a problem for people who use anti-smog masks, which is pretty much most of Asia. This could potentially be solved by enrolling two faces — with and without a mask on — but as far as I understand, it is currently only possible to enroll one face per device. This could change in the future.
Face ID data – including mathematical representations of your face – is encrypted and protected with a key available only to the Secure Enclave.
The probability that a random person in the population could look at your iPhone X and unlock it using Face ID is approximately 1 in 1,000,000 (versus 1 in 50,000 for Touch ID). As an additional protection, Face ID allows only five unsuccessful match attempts before a passcode is required. The statistical probability is different for twins and siblings that look like you and among children under the age of 13, because their distinct facial features may not have fully developed. If you’re concerned about this, we recommend using a passcode to authenticate.
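Apple's figures can be sanity-checked with a little arithmetic — a sketch, assuming attempts are independent and each random face matches with probability 1 in 1,000,000:

```python
# Probability that a random person unlocks your iPhone X with Face ID,
# per Apple's published figure of roughly 1 in 1,000,000 per attempt.
p_single = 1 / 1_000_000

# Face ID allows at most five failed match attempts before requiring
# the passcode, so an opportunistic attacker gets five tries.
p_within_five = 1 - (1 - p_single) ** 5

print(f"one attempt:   {p_single:.6%}")
print(f"five attempts: {p_within_five:.6%}")  # still only about 0.0005%
```

The five-attempt cap matters: without it, repeated tries would let the tiny per-attempt probability accumulate.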
I would be extremely interested in seeing Face ID tested on twins. Luckily, I’m sure someone will attempt to.
Face ID matches against depth information, which isn’t found in print or 2D digital photographs. It’s designed to protect against spoofing by masks or other techniques through the use of sophisticated anti-spoofing neural networks. Face ID is even attention-aware. It recognizes if your eyes are open and looking towards the device. This makes it more difficult for someone to unlock your iPhone without your knowledge (such as when you are sleeping).
I won’t even try spoofing it with a photo, as I successfully did with my Galaxy S8 review unit — I’m pretty sure they’ve got this covered.
Face ID data – including mathematical representations of your face – is encrypted and protected by the Secure Enclave. This data will be refined and updated as you use Face ID to improve your experience, including when you successfully authenticate. Face ID will also update this data when it detects a close match but a passcode is subsequently entered to unlock the device.
Face ID data doesn’t leave your device and is never backed up to iCloud or anywhere else.
Peace of mind.
Even if you don’t enroll in Face ID, the TrueDepth camera intelligently activates to support attention aware features, like dimming the display if you aren’t looking at your iPhone or lowering the volume of alerts if you’re looking at your device. For example, when using Safari, your device will check to determine if you’re looking at your device and turns the screen off if you aren’t. If you don’t want to use these features, you can open Settings > General > Accessibility, and disable Attention Aware Features.
Others have done this before, but it appears that Apple’s approach to implementing this feature is superior — at least it won’t pause playing video when a person looks away.
Within supported apps, you can enable Face ID for authentication. Apps are only notified as to whether the authentication is successful. Apps can’t access Face ID data associated with the enrolled face.
Craig Federighi already mentioned that apps not updated to support Face ID, but which support Touch ID, will work “out-of-the-box”.
The system will not cause any harm to eyes or skin, due to its low output. It’s important to know that the infrared emitters could be damaged during repair or disassembly, so your iPhone should always be serviced by Apple or an authorized service provider. The TrueDepth camera system incorporates tamper-detection features. If tampering is detected, the system may be disabled for safety reasons.
I’m sure some people will complain about their TrueDepth camera being deactivated after an unauthorised screen replacement or some other service work, but I prefer to have peace of mind in this regard.
While I’m still not sold on Face ID — it could turn out to be a hassle — I’m very curious about the attention-aware features. Those could be a really nice perk.
Kate Conger, writing for Gizmodo:
Google has removed roughly 300 apps from its Play Store after security researchers from several internet infrastructure companies discovered that the seemingly harmless apps—offering video players and ringtones, among other features—were secretly hijacking Android devices to provide traffic for large-scale distributed denial of service (DDoS) attacks.
How many more have yet to be discovered?
This is yet another example of third-party libraries, plugins, or add-ons, which do things they aren’t supposed to:
DJI has removed a third-party plugin called JPush, which was introduced in March 2016 for iOS and May 2017 for Android. We implemented the plugin as a way to push notifications when video files are successfully uploaded to DJI’s SkyPixel video sharing platform. JPush assigns a unique JPush ID to each user and informs SkyPixel of this ID when the user chooses to upload a video. After uploading is complete, SkyPixel sends the user’s unique JPush ID back to the JPush server, triggering an “Upload Complete” notification on the user’s DJI GO or DJI GO 4 apps. By using JPush’s third-party plugin, DJI has allowed users to multitask while large video files upload to SkyPixel in the background of the app.
As a third-party company, JPush only needs to send and receive a minimal, narrowly-defined amount of data in order for this function to work properly. Recent work by DJI’s software security team and external researchers has discovered that JPush also collects extraneous packets of data, which include a list of apps installed on the user’s Android device, and sends them to JPush’s server. DJI did not authorize or condone either the collection or transmission of this data, and DJI never accessed this data. JPush has been removed from our apps, and DJI will develop new methods for providing app status updates that better protect our customers’ data.
I still don’t quite understand how and why developers and companies choose to go down this route without checking in detail what the third-party code they use actually does. Laziness, I guess.
John Gruber:
This is why it’s so great that iOS 11’s new easily-invoked Emergency SOS mode requires you to enter your passcode after invoking it. When you’re entering customs or in a situation where you’re worried you’re about to be arrested, you can quickly disable Touch ID without even taking your phone out of your pocket.
Until iOS 11 ships, it’s worth remembering that you’ve always been able to require your iPhone’s passcode to unlock it by powering it off. A freshly powered-on iPhone always requires the passcode to unlock.
This unfortunately does not help at borders, which you should take into account while traveling to countries such as Russia, China, USA, and Australia, amongst others:
In fact, US Customs and Border Protection has long considered US borders and airports a kind of loophole in the Constitution’s Fourth Amendment protections, one that allows them wide latitude to detain travelers and search their devices. For years, they’ve used that opportunity to hold border-crossers on the slightest suspicion, and demand access to their computers and phones with little formal cause or oversight.
Even citizens are far from immune. CBP detainees from journalists to filmmakers to security researchers have all had their devices taken out of their hands by agents.
John Gruber:
First, let’s dispose of the notion that Apple could have chosen to defy the Chinese government and keep the VPN apps in the App Store. Technically, Apple could have done that. But if they had, there would have been consequences. My guess is that the Chinese government would move to block all access to the App Store in China, or even block access to all Apple servers, period. This would effectively render all iOS devices mostly useless. iPhones have been sagging in popularity in China for a few years now — with no access to apps, their popularity would drop to zero. And Apple would have a lot of angry iPhone-owning users in China on its hands.
When I first saw how hard Apple was pushing into China to expand its potential market, my only thought was that they were in it for the money. Quite frankly, I believe they should leave China. What’s more, they should never have entered it. If they choose to remain there, then they should stand by their beliefs — today it’s VPNs, tomorrow it will be asking for access to iMessages or some other nonsense. At this point all Apple can do is “pray they don’t alter the deal further.”
While this is obviously a much deeper subject, Apple being in China with the iPhone always felt wrong to me.
David Gewirtz, for ZDnet:
First things first, iRobot will never sell your data. Our mission is to help you keep a cleaner home and, in time, to help the smart home and the devices in it work better.
iRobot further clarified:
This was a misinterpretation. Angle never said that iRobot would look to sell customer maps or data to other companies. iRobot has not had any conversations with other companies about data transactions, and iRobot will not sell customer data.
This is in response to Reuters’ report from a few days ago.
Jan Wolfe, reporting for Reuters:
Angle told Reuters that iRobot, which made Roomba compatible with Amazon’s Alexa voice assistant in March, could reach a deal to sell its maps to one or more of the Big Three in the next couple of years.
I was recently considering buying a Roomba or one of the copycats on the market but I have now changed my mind. I will gladly pay more for a product that does not make me the… product.
Kelly Fiveash, writing for Ars Technica:
Under the yet-to-be-implemented measures, free and fee-based porn operators—many of which are based abroad—will be required to insert age checkers on their sites in the UK, forcing users to dish up their credit card details to prove that they are 18 or over before being granted access to smut.
Hackers are dry-washing their hands right now.
Jeff Atwood:
But that was 4 years ago. Exactly how secure are our password hashes in the database today? Or 4 years from now, or 10 years from now? We’re building open source software for the long haul, and we need to be sure we are making reasonable decisions that protect everyone. So in the spirit of designing for evil, it’s time to put on our Darth Helmet and play the bad guy – let’s crack our own hashes!
We’re gonna use the biggest, baddest single GPU out there at the moment, the GTX 1080 Ti. As a point of reference, for PBKDF2-HMAC-SHA256 the 1080 achieves 1180 kH/s, whereas the 1080 Ti achieves 1640 kH/s. In a single video card generation the attack hash rate has increased nearly 40 percent. Ponder that.
In the meantime, despite it being 2017, some websites and services still limit users to short passwords. Microsoft’s Outlook is limited to 16 characters as far as I remember and I know of even lower limits.
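For context, the scheme Jeff benchmarks is available in Python's standard library. This is a minimal sketch — the password, salt, and iteration count here are illustrative, not Discourse's actual parameters:

```python
import hashlib
import os

# PBKDF2-HMAC-SHA256: the key-stretching construction from the quote.
# Each of the `iterations` rounds feeds HMAC-SHA256 back into itself,
# so an attacker pays the full cost on every single password guess.
password = b"correct horse battery staple"
salt = os.urandom(16)   # unique per user; defeats precomputed tables
iterations = 100_000    # raising this linearly raises the attacker's cost

derived = hashlib.pbkdf2_hmac("sha256", password, salt, iterations)
print(derived.hex())    # store salt + iterations + this hash, never the password
```

This is also why GPU hash rates matter so much: the only defense against a 40 percent faster card is a proportionally higher iteration count or a memory-hard function.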
Fixed the title. Jeff pastes some alphanumeric examples later in his post, hence my mistake.
Sam Biddle:
The ShadowBrokers, an entity previously confirmed by The Intercept to have leaked authentic malware used by the NSA to attack computers around the world, today released another cache of what appears to be extremely potent (and previously unknown) software capable of breaking into systems running Windows. The software could give nearly anyone with sufficient technical knowledge the ability to wreak havoc on millions of Microsoft users.
Keep your system up-to-date!
James Vincent:
Following last week’s terrorist attack in London, the UK government has renewed a familiar campaign against digital encryption. Echoing criticisms made in 2015 by then prime minister David Cameron after the Charlie Hebdo attacks in Paris, UK home secretary Amber Rudd this weekend described the government’s inability to read messages on end-to-end encrypted messaging apps as “completely unacceptable.”
And so it begins… again. (Sigh.)
In an email, sent to their clients, Etihad informs:
Following a directive from US authorities, we have been advised that guests travelling to the United States from Abu Dhabi International Airport are not permitted to carry electronic devices larger than a cell phone or smart phone on board.
Mobile phones and medical devices are permitted but larger items including laptops, tablets, cameras and e-readers will need to be placed into baggage that is checked in. This must be done at the start of your journey. The ban does not affect flights leaving from the US towards Abu Dhabi and beyond.
These new rules come into effect for flights to the US via Abu Dhabi, starting 25 March.
Any guests travelling to the UK via Abu Dhabi are not affected by the directive from the UK authorities.
BBC UK:
The United States and United Kingdom have announced that laptops, e-readers and almost any other electronic device that is not a phone will be banned from cabin luggage on some flights.
The US rule only applies to 10 airports, but one of those is the world’s busiest international airport – Dubai International.
For more details, including a full list of countries and airports, see the original piece on BBC’s site.
Sam Thielman:
US authorities have required airlines from 13 nations to forbid passengers from carrying any electronic or electrical device larger than a cellphone.
The new edict was distributed in an email described as “confidential” from the US Transportation Security Administration (TSA) on Monday.
The requirement forbids passengers from bringing laptops, iPads, Kindles and even cameras larger than mobile phones into the cabin. All such devices must be checked.
Devices with lithium-ion batteries aren’t allowed in checked baggage as far as I know. So this is basically a ban on close to everything apart from cellphones. I know I won’t be visiting the US anytime soon, even though the ban will most probably not include my home country — I can’t bring myself to condone this sort of behaviour.
Google Security Blog:
Today, 10 years after SHA-1 was first introduced, we are announcing the first practical technique for generating a collision. This represents the culmination of two years of research that sprung from a collaboration between the CWI Institute in Amsterdam and Google. We’ve summarized how we went about generating a collision below. As a proof of the attack, we are releasing two PDFs that have identical SHA-1 hashes but different content.
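The practical takeaway is to migrate to SHA-256, which is a one-line change wherever a digest is computed — a sketch using Python's hashlib (the message is arbitrary; the colliding SHAttered PDFs themselves aren't reproduced here):

```python
import hashlib

message = b"any document you need to fingerprint"

# SHA-1: 160-bit digest. With a practical collision now public, it is
# no longer safe for signatures, certificates, or integrity checks.
print(hashlib.sha1(message).hexdigest())

# SHA-256: 256-bit digest from the SHA-2 family, with no known
# collisions -- the recommended drop-in replacement.
print(hashlib.sha256(message).hexdigest())
```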
Quincy Larson:
It’s only a matter of time before downloading the contents of people’s phones becomes a standard procedure for entering every country. This already happens in Canada. And you can bet that countries like China and Russia aren’t far behind (…)
When you travel internationally, you should leave your mobile phone and laptop at home (…)
Is all this inconvenient? Absolutely. But it’s the only sane course of action when you consider the gravity of your data falling into the wrong hands.
If you bother locking your doors at night, you should bother securing your phone’s data during international travel.
At this point in our history, seeing what’s happening in the US, I would definitely not bring my iPhone with me. A cheap, secondary smartphone, which I can configure with what I need after I’ve been let in, would be more than sufficient.
Jonathan Zdziarski:
My inbox has been lighting up with questions about Confide, after it was allegedly found to have been used by staffers at the White House. I wish I had all of the free time that reporters think I have (I’d be so happy, living life as a broke beach bum). I did spend a little bit of time, however reverse engineering the binary and doing a simple forensic examination of it. Here’s my “literature in a rush” version.
Jonathan Zdziarski:
(…) This article is a brief how-to on using Apple’s Configurator utility to lock your device down so that no other devices can pair with it, even if you leave your device unlocked, or are compelled into unlocking it yourself with a passcode or a fingerprint. By pair-locking your device, you’re effectively disabling every logical forensics tool on the market by preventing it from talking to your iOS device, at least without first being able to undo this lock with pairing records from your desktop machine. This is a great technique for protecting your device from nosy coworkers, or cops in some states that have started grabbing your call history at traffic stops.
Metropolitan Police:
Count 3: Preparation for terrorism. Between 31 December 2015 and 22 September 2016 Samata Ullah, with the intention of assisting another or others to commit acts of terrorism, engaged in conduct in preparation for giving effect to his intention namely, by researching an encryption programme, developing an encrypted version of his blog site and publishing the instructions around the use of programme on his blog site. Contrary to section 5 Terrorism Act 2006.
I can understand the other charges, but how is using HTTPS a criminal offence?
Rick Falkvinge has a few interesting comments on the subject:
(…) four years ago, I predicted that the UK won’t just jail you for encryption, but for carrying astronomical noise, too. It’s already a crime to not give up keys to an encrypted document in the UK (effectively making encryption illegal), but it’s worse than that – it’s a five-years-in-prison offense to not give up the keys to something that appears encrypted to law enforcement, but may not actually be. In other words, carrying astronomical noise is a jailable offense, because it is indistinguishable from something encrypted, unless you can pull the documents the police claim are hidden in the radio noise from a magic hat. This case takes the UK significantly closer to such a reality, with charging a person for terrorism (!) merely for following privacy best practices.
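Falkvinge's point — that random noise is indistinguishable from ciphertext — is easy to demonstrate: well-encrypted data has near-maximal byte entropy, just like noise. A sketch estimating Shannon entropy per byte (the sample strings are arbitrary):

```python
import collections
import math
import os

def entropy_per_byte(data: bytes) -> float:
    """Estimate Shannon entropy in bits per byte from byte frequencies."""
    counts = collections.Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

english = b"the quick brown fox jumps over the lazy dog " * 200
noise = os.urandom(len(english))  # stands in for either ciphertext or plain random noise

print(round(entropy_per_byte(english), 2))  # low: natural language is predictable
print(round(entropy_per_byte(noise), 2))    # near the 8-bit maximum -- to an observer,
                                            # statistically identical to encrypted data
```

Since there is no statistical test that separates the two, a law that punishes "appearing encrypted" necessarily punishes carrying noise.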
Jo Becker, Adam Goldman, Michael S. Schmidt and Matt Apuzzo:
The F.B.I. secretly arrested a National Security Agency contractor in recent weeks and is investigating whether he stole and disclosed highly classified computer code developed to hack into the networks of foreign governments, according to several senior law enforcement and intelligence officials.
The theft raises the embarrassing prospect that for the second time in three years, an insider has managed to steal highly damaging secret information from the N.S.A. In 2013, Edward J. Snowden, who was also a contractor for the agency, took a vast trove of documents that were later passed to journalists, exposing N.S.A. surveillance programs in the United States and abroad.
What if Harold T. Martin III had also stolen the ‘golden keys’ to backdoors of various tech companies’ infrastructures? How long would it take for anyone and everyone in the world to get a peek into the lives of people using those services?
Andrea Peterson for The Washington Post reporting on Stamos’ (Yahoo’s Chief Information Security Officer) and Rogers’ (director of the National Security Agency) debate:
“If we’re going to build defects/backdoors or golden master keys for the U.S. government, do you believe we should do so — we have about 1.3 billion users around the world — should we do for the Chinese government, the Russian government, the Saudi Arabian government, the Israeli government, the French government?” Stamos asked.
“So, I’m not gonna… I mean, the way you framed the question isn’t designed to elicit a response,” Rogers replied.
“Well, do you believe we should build backdoors for other countries?” Stamos asked again.
“My position is — hey look, I think that we’re lying that this isn’t technically feasible. Now, it needs to be done within a framework. I’m the first to acknowledge that. You don’t want the FBI and you don’t want the NSA unilaterally deciding, so, what are we going to access and what are we not going to access? That shouldn’t be for us. I just believe that this is achievable. We’ll have to work our way through it. And I’m the first to acknowledge there are international implications. I think we can work our way through this,” Rogers answered.
“So you do believe then, that we should build those for other countries if they pass laws?” Stamos asked a third time.
“I think we can work our way through this,” Rogers replied.
“I’m sure the Chinese and Russians are going to have the same opinion,” Stamos said.
I truly wonder what Rogers would think if he wasn’t the director of the NSA. Would he agree to all the snooping, reduced security, and compromised privacy, if he were just a civilian?
Joseph Menn:
Yahoo Inc last year secretly built a custom software program to search all of its customers’ incoming emails for specific information provided by U.S. intelligence officials, according to people familiar with the matter.
The company complied with a classified U.S. government demand, scanning hundreds of millions of Yahoo Mail accounts at the behest of the National Security Agency or FBI, said three former employees and a fourth person apprised of the events.
Some surveillance experts said this represents the first case to surface of a U.S. Internet company agreeing to an intelligence agency’s request by searching all arriving messages, as opposed to examining stored messages or scanning a small number of accounts in real time.
While Apple, Google, and others want to fight these types of government demands, Yahoo rolls over and helps them out. Completely unacceptable.