Apple has released iOS 14, with a bucketload of new and improved features and a handful of new privacy and security ones.
New privacy and security features in iOS 14
The new iOS will tell you when an app is using your camera or microphone
It will show an indicator dot in the top right corner of the screen: green when the camera (or camera and microphone) is in use, orange when only the microphone is.
The downside is that it’s fairly small and you might miss it if other things are happening on the screen. The upside is that you can check which app most recently used your camera or microphone via the Control Center.
Of course, you can deny access to your camera and microphone to any app through the Privacy settings.
You can share your approximate location with apps instead of your precise one
Go to Settings > Privacy > Location Services, and you can configure, for each app, whether it may access your device’s location “only while the app is in use”, “always”, “never”, or whether the app must ask you for permission each time you run it (in which case you also get the option to grant access to your location “Only once”).
When you allow location access for an app, you’ll get the option to provide your precise location or leave it to the app to determine your approximate location (the latter is good enough for apps that show local news or weather).
You can choose to share just some photos with apps
Under Privacy > Photos you can see which apps have requested access to your photos and you can choose to restrict each app’s access just to selected photos or photo albums (or none).
You can limit tracking
Your phone will present a different, randomized MAC address for each Wi-Fi network you connect to. This prevents network operators and advertisers from tracking your movements (i.e., from seeing when and where you connect to a network), and the option is on by default.
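Conceptually, a randomized (“private”) MAC address is just a fresh, locally administered hardware address. A minimal Python sketch of the idea follows; it is a generic illustration, not Apple’s actual implementation (which also keeps the address stable per network):

```python
import secrets

def random_private_mac() -> str:
    """Generate a randomized, locally administered unicast MAC address.

    Per IEEE 802 conventions, the first octet of such an address has
    the locally-administered bit (0x02) set and the multicast bit
    (0x01) clear -- the same shape iOS's private Wi-Fi addresses use.
    """
    octets = bytearray(secrets.token_bytes(6))
    octets[0] = (octets[0] | 0x02) & 0xFE  # set local bit, clear multicast bit
    return ":".join(f"{b:02x}" for b in octets)

print(random_private_mac())  # e.g. "3a:5f:10:c2:9e:41"
```

Because the address carries no stable identifier, an observer who logs MAC addresses at different access points cannot link the sightings to one device.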
In Settings > Privacy > Tracking, you can choose to not allow apps to send you a request to track you. If you do that, “any app that attempts to ask you for your permission will be blocked from asking and automatically informed that you have requested not to be tracked. In addition, all apps, other than those that you have previously given permission to track, will be blocked from accessing the device’s Advertising Identifier.”
If you allow tracking, tracking permissions can also be controlled on a per-app basis.
It has to be pointed out, though, that these app tracking options will start working as intended in early 2021, when these privacy controls become mandatory for developers.
“We want to give developers the time they need to make the necessary changes, and as a result, the requirement to use this tracking permission will go into effect early next year,” Apple explained.
Facebook complained earlier this year that these new privacy requirements would have a significant negative impact on its advertising business.
You will be able to see a summary of an app’s privacy practices before you download it from the App Store
You still can’t see these because app developers have yet to roll them out, but when they are ready, you’ll be able to peruse them via an “App Privacy” section on each app’s listing in the store.
You’ll be able to see which tracking cookies have been blocked
The Safari mobile browser has been updated with a Privacy Report, which lists all the cross-site tracking cookies it has blocked in the last 30 days, provided you have turned on Prevent Cross-Site Tracking in Safari’s Privacy and Security settings.
The report is accessible from the AA menu in the browser’s address bar.
You’ll be notified if a password you stored in the iCloud Keychain has been spotted in a known data breach
To turn this option on, go to Settings > Passwords > Security Recommendations and toggle on Detect Compromised Passwords. For the secure password monitoring to work, iCloud Keychain has to be enabled.
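Apple has not published the exact protocol behind this feature, but breach-monitoring services generally compare password *hashes* rather than plaintext, so the password itself never has to leave the device in readable form. A toy sketch of that general idea (the breach corpus and hash choice here are illustrative assumptions, not Apple’s scheme):

```python
import hashlib

# Toy "breach corpus": in reality this would be a large service-side
# database of hashes of passwords exposed in known data breaches.
BREACHED_HASHES = {
    hashlib.sha1(pw.encode()).hexdigest().upper()
    for pw in ("password", "123456", "qwerty")
}

def is_compromised(password: str) -> bool:
    """Check a password against the breach corpus by hash, so the
    plaintext never needs to be transmitted or stored."""
    digest = hashlib.sha1(password.encode()).hexdigest().upper()
    return digest in BREACHED_HASHES

print(is_compromised("123456"))                        # True
print(is_compromised("correct horse battery staple"))  # False
```

Real-world implementations add further privacy protections on top of plain hash comparison (for example, querying only a short hash prefix so the service never learns which password was checked).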
In iOS 14, Apple has also fixed a number of security vulnerabilities, including:
- A vulnerability in the integrated development environment (IDE) device support component that could allow a remote authenticated attacker to execute arbitrary code on a paired device during a debug session over the network (CVE-2020-9992)
- A logic issue affecting the sandbox that could allow a malicious application to access restricted files (CVE-2020-9968)
New Bluetooth Vulnerability
There’s a new unpatched Bluetooth vulnerability:
The issue is with a protocol called Cross-Transport Key Derivation (or CTKD, for short). When, say, an iPhone is getting ready to pair up with a Bluetooth-powered device, CTKD’s role is to set up two separate authentication keys for that phone: one for a “Bluetooth Low Energy” device, and one for a device using what’s known as the “Basic Rate/Enhanced Data Rate” standard. Different devices require different amounts of data — and battery power — from a phone. Being able to toggle between the standards needed for Bluetooth devices that take a ton of data (like a Chromecast), and those that require a bit less (like a smartwatch) is more efficient. Incidentally, it might also be less secure.
According to the researchers, if a phone supports both of those standards but doesn’t require some sort of authentication or permission on the user’s end, an attacker within Bluetooth range can use its CTKD connection to derive their own competing key. With that connection, according to the researchers, this sort of ersatz authentication can also allow bad actors to weaken the encryption those keys use in the first place, which can open the device’s owner up to more attacks further down the road, or let attackers mount “man in the middle” style attacks that snoop on unprotected data being sent by the phone’s apps and services.
Patches are not immediately available at the time of writing. The only way to protect against BLURtooth attacks is to control the environment in which Bluetooth devices are paired, in order to prevent man-in-the-middle attacks, or pairings with rogue devices carried out via social engineering (tricking the human operator).
However, patches are expected at some point. When they arrive, they will most likely be delivered as firmware or operating system updates for Bluetooth-capable devices.
The timeline for these updates is, for the moment, unclear: device vendors and OS makers usually work on different schedules, and some may not prioritize security patches as much as others. The number of vulnerable devices is also unclear and hard to quantify.
Many Bluetooth devices can’t be patched.
Final note: this seems to be another example of simultaneous discovery:
According to the Bluetooth SIG, the BLURtooth attack was discovered independently by two groups of academics from the École Polytechnique Fédérale de Lausanne (EPFL) and Purdue University.
Two years ago, Apple abandoned its plan to encrypt iPhone backups in the iCloud in such a way that makes it impossible for it (or law enforcement) to decrypt the contents, a Reuters report claimed on Tuesday.
Based on information received by multiple unnamed FBI and Apple sources, the report says that the decision was made after Apple shared its plan for end-to-end encrypted iCloud backups with the FBI and the FBI objected to it.
According to the sources, Apple:
- Didn’t want to be attacked for or be seen as protecting criminals
- Was convinced by the FBI’s arguments (i.e., that being able to access the contents of iPhone backups in the iCloud is crucial to the success of thousands of investigations)
- Didn’t want to get into another court battle with the FBI over the matter or be used as an excuse for new legislation against encryption.
End-to-end encrypted iCloud backups are not available, but…
Apple and the FBI declined to comment on these claims. Also, more importantly, and despite how it might seem initially, “Reuters could not determine why exactly Apple dropped the plan.”
Whether the decision was made entirely or partly because of the FBI’s objections is, therefore, unknown. One of the Reuters sources – a former Apple employee – said it was possible the encryption project was dropped for other reasons (e.g., to prevent customers being locked out of their backups because they forgot their passphrase).
Daring Fireball publisher John Gruber pointed out the same thing, and said that he “would find it less surprising to know that Apple acquiesced to the FBI’s request not to allow encrypted iCloud backups than that Apple briefed the FBI about such a plan before it was put in place.”
If you want to keep your backups for your eyes only
Whether Apple has canceled its plan to offer encrypted iCloud backups for good or just temporarily, users need to be aware that some of the information they back up to iCloud can be decrypted by Apple and, consequently, made available to law enforcement.
The data that is encrypted end-to-end (i.e., is protected with a key derived from information unique to the user’s device and their device passcode) includes things like the iCloud Keychain (which includes all of user’s saved accounts and passwords), Wi-Fi passwords and payment information.
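“A key derived from information unique to the user’s device and their device passcode” means, in general terms, running a key-derivation function over both inputs. Here is a hedged, generic sketch using PBKDF2; Apple’s actual construction is not public, and the salt, iteration count, and inputs below are illustrative assumptions only:

```python
import hashlib

def derive_key(passcode: str, device_salt: bytes) -> bytes:
    """Derive a 256-bit key from a device passcode and a per-device
    salt using PBKDF2-HMAC-SHA256. A generic illustration of
    passcode-based key derivation, not Apple's actual scheme."""
    return hashlib.pbkdf2_hmac(
        "sha256",
        passcode.encode(),
        device_salt,
        iterations=200_000,  # arbitrary work factor for this sketch
        dklen=32,
    )

key = derive_key("123456", b"unique-device-identifier")
print(key.hex())
```

The important property is that the key never exists server-side: without the device secret and the passcode, neither the provider nor anyone who compels it can reconstruct the key to decrypt the data.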
Data that is encrypted in transit and on the server, but with a key known to Apple, includes the device’s backup, Safari history and bookmarks, photos, calendars, contacts, voice memos, and more.
And, while Messages in iCloud does use end-to-end encryption, if the user has iCloud Backup turned on, their backup includes a copy of the key protecting their Messages (so they can recover them if they lose access to iCloud Keychain and their trusted devices). That means that law enforcement can also access them, if Apple cooperates.
In short: if you use an iPhone and want all of your data to remain private and encrypted in a way that makes it impossible (or very, very difficult) for anyone else to decrypt, don’t back it up to iCloud. Instead, opt for an encrypted local backup on a Mac or PC through iTunes, choose a strong passphrase, and make sure to remember it.
Google users who opt for the Advanced Protection Program (APP) to secure their accounts are now able to use their iPhone as a security key.
About Google’s Advanced Protection Program
Google introduced the Advanced Protection Program in late 2017 to help high-risk users – journalists, human rights activists, IT admins, executives, etc. – keep their Google accounts safe from targeted attacks.
APP is available to both consumer (Google Account) and enterprise users (G Suite).
It initially allowed users to make their accounts more secure by requiring them to have and use a physical security key to provide additional user verification during the login process.
In May 2019, Google made it possible to replace the physical security key with one’s Android device. Now, finally, iPhone and iPad users can take advantage of that option, too.
Using iPhones for APP
Google considers security keys to be the strongest protection against account takeover attempts, whether they are automated bot attacks, bulk phishing attacks, or extremely targeted (and tailored) attacks.
Making security more convenient is key to improving the adoption of security practices. By offering Android and iPhone/iPad users the option to use their devices as a security key, Google is making it easier for users to enroll into APP.
Let’s face it: we take our mobile phones with us everywhere, and most of us are very conscientious about keeping the battery charged. Physical security keys, on the other hand:
- Are pricy for some
- May not be available for purchase to all who need them, and
- Are a piece of hardware that some might not want to have to keep track of and lug around all the time.
To use one’s iPhone for APP, users must have an iPhone running iOS 10+, with the latest version of Google’s Smart Lock app installed and Bluetooth enabled.
The device through which they are signing into their account has to have the latest version of a compatible browser (e.g., Chrome), the latest version of a compatible OS (e.g., Chrome OS, Mac OS, or Windows 10), and Bluetooth enabled.
Google has provided this helpful guide on how to set up one’s phone’s built-in security key and use it.