If there is one thing that is certain, it’s this: for every useful technology invented to benefit us, there is a corresponding negative use that can and will be exploited. After the initial dopamine rush around Apple’s AirTags had worn off, people started waking up to the negative implications of a small, easy-to-conceal wireless tracking device that rides on one of the largest device networks in the world. Apple’s “Find My…” network is too useful not to be exploited, and the less ethical are already doing so.
What this means for you:
Apple’s AirTags were created to track items that could be easily lost or stolen, and were ostensibly made inconspicuous so that they wouldn’t be unsightly and thieves couldn’t easily find and discard them. Once reports started flowing in of “less orthodox” uses of AirTags, Apple immediately tried to get out in front of the problem by letting everyone know that AirTags carry unique, embedded serial numbers and their use is tied to an Apple account – information Apple will surrender to law enforcement in a criminal investigation. But they glossed over something that more inventive hackers latched onto: what’s to stop someone from creating a “cloned” AirTag that simply bypasses Apple’s security measures? At the moment, nothing. Someone has already done so, and you can assume that Pandora’s box is not going to be closed any time soon without significant intervention from Apple.
Until that happens, you should get caught up on Apple’s lengthy advice on detecting and finding unwanted trackers. The article goes into great detail for Apple device users, so if you are an iPhone user, finding an unwanted Apple-made AirTag should be pretty straightforward (if a wee bit unsettling). For the rest of us using Android devices, Apple has released an app called Tracker Detect (watch out for copy-cat apps!) that has to be activated manually. It’s not nearly as useful as its iOS counterpart, but at least they tried. If you’d like something a bit more robust and not funded by Apple, you can try AirGuard, developed by a research team at the German university TU Darmstadt. I’ve tried both apps, and while they appear to do no harm (other than possibly draining my battery faster), I can’t really verify that they work, as I apparently don’t have any unwanted trackers near me. Yay? Either way, if you suspect you are being digitally stalked, share your suspicions with your loved ones and the authorities, and get familiar with this site and its resources immediately!
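For the technically curious: detection apps like AirGuard work by listening for the Bluetooth LE advertisements that Find My accessories broadcast. Below is a minimal Python sketch of that idea using the cross-platform bleak library; the 0x12 payload-type byte comes from the TU Darmstadt team’s published research and should be treated as an assumption, not gospel:

```python
import asyncio
from bleak import BleakScanner

APPLE_COMPANY_ID = 0x004C    # Bluetooth SIG company identifier for Apple
OFFLINE_FINDING_TYPE = 0x12  # payload type per TU Darmstadt's research (assumption)

def on_advertisement(device, adv):
    # Find My accessories broadcast manufacturer-specific data under
    # Apple's company ID; flag anything that looks like an
    # offline-finding ("lost mode") advertisement.
    payload = adv.manufacturer_data.get(APPLE_COMPANY_ID)
    if payload and payload[0] == OFFLINE_FINDING_TYPE:
        print(f"Possible Find My accessory: {device.address}  RSSI: {adv.rssi}")

async def main():
    scanner = BleakScanner(detection_callback=on_advertisement)
    await scanner.start()
    await asyncio.sleep(30)  # listen for 30 seconds
    await scanner.stop()

asyncio.run(main())
```

A real detector has to do far more than this – AirTags rotate their Bluetooth addresses periodically, so apps like AirGuard correlate sightings over time and distance before raising an alarm.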
Image by Thomas Wolter from Pixabay
We’ll keep it short and sweet this week. Earlier this year, an advanced form of spyware was discovered on a small group of Middle Eastern journalists’ iPhones and eventually traced back to an Israeli company called NSO Group. Purportedly designed for law enforcement agencies to combat terrorism, the spyware, known as Pegasus, appears to have been used by one or more government agencies to spy on a select group of iPhone users. At the time, it was unclear how the exploit was being deployed, so no defense or patch could be provided to stop Pegasus from being installed. After months of research, Canadian internet watchdog group Citizen Lab uncovered the flaw and announced it this week, timed to coincide with a security update from Apple that should be applied immediately to all iOS and macOS devices.
What this means for you:
If you have a late-model iPhone, Mac computer, Apple Watch or iPad, check the settings immediately for any available updates, and apply them as soon as you can get to a solid internet connection with your device connected to a power source. The iOS version you are looking for is 14.8; on MacBooks and iMacs it will be macOS 11.6.
- Update your iPhone, iPad, or iPod touch – Apple Support
- Update your Apple Watch – Apple Support
- Update macOS on Mac – Apple Support
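For the Terminal-inclined Mac users out there, here’s a quick Python sketch that wraps macOS’s built-in `sw_vers` and `softwareupdate` tools to confirm your version and list anything still pending (the same commands work directly in the shell):

```python
import platform
import subprocess

# Only meaningful on a Mac; iPhones and iPads update via Settings -> General.
if platform.system() == "Darwin":
    version = subprocess.run(
        ["sw_vers", "-productVersion"], capture_output=True, text=True
    ).stdout.strip()
    print(f"Current macOS version: {version}")  # you want 11.6 or later

    # List (but don't install) any pending Apple updates.
    # To install everything, run: sudo softwareupdate --install --all
    subprocess.run(["softwareupdate", "--list"])
```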
As of this writing, the actual number of people impacted by this flaw and Pegasus is very small, but now that the flaw has been revealed, there is a real possibility that others besides NSO Group will attempt to take advantage of the window that typically stays open while people get patched, which can be days or even weeks. While Pegasus is designed for spying, there will surely be other malware released to exploit this flaw, some of it more straightforward in doing harm. Don’t be one of the ones caught sleeping on this update. Get patched now!
By the time you read this, Apple will be on day two of quarantining group calls in its video chat app, FaceTime. Why? Oh, how about a nasty eavesdropping bug that would allow callers to listen in on recipients before they pick up the call? Not necessarily earth-shaking in terms of espionage or cybercrime, but potentially embarrassing or even relationship-destroying, especially for an app that is heavily used for non-business calls. To add to everyone’s embarrassment, discovery of this bug is credited to a young teenager trying to set up a group chat with his Fortnite friends. Thanks, Fortnite?
What this means for you:
Probably not much, unless you use FaceTime for group chats, which are now unavailable until Apple fixes the issue. At the moment, there is no firm ETA on the fix, which “…will be released in a software update later this week,” per Apple’s official statement. Unfortunately, this isn’t the first security bug for FaceTime’s group chat feature, which is not even a full year old. Last fall a security researcher was able to exploit a flaw in group chats to bypass the lock screen and view a user’s entire address book. Thanks to the internet and the always-connected nature of iOS devices, bugs like these are typically fixed quickly, and unlike Android phones, which suffer from a fragmented operating system ecosystem and inconsistent update policies controlled by competing manufacturers, Apple is able to react quickly to these situations. Score one for the fruit company!
With the hotly anticipated announcement of the next iPhone right around the corner, some parts of the technology media are once again navel-gazing about the world’s continuing love affair with Apple’s popular smartphone. It’s easy to see why so many are devoted consumers: the iPhone is a stellar example of a beautiful device that is highly functional. Long gone are the days when using high-tech tools was the sole domain of the unfashionably nerdy or productivity-obsessed workaholics, and there is no doubt who we have to thank for this change. But the eternal question is raised again: are we sacrificing function for form? Has the iPhone become the stiletto heel of mobile devices?
Has Woo gone off the deep end?
Before you get the pitchforks and torches out, let me be clear: I’ve got nothing against stiletto heels. They are only one example in a sea of thousands that illustrate the “form over function” ideal, but they make for a handy and familiar analogy. Over time, the iPhone has become thinner because, let’s face it, chunky phones just aren’t “sexy” in today’s world. This has led to some interesting trends, including antenna-gate, bend-gate, Touch disease, and the telling statistics that up to 1 in 4 iPhones will suffer a cracked screen during their functional life, and that as many as 15% of all iPhone users are walking around with cracked screens rather than replacing them. What’s troubling is that an affordable, shatter-proof screen is readily available: use plastic instead of glass! But time and again, market research and testing show that people don’t want plastic because it feels cheap, and right now, iPhones (and smartphones in general) are still very much a status symbol. Not that other smartphones aren’t seeing a similar trend in flawed design, but Apple is an easy, high-profile target that continues to market on its aesthetics, and like a purebred pet with a predisposition to genetic health issues, the iPhone could be evolving into a fragile, unsustainable extreme. How many more “flaw”-gates will people suffer through before demanding a more functional, practical smartphone? I still see a lot of stiletto heels out there.
For those of you who haven’t seen the Amazon Echo in action yet, it can be quite an eye-opener. We are quickly converging on an environment that not long ago was considered science fiction. The Echo can quietly sit in the corner of your room, waiting for anyone in the family to give it a command, whether it’s to play some music, check the weather or order something from (surprise, surprise!) Amazon. It’s also a perfect example of technology racing ahead of the law, and unlike the ongoing controversy around email and the ECPA, the stakes are much higher because of who is allegedly at risk: our children. I’ll admit this may seem a bit melodramatic, but the Guardian US isn’t wrong when pointing out that the Echo and other products like it (think Apple’s Siri and Google Now) might actually be in violation of COPPA. For those of you in the room who are not lawyers, that’s the Children’s Online Privacy Protection Act of 1998, which, among many things, prohibits the recording and storage of a child’s voice without the explicit permission of their parents or legal guardian.
What this means for you:
Even though I am a parent of a young child whom COPPA was enacted to protect, it hasn’t been too hard to suppress the urge to disconnect and discard every voice-activated, internet-connected device we own (which would be quite a few, including my daughter’s precious iPad). As with many technology items that dance on the edge of privacy invasion, I weigh the convenience and value they bring against the loss of privacy and security they inherently pose. I do see the problems technology like this presents: thousands (possibly millions) of parents set down products like the Echo and Siri right in front of their children precisely because using them is simple and intuitive, and in the case of the Echo, they are actually designed for use by everyone in the family. However, most people probably don’t realize that today’s voice recognition technology relies on pushing recordings of voice commands to the cloud, where they are cataloged and processed to improve algorithms. Not only do those recordings store our children’s voices, they are also thick with metadata like marketing preferences (“Alexa, how much does that toy cost?”) and location data (“Alexa, where is the nearest ice cream shop?”). I’m pretty sure none of us gave explicit permission to Apple before allowing our kids to use Siri on their iPads and iPhones. Under a strict interpretation of COPPA, Apple, Amazon and Google (as well as many others) have an FTC violation on their hands that could cost them as much as $16,000 per incident.
As for your Echo (or smartphone or tablet) – only you can judge whether it’s an actual risk to your child. For the moment, the law is unclear, and knowing our government, likely to remain so long after the buying public makes up its own mind.
During its heyday, Apple’s QuickTime software was arguably the king of digital video. Though there were many competitors (remember RealVideo?), Apple’s codec reigned supreme in both editing and playback for many years, making Apple’s Mac computers the de facto standard in high-end digital video editing. Not unwisely, Apple realized the untapped market potential on the Windows side of the fence, released a version of QuickTime for Windows 3.1 in 1996, and steadily iterated on the platform through last year, though its use has declined steadily since the rise of streaming web video. Apparently usage has fallen off so dramatically that Apple recently announced it was no longer supporting the Windows version of QuickTime, hot on the heels of the announcement by US-CERT that the latest version of QuickTime for Windows had two significant zero-day vulnerabilities.
What this means for you:
Because I know you, I won’t bore you with how the zero-days work; just know they are serious enough for the Department of Homeland Security to issue an alert. It’s not likely you will have Apple’s QuickTime software installed on your late-model business computer, but if you own an older computer at home (5-6 years old) and you’ve installed iTunes on it, you probably have QuickTime installed, as it was bundled with iTunes as recently as 2011. If you happen to be in the relatively narrow demographic of digital video editors using Windows and Adobe’s Creative Cloud suite, you might also have QuickTime installed, as it’s a requirement for certain video editing formats.
Either way, if you have it installed, remove QuickTime immediately. Apple has no plans to patch the vulnerabilities, and even though there are no known exploits in the wild as I write this, you can bet the high-profile exposure has already triggered a wave of malicious programming. The easiest way to determine whether QuickTime is installed is to go to Control Panel -> Programs & Features -> Uninstall Programs and scan through the list for “QuickTime” (not “Apple QuickTime,” as you might think). On older OSes you might have to look in Control Panel -> Add/Remove Programs. While you are there, you can look for other old programs you don’t use anymore and remove them in the spirit of spring cleaning.
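If you’d rather not dig through menus, here’s a small Python sketch that checks the same Uninstall registry keys the Programs & Features list is built from (run it with a stock Python install on Windows):

```python
import winreg

# Programs & Features is populated from these registry keys; the second
# covers 32-bit programs (like QuickTime) on 64-bit Windows.
UNINSTALL_KEYS = [
    r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall",
    r"SOFTWARE\WOW6432Node\Microsoft\Windows\CurrentVersion\Uninstall",
]

def find_quicktime():
    hits = []
    for key_path in UNINSTALL_KEYS:
        try:
            root = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path)
        except OSError:
            continue  # key doesn't exist on 32-bit Windows
        for i in range(winreg.QueryInfoKey(root)[0]):
            try:
                sub = winreg.OpenKey(root, winreg.EnumKey(root, i))
                name = winreg.QueryValueEx(sub, "DisplayName")[0]
                if "quicktime" in name.lower():
                    hits.append(name)
            except OSError:
                continue  # entry without a DisplayName value
    return hits

print(find_quicktime() or "No QuickTime entries found - you're clear!")
```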
In the latest dramatic chapter of the ongoing encryption battle between the FBI and Apple, the feds have admitted that they worsened their chances of ever finding out the contents of the San Bernardino shooter’s iPhone when they reset its associated iCloud password in a misguided attempt to access the locked device. According to Apple, prior to that reset, the FBI may have been able to gain access to the device without Apple having to provide a controversial backdoor to its otherwise very secure smartphones. On top of the FBI’s blunder and lack of understanding of Apple’s iPhone security, it’s clear that several members of the House Judiciary Committee leading the hearings on this controversy are also poorly versed in how smartphone security works. To be fair to everyone, Apple’s iCloud system is arcane even to me, so it’s easy to see how someone unfamiliar with the system could make this mistake.
What this means for you:
Making fun of government officials being ignorant about high-tech subjects is like shooting fish in a barrel. The “series of tubes” analogy used by Senator Ted Stevens is just one of many examples of US lawmakers struggling to understand admittedly complex technologies like the internet and encryption. Back then (10 years ago!) it might have been acceptable to dismiss their naivety as understandable – after all, they are congresspeople, not IT consultants. But now, in an increasingly technology-permeated society, their ignorance or willful disregard of technology can lead to very bad decisions with widespread and long-lasting consequences. This is just as applicable to your personal and workplace tech. While it’s impossible to be an expert on everything, if you rely on technology for critical business operations, you should have more than a basic understanding of how to turn it on and off. At minimum you should know what risks come with that technology, and if you cannot claim to be an expert in the technology in question, you should always consult with an experienced technology professional before making game-changing decisions.
Image courtesy of Stuart Miles at FreeDigitalPhotos.net
Apple made a big splash last week when CEO Tim Cook published an open letter in response to the FBI’s request and subsequent court order to hack the iPhone of the primary assailant in December 2015’s San Bernardino mass shooting. As one might expect, Mr. Cook basically told the government that Apple would not comply, and fortunately, Apple might be the one company that can afford to fight this battle in the courts. Though the tech industry has typically maintained a similar stance on device encryption, even the most staunch champions of digital privacy, such as Google and Twitter, have had surprisingly muted responses to the growing battle. Also revealing is a recent Pew poll suggesting that while the tech industry may be largely united on device encryption and government backdoors, the American public isn’t quite sure what to think about this complex issue.
What this means for you:
Late-model iPhones ship with encryption enabled by default, and as long as you enable some form of authentication on your device, the data on that device will only be accessible if you unlock it. Law enforcement can’t break the encryption, and Apple, by its own admission, cannot decrypt your phone’s contents without the proper authentication, even if the phone’s owner asks them to do so. If someone tries too many times to guess your PIN, the device can wipe itself automatically – no intervention from Apple or your carrier is required. The FBI is demanding Apple create a way for them to unlock the iPhone of the San Bernardino shooter, which, if Apple were to actually accomplish such a feat, could theoretically allow anyone with possession of this backdoor to decrypt any iPhone protected by similar technology. Like the atomic bomb, the development of this backdoor cannot be unmade, nor will it remain only in the hands of the “righteous”. While the data on the San Bernardino shooter’s phone may prove useful in providing some closure to the incident and may even help further other domestic terror investigations, it’s easy to see that the FBI means for this case to set a precedent that will give them unfettered access to an area that has traditionally been protected, both by law and by technology.
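To make the “Apple can’t decrypt it either” point concrete, here’s a toy Python sketch – emphatically not Apple’s actual Secure Enclave design – of why deriving the encryption key from the passcode plus a secret that never leaves the device means there is no master key to hand over:

```python
import hashlib
import os

# Toy stand-in for the hardware-fused device key that never leaves the chip.
DEVICE_SECRET = os.urandom(16)

def derive_key(passcode: str) -> bytes:
    # A deliberately slow key derivation makes each wrong guess expensive.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_SECRET, 200_000)

REAL_KEY = derive_key("123456")  # set when the owner picks a passcode
MAX_ATTEMPTS = 10                # iOS offers "erase data after 10 failed attempts"
failed = 0

def try_unlock(guess: str) -> bool:
    global failed
    if derive_key(guess) == REAL_KEY:
        failed = 0
        return True
    failed += 1
    if failed >= MAX_ATTEMPTS:
        print("Too many failures: a real device would now wipe its keys.")
    return False
```

Without the passcode, the only way in is to brute-force the derivation, and the attempt counter shuts even that down – which is exactly the mechanism the FBI wants Apple to disable.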
Apple is infamous for its stringent and sometimes odd vetting process for iOS apps, but that process has purportedly kept iPhone and iPad users relatively safe from the malware that has plagued the Android ecosystem for years. Unfortunately, Apple can no longer wear that badge with pride, as dozens (possibly hundreds) of apps written by Chinese developers and distributed through the official Apple App Store have been found to be infected with malware that can cause serious security problems for the affected device. Before you get up in arms about the brazen escalation of Sino-American cyber-hostilities, security analysts believe that the infected apps weren’t purposefully compromised, but were built or updated with an infected version of Apple’s coding framework, Xcode. These apps were then submitted and, upon passing through Apple’s security screening, distributed in both the Chinese and American App Stores to upwards of hundreds of millions of users.
What this means for you:
Unless you make a habit of installing Chinese iOS apps, you probably aren’t directly affected by this. Check this list, and if you did install one of the affected apps, remove or update it immediately, and change your iCloud password and any other passwords you might have used while the infected app was installed on your device. For the rest of us who aren’t impacted, this particular failure illustrates two important points about security:
- No security system or process is infallible. Apple’s fall from grace in this regard was only a matter of time. Every good security plan should include a failure contingency. In Apple’s case, they know exactly who installed what apps, and they plan to notify all affected customers.
- The compromised Xcode framework was traced to developers using non-official download sources to retrieve the code, which is very large (3 GB) and very slow to download in China from Apple’s servers. Rather than being patient and diligent, these programmers used local, unofficial repositories hosting malware-infected versions of Xcode. Always confirm your source (whether reading email or downloading software) before clicking that link – a quick checksum comparison, as sketched below, is cheap insurance!
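As a minimal illustration of that habit – with a placeholder filename and hash, since the real values would come from the official publisher – here’s how cheap it is to verify a big download in Python before running it:

```python
import hashlib

# Placeholder values: in practice you'd copy the expected hash from the
# official download page, not from the mirror you downloaded from.
DOWNLOADED_FILE = "Xcode_installer.dmg"
EXPECTED_SHA256 = "replace-with-the-officially-published-hash"

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    # Hash the file in 1 MB chunks so multi-gigabyte installers don't
    # need to fit in memory.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

if sha256_of(DOWNLOADED_FILE) != EXPECTED_SHA256:
    raise SystemExit("Hash mismatch - do NOT install this download!")
print("Hash matches the published value.")
```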