Forget iPhone 13—Apple’s iPhone 14 Nightmare Just Got Worse – Forbes

Apple’s iPhone 13s may be flying off the shelves, but the world’s leading technology company is now caught in a nightmare, one that threatens to trash its polished marketing as 13 becomes 14 over the next year. Apple has built iPhone’s reputation on security and privacy, but those twin pillars are now under very serious threat.

This week we saw yet another emergency iPhone update, as yet another security vulnerability that “may have been actively exploited” was patched. This is becoming a worrying habit for Apple—why so many, why so often? All of it is made worse by Apple’s black-box iPhone mentality. Unlike on Android, you can’t install software to make your iPhone safer. The device is too locked down; you are reliant on Apple.

Apple knows that security updates will not undermine its burgeoning market position. Google is no better on the serious stuff, and suffers much more at the lower end of the malware scale—even if not the headline-grabbing 47 times more that Apple claims.

But the knock to Apple’s privacy credentials that threatened to wobble its iPhone 13 launch could well undermine the warm and cosy feeling iPhone users are encouraged to adopt by Apple’s marketing machine. Apple needs this resolved before iPhone 14 launches next fall—it can’t let this run any longer, but it currently has no good way to resolve it.

Apple’s ill-conceived plan to scan iPhones for child abuse material was attacked from the start. Apple has trapped itself in a privacy nightmare. The subsequent backtrack saw Apple keeping company with Facebook (WhatsApp/Facebook terms backtrack) and Google (FLoC backtrack). Not a pretty look for Apple.

Apple retrenched to buy some time to think. But in doing so, it has just turned an iPhone 13 problem into an iPhone 14 problem. Last month’s post-backlash protests might be dismissible, but the considered thinking of leading security and privacy experts and academics, published this week, is harder to brush aside.

Such client-side screening, the report warns, “creates serious security and privacy risks for all… [which] can result in a significant chilling effect on freedom of speech and, indeed, on democracy itself.” Apple has always fielded criticism for its kowtowing to China—we saw more of that this week. But this device scanning goes way beyond that.

Apple’s real problem, though, is that it has clumsily boxed itself in. It has a user base that would likely accept cloud content screening, given it’s now an industry norm. But Apple threw such systems under a privacy bus, doubling down on selling client-side screening ever harder in the face of the backlash before retreating.

“Introducing this powerful scanning technology on all user devices,” the report warns, “without fully understanding its vulnerabilities and thinking through the technical and policy consequences would be an extremely dangerous societal experiment.”

But Apple is still promising to “release these critically important child safety features,” after taking “additional time to collect input and make improvements.” And that leaves the company with an awkward choice—retreat further and opt for cloud screening after all or risk another backlash and deploy scanning client-side.

Apple would prefer the latter. But the new “Bugs in our Pockets” report sets out in detail how such systems “can fail… be evaded… and be abused,” all of which echo the warnings that flooded social media in the aftermath of Apple’s initial communications.
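To see why those “can fail… be evaded” warnings are plausible, here is a deliberately simplified sketch of perceptual matching. This is a toy “average hash,” purely illustrative and not Apple’s actual design—Apple’s proposal used a neural perceptual hash (NeuralHash) combined with cryptographic private set intersection—but it shows the core idea: known material is matched by hash *similarity* rather than exact bytes.

```python
# Toy perceptual "average hash" (hypothetical, illustrative only; NOT Apple's
# NeuralHash). Demonstrates similarity-based matching of known images.

def average_hash(pixels):
    """64-bit hash of an 8x8 grayscale grid: one bit per above-mean pixel."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p >= mean)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# A gradient "image", a slightly brightened copy, and an unrelated checkerboard.
original = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
brightened = [[min(255, p + 10) for p in row] for row in original]
checkerboard = [[255 if (r + c) % 2 else 0 for c in range(8)] for r in range(8)]

# Near-duplicates hash close together; unrelated images hash far apart.
print(hamming(average_hash(original), average_hash(brightened)))    # small
print(hamming(average_hash(original), average_hash(checkerboard)))  # large
```

The brittleness cuts both ways, which is the report’s point: because matching tolerates small differences, an adversary can nudge pixels near the decision threshold to flip hash bits and evade detection, while unlucky benign images can collide with flagged hashes—exactly the fail/evade/abuse modes the authors catalog.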

As the report makes clear, the risks that governments will force Apple to expand the system to comply with “local laws” are real, as are concerns that such systems will add further attack surfaces to the already bruised iPhone. But for Apple, the threat is to the “feel” of the iPhone. Mass surveillance just isn’t very Steve Jobs.

The authors of the report rightly point out that governments have already been pressing big tech to add such scanning to devices. Apple is a commercial entity that doesn’t operate in any kind of vacuum. Assurances that it will somehow ignore local laws once this tech is in place are either hollow or naive.

“Instead of having targeted capabilities such as to wiretap communications with a warrant and to perform forensics on seized devices,” the authors warn, “the [intelligence/security] agencies’ direction of travel is the bulk scanning of everyone’s private data, all the time, without warrant or suspicion. That crosses a red line.”

Privacy advocates fear such systems because history tells us that they create backdoors that then open ever wider, and that even the brightest technical designers and developers cannot outwit a world of bad actors, government agencies and vested interests.

“Apple has devoted a major engineering effort and employed top technical talent in an attempt to build a safe and secure client-side screening system,” this latest report acknowledges, “but it has still not produced a secure and trustworthy design.”

Or put more simply—even Apple can’t make this work.

“In a world where our personal information lies in bits carried on powerful communication and storage devices in our pockets, both technology and laws must be designed to protect our privacy and security, not intrude upon it. Robust protection requires technology and law to complement each other. Client-side scanning would gravely undermine this, making us all less safe and less secure.”

And so, to Apple’s self-imposed bear trap. If it wants to persist with client-side scanning, it has two choices, neither of which is likely to work out well.

What was a headline iOS 15 / iPhone 13 update could now become a midlife upgrade to both, raising the specter of an update that a large proportion of users decide not to install. But iPhone 14 would still launch with the update installed, running the risk of an escalating backlash through 2022 to next fall. The other option is for Apple to delay the update until iOS 16 / iPhone 14, but that simply shifts this year’s backlash forward 12 months and still undermines that next iPhone launch.

As such, Apple’s only real option is a full-scale backtrack, and it would win serious kudos by taking a leaf from WhatsApp’s book over its Facebook terms, admitting it got this one wrong, and discarding this idea for good. Off the back of such a move, the company could introduce cloud scanning based on user feedback, thus delivering on its promise to tackle abuses on its platform.

One advantage Apple does have, of course, is that the alternative is so much worse. Let’s be clear, the idea that a privacy scare would send iPhone users scurrying to Google’s Android, absent all the privacy protections Apple has added in recent years, is a non-starter. But Apple’s user base being retained by a negative rather than a positive isn’t a good look either—and that certainly isn’t very Steve Jobs.

“Privacy means people know what they’re signing up for, in plain English,” Jobs said in 2010. “Ask them. Ask them every time. Make them tell you to stop asking them, that they’re tired of you asking them.” Well, Apple may not have asked this time, but its users have still answered. And now even its most contorted assurances have been brushed aside by experts. If this really has been Tim Cook’s personal drive, then it really is time to ask what would Steve do?
