
Apple vs. FBI

What do you do when the hacker you need to defend against is your own government?

As a startup investor and former product designer, I sit on several boards of early-stage tech companies, and I can report this: regardless of anyone’s perspective on politics or patriotism, the FBI’s tussle with Apple and the ensuing public debate have brought privacy and security front and center in the boardroom.
The FBI is trying to compel Apple by court order to exploit a backdoor into its iPhone operating system. That backdoor would allow the FBI to circumvent the security locks on the encrypted phone of one of the San Bernardino terrorists, so that the Bureau can access whatever data he stored on his device.
The FBI is trying to compel Apple by court order to exploit a backdoor into its iPhone operating system. That backdoor will allow the FBI to circumvent the security locks on the encrypted phone of one of the San Bernardino terrorists so that the Bureau can access whatever data he stored on his device.

The less-reported details of the case are the most curious—and perhaps explain why Apple is fighting the FBI and the U.S. government so vehemently. The FBI and the cybersecurity community discovered that Apple had already coded a nifty backdoor into its iPhone operating system. Basically, the backdoor allowed Apple to update the underlying system software on users’ iPhones silently, without telling them or asking for their password or permission.

That sounds sinister, but it wasn’t meant to be. Apple added this backdoor so that it could send fixes to damaged iPhones for customers who, for example, bricked their iPhone during a bad software update. Apple’s backdoor was meant to be a customer support tool to get broken iPhones working quickly.

In theory, though, this backdoor could also allow Apple to secretly push changes to the operating system, or let a nefarious actor hijack it to record an owner’s keystrokes while they log into their online bank account. A hacker could even disable the iPhone’s key security feature that wipes the device clean after 10 bad password attempts—precisely what the FBI would force Apple to do via the court order. The Bureau wants to break into the iPhone with a brute-force password attack.
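To see why that 10-attempt wipe matters so much, consider a toy sketch (mine, not Apple’s actual implementation) of brute-forcing a 4-digit passcode. Without the wipe limit, an attacker who can submit guesses freely always wins within 10,000 tries; with the limit in place, the attack almost always fails before it gets anywhere:

```python
import itertools

def brute_force(check_pin, wipe_after=None):
    """Try every 4-digit PIN in order; stop early if the device
    wipes itself after wipe_after failed attempts."""
    attempts = 0
    for digits in itertools.product("0123456789", repeat=4):
        guess = "".join(digits)
        attempts += 1
        if check_pin(guess):
            return guess, attempts
        if wipe_after is not None and attempts >= wipe_after:
            return None, attempts  # device wiped: attack fails
    return None, attempts

# A hypothetical device whose passcode is "7364".
secret = "7364"
check = lambda pin: pin == secret

# No wipe limit: guaranteed success in at most 10,000 tries.
found, tries = brute_force(check)                    # -> ("7364", 7365)

# 10-attempt wipe enabled: the attack dies almost immediately.
found_limited, tries_limited = brute_force(check, wipe_after=10)  # -> (None, 10)
```

This is exactly why the FBI needed the wipe feature disabled first: the brute-force step itself is trivial once unlimited guessing is possible.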

That’s where the controversy begins: Apple created a backdoor, the FBI found out about it and now the agency is demanding the keys so it can break into just this one iPhone—or so it claims. The problem is that once the keys are out, lots of people can break into any other iPhone. The Chinese government, for example, could demand the same keys to access the iPhones of human rights dissidents it doesn’t like. Other U.S. law enforcement agencies could use the keys to access the iPhones of owners accused—or even just suspected—of crimes far less serious than terrorism.

This brings us back to what’s happening in my board meetings—and likely countless other board discussions across Silicon Valley and abroad.

Cybersecurity and privacy can no longer be an afterthought debated only by obscure programmers.

Top management must bring their companies’ security models and privacy policies front and center during the early design of products. As one CEO mentioned to me last week, “Our security threat models on how we protect our users and their data now have to include the U.S. government.”
So what does this mean for tech firms? For one thing, when you’re building a product, don’t design a backdoor into it, no matter how useful it might be for customer support—and if your current products have one, shut it down. If you’re building a WiFi router, a smart thermostat or an online video camera, to name a few very plausible examples, don’t add a support feature that lets you update the firmware without user permission. If Apple hadn’t built that backdoor into the iPhone operating system, the FBI couldn’t demand the keys to it. These are issues I’m now reviewing at the board level and asking founders and senior management to treat as a top priority.
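The design principle above can be sketched in a few lines. This is a hypothetical updater of my own invention, not any vendor’s API: an update is applied only if a human explicitly opts in and the image matches a known-good digest, so there is no path for a silent push:

```python
import hashlib

# Pinned digest of the only firmware image we consider legitimate
# (illustrative; a real device would verify a cryptographic signature).
TRUSTED_DIGEST = hashlib.sha256(b"firmware-v2.1").hexdigest()

def apply_update(image: bytes, user_approved: bool) -> bool:
    """Install firmware only with explicit user consent AND a verified image."""
    if not user_approved:
        return False  # no silent updates: a human must opt in first
    if hashlib.sha256(image).hexdigest() != TRUSTED_DIGEST:
        return False  # reject tampered or unknown images
    # ... flash the verified image here ...
    return True
```

The key property is that neither check alone is enough: a coerced or compromised vendor cannot push code the user never approved, and a fooled user cannot install an image the device cannot verify.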

It took a terrible tragedy and a very public showdown between the U.S. government and Apple to bring these issues to the forefront of the privacy debate, so let’s not let the lessons from this go unheeded. Companies need to remove backdoors from their products and optimize for user privacy. The security threat model now includes the government, which can and will demand any secret keys that it discovers. When building whatever new products we hope will change the world, we have to remember this.
