Tech · Analysis · Mar 28, 2016 · 8 min read

The Lock Nobody Picked

By Glitch
historical · cryptography

It's over.

The FBI announced today that it has unlocked the San Bernardino shooter's iPhone 5c without Apple's help. The Department of Justice has withdrawn its case. The hearing nobody wanted to have won't happen. The question nobody wanted answered will remain safely unanswered.

Both sides are declaring victory. Both sides are right. And both sides just guaranteed this fight will happen again.

The Six Weeks That Changed Nothing

Here's the timeline of the most consequential non-event in technology law:

December 2, 2015: Syed Rizwan Farook and Tashfeen Malik kill fourteen people and injure twenty-two at a holiday party in San Bernardino, California. Among the evidence recovered: Farook's work-issued iPhone 5c, locked with a passcode, running iOS 9.

February 16, 2016: A federal magistrate orders Apple to create custom software — a modified version of iOS that would disable the security feature wiping the phone after ten failed passcode attempts, allowing the FBI to brute-force the code. Tim Cook responds the same day with an open letter to customers. The government, he writes, is demanding Apple build "a master key, capable of opening hundreds of millions of locks." He calls the implications "chilling."
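To see why the auto-wipe and the escalating inter-attempt delays were the whole ballgame, here's a back-of-the-envelope sketch. The per-guess figure is an assumption for illustration: roughly 80 milliseconds per attempt, the hardware key-derivation cost widely cited for iOS devices of that era, not a number from the court filings.

```python
# Back-of-the-envelope: how long a brute-force takes once the
# 10-attempt auto-wipe and escalating delays are removed, leaving
# only the per-guess key-derivation cost as the bottleneck.
# ~80 ms per guess is an assumed, commonly cited figure, not a spec.

SECONDS_PER_GUESS = 0.08  # assumed hardware key-derivation time per attempt

def worst_case_seconds(digits: int) -> float:
    """Seconds to exhaust every numeric passcode of the given length."""
    return (10 ** digits) * SECONDS_PER_GUESS

for digits in (4, 6):
    secs = worst_case_seconds(digits)
    print(f"{digits}-digit passcode: {10 ** digits:>9,} codes, "
          f"worst case {secs / 3600:.1f} hours")
```

Under these assumptions a four-digit passcode falls in well under an hour; even six digits takes about a day. Which is exactly why the ten-strikes-and-wipe feature, not the passcode itself, was what the FBI needed Apple to remove.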

February 25: Apple formally moves to vacate the order. The company's argument is specific and worth hearing precisely: this isn't about one phone. Building what the FBI wants creates a tool that, once it exists, can never be unbuilt.

March 21: One day before the scheduled hearing, the DOJ asks for a delay. An "outside party," the government says, has demonstrated a possible method for unlocking the phone without Apple's assistance.

March 28 — today: The FBI confirms it has accessed the phone's data. Case dropped.

Six weeks of legal argument. Thousands of column inches. Dozens of amicus briefs from companies, civil liberties organizations, former intelligence officials, and cryptographers. The entire technology industry forced to take sides on the most fundamental question of the digital age: can the government compel a private company to build a weapon against its own customers?

The answer, as of this afternoon: we still don't know.

What Apple Won (and Didn't)

Apple avoids the precedent. No court has ruled that a technology company can be forced to create new software that undermines its own security features. Tim Cook's letter stands as a public commitment — Apple will fight backdoors. The brand wins. The principle survives.

But the principle survives only because it was never tested. The government withdrew before a ruling. There is no binding decision. The next time law enforcement wants a similar order — and there will be a next time — they'll argue from scratch. Apple will have to fight the same fight again, in a different courtroom, with a different judge, on different facts.

The legal ground Apple stands on today is exactly the same ground it stood on six weeks ago: unresolved.

What the FBI Won (and Didn't)

The FBI got into the phone. Mission accomplished, technically. The FBI has not identified the third party, disclosed the method, or revealed its cost. The data is being reviewed.

But the FBI's strategic position just got worse, not better. By finding an alternative method, the Bureau undermined its own central argument — that Apple's cooperation was necessary. The next time the FBI asks a court to compel a company to build decryption tools, the first question from any judge will be: why can't you just find another third party?

More than that, the FBI paid for a hack whose technical details it may not fully possess. It bought a solution instead of building a legal framework. That's not a capability; it's a transaction. The next phone, the next encryption standard, the next iOS version will require another transaction.

The FBI chose the lock-pick over the courthouse. That tells you everything about what they actually wanted.

The Deliberate Void

Here's what nobody is saying publicly: both sides preferred this outcome.

Apple didn't want a ruling either. A favorable precedent would be nice in theory, but the risk of an unfavorable one was existential. If a court ruled that the All Writs Act — a 227-year-old statute about compelling "reasonable" assistance — gives the government power to conscript technology companies as code writers, the implications would extend far beyond one iPhone. Every encrypted device, every secure messaging platform, every company that builds security features would exist under the shadow of potential compulsion.

Better, from Apple's perspective, to preserve ambiguity. Ambiguity means each case is fought fresh. Each case requires new arguments, new political calculations, new public pressure campaigns. Apple can fight retail battles indefinitely. A wholesale loss would be permanent.

The FBI, for its part, didn't want to lose in court either. A ruling that the All Writs Act doesn't extend to compelling software creation would shut the door not just on this phone but on the entire legal theory. Law enforcement has been pushing for backdoor mandates for two decades — the Clipper Chip in the '90s, CALEA for telecommunications, the periodic "going dark" campaigns every few years. Every legislative effort has failed. A judicial precedent against compelled decryption would be worse than no precedent at all.

So both sides chose the void. Not because the question doesn't matter — it's arguably the most important unresolved question in technology law — but because any answer carries more risk than continued uncertainty.

This is a pattern that shows up whenever the stakes are high enough and the possible outcomes are binary enough: institutions prefer the recurring fight to the definitive loss.

The Lock-Pick Economy

The anonymous third party deserves more attention than it's getting.

The FBI won't identify them. Won't say what the method was. Won't confirm whether the tool has been shared with other agencies. This opacity is its own story.

A shadow market for device vulnerabilities has been growing for years. Companies and researchers buy and sell zero-day exploits — security flaws unknown to the device manufacturer — to the highest bidder. Governments are major customers. The economics are straightforward: if you can't compel the lock-maker to give you a key, you pay someone to pick the lock.

This market thrives on the same ambiguity that both Apple and the FBI prefer. As long as there's no legal framework for compelled decryption, law enforcement will keep buying hacks on the open market. As long as law enforcement keeps buying hacks, it has less incentive to push for a legal framework. The lack of resolution feeds the market. The market reduces the pressure for resolution.

It's a closed loop. And inside that loop, the vulnerabilities that governments purchase to access criminal phones are the same vulnerabilities that criminals can exploit to access everyone else's. Every zero-day the FBI buys is a zero-day that remains unpatched. The security of every iPhone user is marginally weaker today because the FBI preferred to buy an exploit rather than litigate a principle.

The FBI hasn't disclosed whether the tool works on newer models. It doesn't matter. The market for the next exploit is already open.

The Pattern That Persists

The encryption debate will return. It returns every time there's a terrorist attack, a mass shooting, a child exploitation case where encrypted communications play a role. Each time, law enforcement will argue that the technology industry is placing profits above public safety. Each time, the industry will argue that weakening encryption for the government weakens it for everyone — including the criminals the government is trying to catch.

Each time, both sides will hope the fight resolves itself before it reaches a courtroom. Because a courtroom produces answers. And answers create losers.

The Fourth Amendment says the government can search with a warrant. Modern encryption says some searches are mathematically impossible without the key. These two facts are on a collision course that every institution with a stake in the outcome — the courts, Congress, the executive branch, the technology industry — has spent twenty years carefully steering around.

The Clipper Chip fight in the 1990s ended not with a resolution but with a market decision: consumers and businesses chose strong encryption, and the government stopped pushing. The going-dark debate in the early 2000s produced reports, hearings, and hand-wringing but no legislation. The San Bernardino case had the gravity to force a resolution. It arrived with everything a test case needs: terrorism, dead Americans, a locked phone, and a set of facts more favorable to law enforcement than a civil liberties lawyer's worst nightmare.

And still, nobody wanted the answer.

What Didn't Happen

Today should have been the day. The San Bernardino case had every ingredient for a landmark ruling — the kind that gets cited for decades, the kind law students memorize, the kind that actually resolves something.

Instead, we got a lock-pick and a shrug.

The constitutional question — where the Fourth Amendment meets strong encryption — remains exactly where it was before fourteen people died in San Bernardino. The encryption on your phone is no more or less legally protected today than it was in December. The government's power to compel your phone's manufacturer to break that encryption is no more or less defined.

Nothing has been resolved. The tension holds. The fight will recur — with the same arguments, the same stakes, and the same institutional incentives to avoid a definitive answer.

The only thing that changed today is that the FBI has one phone's data and a million-dollar bill. The rest of us have one fewer chance at clarity, and a lock that nobody picked — because picking it was always easier than deciding who should hold the key.

Sources: FBI, Apple, Wikipedia