
We’re having the wrong conversation about Apple and the FBI

March 2, 2016

We’re having the wrong conversation about privacy in the U.S. The narrative is focused on how the bullies at the FBI are forcing the powers for good over at Apple to hand over data that they’d rather protect for the good of all.

Here’s the conversation I’d rather be having: why is our government not protecting our privacy? Instead, we are reduced to relying on an enormous, profit-seeking corporation to make a stand for our rights. At the end of the day this argument gets dumbed down to “Apple good, government bad,” and it’s far too easy and concedes far too much. We need to think harder and demand more.

Keep a few things in mind. Apple is not accountable to us. They are a private company. When people want to argue otherwise, they say that we “vote with our dollars” for Apple. But that just means that consumers affect gadget design, which includes safety and security features. At the end of the day, though, Apple is accountable to the laws of the land, and they have turned over, and will continue to turn over, all kinds of private data about users when the law forces them to. They are not heroes, because they cannot be, even if they wanted to be.

I want to stop talking about Apple, and its operating systems, and so on. That’s all a sideshow. I want to talk about demanding a government that will acknowledge that its duty is to protect privacy while investigating risks. Right now the FBI is falling far short, trivializing the risk to the rest of us when backdoors are created and used at scale. They have made an internal calculation that the trade-offs are worth it, without ever having a public conversation in which those risks are even measured. And those risks are our risks.

And yes, that is complicated, nuanced, and there are plenty of conflicts of interest involved, which are hard to balance. But that’s the thing about governing in a democracy: we need to have the conversation, and the government needs to stay accountable to all of us. Let’s stop talking about Apple and start talking about democracy in the era of big data.

  1. Michael Harrington
    March 2, 2016 at 8:22 am

    If we don’t fight back, 1984 here we come.


  2. Gabriel
    March 2, 2016 at 9:26 am

    Agreed, but it’s actually a bit worse than you make out. Apple is not making a stand for privacy, but for security. Apple is defending the security of its product, not the privacy of its users.

    In this case, if Apple could share the shooter’s data without making their product fundamentally insecure by opening a hole for hackers to exploit, then of course Apple would do it.

    Of course privacy and security are related. Having a lock on the door means that what’s inside is your own business. But they are not the same: if Apple could share the key without breaking the lock for everyone, they would.


    • Auros
      March 3, 2016 at 10:01 pm

      I think you are right about this, Gabriel, but I don’t see that as a criticism of Apple… If Apple could actually unlock just this one phone, without compromising the security of all similar phones, it should (as long as it’s being presented with a warrant). That’s not so different from the super at an apartment building being shown a warrant for the search of an apartment, and opening it; doing so doesn’t alter the security status of the neighbors’ apartments.

      Apple wants to sell secure products. The government position appears to be that Apple should advertise its products as secure, while introducing a backdoor. That is, Apple should defraud its customers. That’s, shall we say, an interesting position for law enforcement to take. And that’s leaving aside the fact that it’s almost impossible to imagine a world where the government can use this backdoor on a regular basis — hundreds or thousands of times a year — and yet the backdoor does not get discovered and exploited by hackers and despots.

      Ultimately the question is whether it is legal to sell the public functional crypto systems. Or, put even more simply: does the public have the right to use math? Because that’s all cryptography is. A properly constructed crypto system, operated by a user who knows what they’re doing (someone who sets a strong password: an entire sentence, or a series of seemingly random characters for which they have a mnemonic they never share, rather than a four-digit code that can be guessed from fingerprint smudges), is basically unbreakable. You either have the key information, or you don’t get access.

      Most hacks these days, even where users do have weak passwords, are accomplished by acquiring the key information, or by getting it reset to a key the hacker chooses, via social engineering. Companies don’t spend enough money to hire competent people to run their IT. They probably couldn’t even if they wanted to, because we haven’t, as a society, invested enough to train all the competent IT people we need.
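
      A back-of-the-envelope sketch of that arithmetic, in Python (the Diceware word-list size and the guess rate are illustrative assumptions, not measurements of any real system):

          import math

          # Brute-force search space of a four-digit code
          # versus a long random passphrase.
          pin_space = 10 ** 4           # 10,000 possible four-digit codes
          passphrase_space = 7776 ** 6  # six words drawn from the 7,776-word Diceware list (assumed)

          print(f"4-digit PIN:       {math.log2(pin_space):5.1f} bits of entropy")
          print(f"6-word passphrase: {math.log2(passphrase_space):5.1f} bits of entropy")

          # At an assumed (generous) trillion guesses per second:
          rate = 1e12
          seconds_per_year = 3.15e7
          print(f"PIN exhausted in about {pin_space / rate:.0e} seconds")
          print(f"passphrase exhausted in about {passphrase_space / (rate * seconds_per_year):.0e} years")

      With enough entropy, guessing is hopeless; an attacker has to steal the key information or trick someone into resetting it, which is exactly the social-engineering route described above.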

      I think if you care about privacy, or just some broad definition of liberty, it is disturbing to see law enforcement suggest that the general public should NOT have the right to apply publicly-accessible mathematical algorithms to their own personal data, and that no company should be allowed to sell products that help them do so. I’m hardly some techno-libertarian type. Silly Valley bitcoin nerds often annoy the heck out of me. But I think they’re 100% right about this particular case.


  3. mathematrucker
    March 2, 2016 at 10:15 am

    Searching on the word “privacy” at the page below gets two hits, neither of which has anything to do with Megan Smith’s job (both are links to Wikipedia’s privacy policy):

    https://en.wikipedia.org/wiki/Chief_Technology_Officer_of_the_United_States

    But the page lists “help keep the nation secure” as one of Ms. Smith’s three primary tasks.


  4. March 2, 2016 at 11:55 am

    Well said.

    We need to talk about how it is not society’s job to make the government’s law-enforcement or prosecution work easier: in fact, our system of laws is specifically designed to make it harder for the government to prosecute, yet the argument we hear over and over is “this makes it very hard on us to do our job”.

    We need to talk about that right to privacy that, in the words of Justice Brandeis, is the fundamental right of every individual: the “right to be let alone”.

    And we need to recognize that when you put legal limits on how good privacy can be, all you do is make it easier to catch dumb criminals: smart criminals will build systems that are hard to hack even if they can’t buy them off the shelf. Making it hard for the ordinary citizen to have access to strong encryption is not the solution.


  5. rjh
    March 2, 2016 at 12:03 pm

    This case may well be decided on a completely different dimension. Try shifting the request to be “make and give us a police car”. This eliminates all the privacy, security, software, etc. issues.

    If the FBI asked Apple to “make and give us a police car”, everyone would understand Apple saying “no, that’s not our business”. There are no security or privacy issues involved. Now change the company, rather than the request.

    If the FBI asks Ford to “make and give us a police car”, Ford says “yes, here’s our catalog” and they negotiate about price.

    The Ford price is set to be highly profitable, covering R&D, admin, sales, etc. If the FBI wants it for free, Ford says “no”.

    These issues of what business Apple is in, and how it prices services, may well be a significant part of court decisions. They may be sufficient that the courts avoid the difficult issues of privacy and security.


  6. March 2, 2016 at 2:10 pm

    The federal responsibility for the security/privacy/confidentiality of digital communications lies by law with the NSA, with the advice and CONSENT of NIST, which sets science-guided standards for technology.

    This is a major and ongoing failure of the NSA, which installed widespread encryption backdoors circa 2004 and has not fully disclosed the software vulnerabilities it knows about, instead suppressing their correction and using them in intelligence activities, such as the Stuxnet attack that exploited Windows vulnerabilities to destroy Iranian centrifuges.


  7. Laocoon
    March 7, 2016 at 10:40 am

    Isn’t the real security issue about the security of transactions as well as personal data?

    Think Apple Pay. Think home security, as well as the larger issues that Apple’s Craig Federighi raises in the Washington Post today.
    https://www.washingtonpost.com/opinions/apple-vp-the-fbi-wants-to-roll-back-safeguards-that-keep-us-a-step-ahead-of-criminals/2016/03/06/cceb0622-e3d1-11e5-a6f3-21ccdbc5f74e_story.html

    I’m not a diehard Apple fan, but I have to side with them on strong encryption. I appreciate the difficult job the FBI has in keeping us safe, but encryption serves a greater good. Michael Hayden has also addressed this, making a persuasive case for the benefits of encryption in the larger picture.

    My husband and I have lived at the intersection of business and tech for decades now. This is not an easy issue to reconcile, but we have to side with Apple on this one.

