Brought to you by:

Ivey Publishing

Building a "Backdoor" to the iPhone: An Ethical Dilemma

By: Tulsi Jayakumar, Surya Tahora

  • Length: 10 page(s)
  • Publication Date: Apr 28, 2016
  • Discipline: Business Ethics
  • Product #: W16245-PDF-ENG

In February 2016, Tim Cook, Apple's chief executive officer, challenged a U.S. Federal Court order for Apple to assist the Federal Bureau of Investigation (FBI) in a case involving suspected international terrorism. The government wanted Apple to provide the FBI with access to encrypted data on an Apple product, the iPhone. Cook's refusal to acquiesce to the government's demands drew strong public debate, pitting the proponents of national security against those in favour of customers' digital privacy and security. The case invoked an ethical dilemma faced by management in issues involving right-versus-right decisions. Which right should Cook choose? What are the ethical dilemmas involved in making this decision? How should Cook resolve the dilemma?

Tulsi Jayakumar is affiliated with SP Jain Institute of Management & Research.

Learning Objectives

This case can be taught in a 90-minute session of a business ethics course in a postgraduate or executive MBA program. It may also be used in an information management course to teach a module on ethics in information management, focusing on the moral and ethical dimensions of information handling and use, including gatekeeping. The case will help students to:

  • Distinguish between various kinds of executive management decisions: right-versus-wrong compared to right-versus-right.
  • Recognize and understand the moral dilemmas facing management involving right-versus-right decisions, or "the dirty-hands problem."
  • Understand the frameworks used in developing practical approaches to resolving these dilemmas.

Apr 28, 2016 (Revised: Aug 16, 2017)

McCombs School of Business

Case Study

The FBI & Apple Security vs. Privacy

How can tech companies and government organizations strike a balance between maintaining national security and protecting user privacy?

In December 2015, the FBI obtained the iPhone of one of the shooters in an ISIS-inspired terrorist attack that killed 14 people in San Bernardino, California. As part of the investigation, the FBI attempted to gain access to the data stored on the phone but was unable to penetrate its encryption software. Lawyers for the Obama administration approached Apple for assistance with unlocking the device, but negotiations soon broke down. The Justice Department then obtained a court order compelling Apple to help the FBI unlock the phone. Apple CEO Timothy Cook publicly challenged the court order in an open letter, sparking an intense debate over the balance between maintaining national security and protecting user privacy.

Apple and its supporters, including top technology companies such as Google and Facebook, made the case on several fronts that the court order threatened the privacy of all individuals. First, according to Apple, the order effectively required the company to write code, violating its First Amendment right to free speech by forcing the company to “say” something it did not want to say. Previous court cases had already established computer code as legally protected speech. Second, such a backdoor, once created, could fall into the wrong hands and threaten the privacy of all iPhone owners. Finally, it would set a dangerous precedent; law enforcement could repeatedly require businesses such as Apple to assist in criminal investigations, effectively making technology companies agents of the government.

Representatives from both sides of the political aisle offered several arguments in favor of the Justice Department’s efforts and against Apple’s stance. Their central claim was that the U.S. legal system establishes constraints on the government’s access to private information which prevent abuse of search and surveillance powers. At the same time, the law still allows authorities to gain access to information that facilitates prevention and prosecution of criminal activities, from terrorism to drug trafficking to child pornography. Critics of Apple also rejected the slippery slope argument on the grounds that, if Apple cooperated, it could safeguard the code it created and keep it out of the hands of others, including bad actors such as terrorists or criminal groups. Moreover, Apple was accused of being too interested in protecting its brand, and even unpatriotic for refusing to comply with the court order.

Ultimately, the FBI dropped the case because it was able to circumvent the encryption on the iPhone without Apple’s help.

Discussion Questions

1. What harms are potentially produced by the FBI’s demand that Apple help it open an iPhone? What harms are potentially produced by Apple’s refusal to help the FBI?

2. Do you think Apple had a moral obligation to help the FBI open the iPhone in this case because it involved terrorism and a mass shooting? What if the case involved a different type of criminal activity instead, such as drug trafficking? Explain your reasoning.

3. Apple argued that helping to open one iPhone would produce code that could be used to make private information on all iPhones vulnerable, not only to the American government but also to other foreign governments and criminal elements. Do you agree with Apple’s “slippery slope” argument? Does avoiding these harms provide adequate justification for Apple’s refusal to open the phone, even if it could reveal crucial information on the terrorist shooting?

4. Politicians from across the political spectrum, including President Obama and Senator Ted Cruz, argued that technology preventing government access to information should not exist. Do you agree with this limit on personal privacy? Why or why not?

5. Ultimately, the FBI gained access to the iPhone in question without the help of Apple. Does this development change your assessment of the ethical dimensions of Apple’s refusal to help the FBI? Why or why not? Should the FBI share information on how it opened the iPhone with Apple so that it can patch the vulnerability? Explain your reasoning.

Related Videos

Incrementalism

Referred to as the slippery slope, incrementalism describes how we unconsciously lower our ethical standards over time through small changes in behavior.

Bibliography

Apple Fights Order to Unlock San Bernardino Gunman’s iPhone http://www.nytimes.com/2016/02/18/technology/apple-timothy-cook-fbi-san-bernardino.html

How they line up on Apple vs. the FBI https://www.washingtonpost.com/graphics/business/fbi-apple/

Why Apple Is Right to Challenge an Order to Help the F.B.I. http://www.nytimes.com/2016/02/19/opinion/why-apple-is-right-to-challenge-an-order-to-help-the-fbi.html

Apple’s Rotten Core: CEO Tim Cook’s Case for Not Aiding the FBI’s Antiterror Effort Looks Worse than Ever http://www.wsj.com/articles/apples-rotten-core-1456696736

Obama, at South by Southwest, Calls for Law Enforcement Access in Encryption Fight http://www.nytimes.com/2016/03/12/us/politics/obama-heads-to-south-by-southwest-festival-to-talk-about-technology.html

U.S. Says It Has Unlocked iPhone Without Apple http://www.nytimes.com/2016/03/29/technology/apple-iphone-fbi-justice-department-case.html

Loyola University, Center for Digital Ethics & Policy

Balancing Security and Privacy in the Age of Encryption: Apple v. FBI (June 6, 2016)

The San Bernardino attack that resulted in the deaths of 14 people last December continues to evolve into the polarizing yet familiar battle over the balance between privacy and national security. For those who have lost track of how it all started, the story began when the FBI was unable to unlock an iPhone belonging to one of the attackers, Syed Rizwan Farook, and approached Apple for assistance. Drama ensued as Apple refused to help the FBI break into the phone, believing that the method it was asked to use was unwarranted and a threat to public security. In what many have argued is an unethical, unprecedented request, the FBI ordered Apple to create software that would disable privacy settings used in select iPhone models. In addition to existing disputes over the acceptable extent of access to private information, the order gave rise to a new question: Does the FBI have the right to demand security backdoors that could compromise the safety of uninvolved civilians?

The trouble began soon after the FBI found that it could not unlock Farook’s phone, which was locked with a four-digit code set to erase the phone’s contents after ten incorrect password attempts. The task was further complicated by a setting that increased the time increments between failed password trials, a particularly frustrating obstacle in cases where time is of the essence. In fact, Apple’s iPhone encryption software was so advanced that the company itself claimed it did not possess the technology needed to unlock the phone. Frustrated with Apple’s refusal to comply with its requests, the FBI asked Magistrate Sheri Pym to issue a court order demanding that Apple create a new operating system that would allow it to bypass these security measures.
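The arithmetic behind those settings is worth making concrete: a four-digit passcode has only 10,000 possible values, so it is the erase-after-ten-failures rule and the escalating delays, not the size of the key space, that defeat guessing. A minimal sketch of that trade-off (the attempt rate and the wipe threshold here are illustrative assumptions, not Apple's actual figures):

```python
# Why a 4-digit passcode can survive brute force: policy, not key space.
# Numbers below are illustrative assumptions, not Apple's real schedule.

TOTAL_CODES = 10 ** 4  # every code from 0000 to 9999


def unthrottled_seconds(attempts_per_sec: float = 80.0) -> float:
    """Worst-case time to sweep the whole space with no delays or wipe."""
    return TOTAL_CODES / attempts_per_sec


def success_prob_with_wipe(max_attempts: int = 10) -> float:
    """Chance of guessing the code before an erase-after-N-failures wipe."""
    return max_attempts / TOTAL_CODES


if __name__ == "__main__":
    print(f"No throttling: {unthrottled_seconds():.0f} s to try every code")
    print(f"Wipe after 10 tries: {success_prob_with_wipe():.1%} success odds")
```

Under these assumed numbers, an unthrottled attacker sweeps every code in roughly two minutes, while the wipe policy caps a guesser at a 0.1 percent chance of success, which is why the FBI needed Apple to disable those settings rather than simply guessing.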

The order was unique both because it asked for nonexistent software and because it requested a security ‘backdoor’ that could be used to unlock myriad devices. So was it ethical, not to mention legal, for the FBI to ask for software that had the potential to override broadly applicable security measures? According to Apple, the answer is a big, fat, theatrical no. Apple not only refused to comply but also published an open letter to the public, advising people of the ‘chilling’ implications of a security backdoor, writing that “this demand would undermine the very freedoms and liberty our government is meant to protect.” Apple warned that the technology could be detrimental if misused, stating: “In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession ... while the government may argue that its use would be limited to this case, there is no way to guarantee such control.” The letter went on to outline several alarming scenarios that could result from giving the government access to this technology, among them demands for software that intercepts texts or photos, or that hands over health records, financial data, and locations.

Though the letter was a bit artful, it raised important questions that deserve careful consideration. For one, the request for nonexistent software could set a legal precedent for permitting additional nonstandard, privacy-compromising demands. Apple’s fear stemmed in part from the approach the FBI took to seeking out iPhone contents. Rather than issuing a standard subpoena for information found on one device, the government requested a court order under the  All Writs Act , which allows federal courts to issue all necessary or appropriate legal writs (i.e., court orders) compelling citizens to undertake certain actions as long as it is necessary and appropriate. The Act is a component of the Judiciary Act of 1789, and its creators could not have possibly predicted cell phones, let alone the links between individual phone software and security of technologies belonging to the greater public. Because the Act is so broad, it could, in theory, be applied to more extensive requests for technology that would jeopardize our privacy.

Whether major fears about abuses of power are symptomatic of public paranoia or forward-thinking dedication to ensuring public security is debatable. The government’s stance on the issue is not. Soon after the open letter was published, the FBI filed a motion to compel Apple to comply with the court order and accused the company of misrepresenting facts for marketing purposes. Government prosecutors wrote: “Rather than assist the effort to fully investigate a deadly terrorist attack by obeying this Court’s Order of February 16, 2016, Apple has responded by publicly repudiating that order … The Order does not, as Apple’s public statement alleges, require Apple to create or provide a ‘back door’ to every iPhone; it does not provide ‘hackers and criminals’ access to iPhones … It does not give the government ‘the power to reach into anyone’s device without a warrant or court authorization …” The motion also went on to imply that Apple misled the public about the dangers of the All Writs Act, claiming that Apple had previously complied with the Act and that use of the law for such purposes was not unprecedented.

While Apple and the FBI clearly stand on opposite sides of the argument, the public’s opinions on whether the government is dangerously overstepping boundaries are mixed. Based on a March phone poll of over 1,000 individuals, CBS revealed that 50 percent of those polled thought that Apple should unlock the iPhone, and 45 percent thought it should refuse the order. Despite the varied results, eight in 10 respondents still believed that it was at least somewhat likely a decision to unlock the phone could set a legal precedent for mandates to unlock additional devices in the future. In other words, the belief that the government will continue to push privacy boundaries is widespread.

Luckily for the FBI, it is unlikely that the bureau will be forced to defend itself on a public stage. Nor will Apple get the chance to testify in court as a stalwart battling the government to protect its users’ privacy. What could have set the stage for a Hollywood movie has begun to devolve into a background narrative. After asking for a delay on its court date with Apple, the FBI fully retracted its demands. Instead of fighting the tech giant, it secured the services of professional hackers who were able to find and exploit flaws in the iPhone’s security system, allowing the government to unlock the phone without erasing its contents.

Not only has the dramatic storyline come to an abrupt halt, but the ball is back in the FBI’s court. Now that it possesses information about Apple’s security flaws, it has the opportunity to minimize accusations about unethical intentions to infiltrate additional devices. If the FBI chooses to provide Apple with details about its operating system’s failings, the bureau may quell some public suspicion, but it will also risk losing valuable information that could be utilized for future searches. The path it chooses will likely be determined by the White House in the coming weeks, but inevitably, uncertainties over its intentions will remain.

Paulina Haselhorst  was a writer and editor for  AnswersMedia and the director of content for Scholarships.com . She received her MA in history from Loyola University Chicago and a BA from the University of Illinois at Urbana-Champaign. You can contact Paulina at  [email protected] .

Leander Kahney

The FBI Wanted a Back Door to the iPhone. Tim Cook Said No

In 2016, Tim Cook fought the law—and won.

Late in the afternoon of Tuesday, February 16, 2016, Cook and several lieutenants gathered in the “junior boardroom” on the executive floor at One Infinite Loop, Apple’s old headquarters. The company had just received a writ from a US magistrate ordering it to make specialized software that would allow the FBI to unlock an iPhone used by Syed Farook, a suspect in the San Bernardino shooting in December 2015 that left 14 people dead.

The iPhone was locked with a four-digit passcode that the FBI had been unable to crack. The FBI wanted Apple to create a special version of iOS that would accept an unlimited combination of passwords electronically, until the right one was found. The new iOS could be side-loaded onto the iPhone, leaving the data intact.

But Apple had refused. Cook and his team were convinced that a new unlocked version of iOS would be very, very dangerous. It could be misused, leaked, or stolen, and once in the wild, it could never be retrieved. It could potentially undermine the security of hundreds of millions of Apple users.

In the boardroom, Cook and his team went through the writ line by line. They needed to decide what Apple’s legal position was going to be and figure out how long they had to respond. It was a stressful, high-stakes meeting. Apple was given no warning about the writ, even though Cook, Apple’s top lawyer, Bruce Sewell, and others had been actively speaking about the case to law enforcement for weeks.

The writ “was not a simple request for assistance in a criminal case,” explained Sewell. “It was a forty-two-page pleading by the government that started out with this litany of the horrible things that had been done in San Bernardino. And then this . . . somewhat biased litany of all the times that Apple had said no to what were portrayed as very reasonable requests. So this was what, in the law, we call a speaking complaint. It was meant to from day one tell a story . . . that would get the public against Apple.”

The team came to the conclusion that the judge’s order was a PR move—a very public arm twisting to pressure Apple into complying with the FBI’s demands—and that it could be serious trouble for the company. Apple “is a famous, incredibly powerful consumer brand and we are going to be standing up against the FBI and saying in effect, ‘No, we’re not going to give you the thing that you’re looking for to try to deal with this terrorist threat,’” said Sewell.

They knew that they had to respond immediately. The writ would dominate the next day’s news, and Apple had to have a response. “Tim knew that this was a massive decision on his part,” Sewell said. It was a big moment, “a bet-the-company kind of decision.” Cook and the team stayed up all night—a straight 16 hours—working on their response. Cook already knew his position—Apple would refuse—but he wanted to know all the angles: What was Apple’s legal position? What was its legal obligation? Was this the right response? How should it sound? How should it read? What was the right tone?

Cook was very concerned about the public’s reaction and knew that one outcome of his decision could be that Apple would be accused of siding with terrorists. What kind of company wouldn’t help the FBI in a terrorist investigation? From a public relations standpoint, Apple had always been on the side of privacy advocates and civil libertarians. This case unexpectedly made the company appear to be on the side of a terrorist. This was brand-new territory, and Cook had to figure out how to navigate it. He had to show the world that he was advocating for user privacy rather than supporting terrorism.

At 4:30 a.m., just in time for the morning news cycle on the East Coast, Cook published an open letter to Apple customers explaining why the company would be opposing the ruling, which “threatens the security of our customers.” He referenced the danger that could come from the government having too much power: “The implications of the government’s demands are chilling,” he wrote. “If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data.”

Apple had been working with the FBI to try to unlock the phone, providing data and making engineers available, Cook explained. “But now the US government has asked us for something we simply do not have, and something we consider too dangerous to create . . . a backdoor to the iPhone.” He continued, “In the wrong hands, this software—which does not exist today—would have the potential to unlock any iPhone in someone’s physical possession.” This could have potentially disastrous consequences, leaving users powerless to stop any unwanted invasion of privacy. “The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.”

Cook then accused the government of trying to force Apple “to hack our own users and undermine decades of security advancements that protect our customers . . . from sophisticated hackers and cybercriminals.” It would be a slippery slope from there. The government could then demand that Apple build surveillance software to intercept messages, access health records or financial data, or track users’ locations. Cook needed to draw a line. He believed the FBI’s intentions were good, but it was his responsibility to protect Apple users. “We can find no precedent for an American company being forced to expose its customers to a greater risk of attack,” he wrote. Though it was difficult for him to resist orders from the US government, and he knew he’d face backlash, he needed to take a stand.

The magistrate’s order thrust into the spotlight a long-running debate Apple had been having with the authorities about encryption. Apple and the government had been at odds for more than a year, since the debut of Apple’s encrypted operating system, iOS 8, in late 2014.

iOS 8 added much stronger encryption than had been seen before in smartphones. It encrypted all the user’s data—phone call records, messages, photos, contacts, and so on—with the user’s passcode. The encryption was so strong, not even Apple could break it. Security on earlier devices was much weaker, and there were various ways to break into them, but Apple could no longer access locked devices running iOS 8, even if law enforcement had a valid warrant. “Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data,” the company wrote on its website. “So it’s not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8.”
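The claim that "not even Apple could break it" follows from a design property rather than a policy promise: when the encryption key is derived on the device from the user's passcode and never stored, there is nothing held anywhere for Apple to surrender. A rough sketch of that idea using a generic key-derivation function (real iOS additionally entangles the passcode with a hardware-bound device key and tunes the iteration count, none of which is modeled here):

```python
import hashlib
import os


def derive_key(passcode: str, salt: bytes, iterations: int = 200_000) -> bytes:
    """Stretch a short passcode into a 256-bit encryption key.

    The key exists only while the passcode is being entered; neither the
    vendor nor the device stores it, so there is nothing to hand over.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations)


if __name__ == "__main__":
    salt = os.urandom(16)  # per-device random salt
    key = derive_key("1234", salt)
    assert derive_key("1234", salt) == key  # same passcode -> same key
    assert derive_key("1235", salt) != key  # any wrong guess -> wrong key
    print(len(key) * 8, "bit key derived from a 4-digit passcode")
```

Because every guess must re-run the expensive derivation, a high iteration count also slows brute force in software, complementing the attempt limits and delays enforced by the operating system.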

The update had repeatedly stymied investigators. At the New York press event two days after Cook’s letter on San Bernardino, the authorities said that they had been locked out of 175 iPhones in cases they were pursuing. For more than a year, law enforcement at the highest levels had been pressuring Apple for a solution. “When the FBI filed in San Bernardino, I think many people in the public perceived that as the beginning of something,” said Sewell. “Whereas in reality, it was a long point leading up to that, with a lot of activity that preceded the actual decision by [FBI director James] Comey to file.”

Sewell explained that he, Cook, and other members of Apple’s legal team had been meeting regularly with heads of the FBI, the Justice Department, and the attorney general in both Washington and Cupertino. Cook, Sewell, and others had met not only with James Comey, but also with Attorney General Eric Holder, Attorney General Loretta Lynch, FBI director Bob Mueller (Comey’s predecessor), and Deputy Attorney General Sally Yates.

Cook and Sewell met with Eric Holder and Jim Cole, then the deputy attorney general, in late 2014, and FBI agents told them they were “interested in getting access to phones on a mass basis.” This was way before the attack in San Bernardino, and Apple made it clear from the start that they were not going to grant the FBI access to hack into Apple users’ phones. Cook and Sewell told Holder and Cole that they “didn’t think that that was an appropriate request to be made of a company that has as its primary concern the protection of all citizens.” They had a similar conversation with Lynch and Yates.

Sewell said that during the discussions, it was clear that some law enforcement officials weren’t convinced by the broader social issues. Some were intellectually sympathetic to their position, but as officers of the law, they insisted they needed access to pursue cases. But Sewell said Cook stuck to his position that security and privacy was a cornerstone. Cook was adamant that any attempt to bypass security would be very dangerous. Once a backdoor had been created, it could easily be leaked, stolen, or abused.

But when the San Bernardino case came along, law enforcement saw it as an opportunity to force Apple’s hand. “There was a sense at the FBI level that this is the perfect storm,” said Sewell. “We now have a tragic situation. We have a phone. We have a dead assailant. This is the time that we’re going to push it. And that’s when the FBI decided to file [the writ ordering Apple to create a backdoor].”

As Cook and his team had predicted, the judge’s order ignited a firestorm in the media. The story dominated the news all week and would continue to be headline news for two months. Apple’s response drew strong condemnation from law enforcement, politicians, and pundits, like Democratic senator Dianne Feinstein of California, head of the US Senate Intelligence Committee, who called on Apple to help with the “terrorist attack in my state” and threatened legislation.

At a press conference in Manhattan, William Bratton, New York City police commissioner, also criticized Apple’s policy. He held up a phone involved in a separate investigation of the shooting of two police officers. “Despite having a court order, we cannot access this iPhone,” he told the assembled journalists. “Two of my officers were shot, [and] impeding that case going forward is our inability to get into this device.”

A few days later, Donald Trump, then a presidential candidate, called for a boycott against Apple at a campaign rally in Pawleys Island, South Carolina. Trump even accused Cook of being politically motivated: “Tim Cook is looking to do a big number, probably to show how liberal he is.” Trump was playing to his conservative audience, trying to make Cook seem like a liberal bad guy and using scare tactics to make it seem like Apple was siding with terrorists. He tweeted further attacks on Apple, calling again for a boycott until the company handed over the information to the FBI.

With so many politicians and officials against Apple, the American public lined up against it, too. A Pew survey found that 51 percent of people said Apple should unlock the iPhone to help the FBI, with only 38 percent supporting Cook’s position. But a few days later, another poll by Reuters/Ipsos came to a different conclusion. According to that poll, 46 percent agreed with Apple’s stance, 35 percent disagreed, and 20 percent didn’t know. The difference was attributed to the phrasing of the question: The Pew survey question gave less information about Apple’s position and appeared to be biased toward the FBI. An analysis of the emojis used in social media came to a similar mixed conclusion. By analyzing positive and negative emojis in people’s tweets (smiley faces, frowns, claps, thumbs up, and thumbs down), a marketing firm called Convince & Convert found a fairly even split between those who sided with Apple and those who supported the FBI. Though this approach was less than scientific, it was clear the public was divided. This experience was unprecedented, and many did not know what to think.

And ultimately, it wasn’t all bad. Cook’s stance also appeared to have some influence on public opinion. In hundreds of responses to Trump’s tweets, lots of citizens defended Apple’s actions. Trump’s tweets tended to bring out contrarian opinions, but most reactions tended toward defenses of Apple. One responder tweeted, “Boycotting Apple products is absurd. Break into one phone, none of us will have privacy. The govt can’t be trusted!!” Several high-profile figures also voiced support for Cook and Apple, including Facebook CEO Mark Zuckerberg, Google CEO Sundar Pichai, Twitter CEO Jack Dorsey, and Edward Snowden, the NSA whistleblower. The New York Times editorial board also weighed in on Apple’s side. In an editorial titled “Why Apple Is Right to Challenge an Order to Help the F.B.I.,” they wrote, “There’s a very good chance that such a law, intended to ease the job of law enforcement, would make private citizens, businesses and the government itself far less secure.” Cook and his team obviously agreed, and hunkered down to continue the fight.

For the next two months, the executive floor at One Infinite Loop turned into a 24/7 situation room, with staffers sending out messages and responding to journalists’ queries. One PR rep said that they were sometimes sending out multiple updates a day with up to 700 journalists cc’d on the emails. This was in stark contrast to Apple’s usual PR strategy, which consisted of occasional press releases and routinely ignoring reporters’ calls and emails.

Cook also felt he had to rally the troops, to keep morale high at a time when the company was under attack. In an email to Apple employees, titled “Thank you for your support,” he wrote, “This case is about much more than a single phone or a single investigation.” He continued, “At stake is the data security of hundreds of millions of law-abiding people and setting a dangerous precedent that threatens everyone’s civil liberties.” It worked. Apple employees trusted their leader to make the decision that was right not only for them but also for the general public.

Cook was very concerned about how Apple would be perceived throughout this media firestorm. He wanted very much to use it as an opportunity to educate the public about personal security, privacy, and encryption. “I think a lot of reporters saw a new version, a new face of Apple,” said the PR person, who asked to remain anonymous. “And it was Tim’s decision to act in this fashion. Very different from what we have done in the past. We were sometimes sending out emails to reporters three times a day on keeping them updated.”

Outside Apple’s walls, Cook went on a charm offensive. Eight days after publishing his privacy letter, he sat down for a prime-time interview with ABC News. Sitting in his office at One Infinite Loop, he earnestly explained Apple’s position. It was the “most important [interview] he’s given as Apple’s CEO,” said the Washington Post. “Cook responded to questions with a raw conviction that was even more emphatic than usual,” wrote the paper. “He used sharp and soaring language, calling the request the ‘software equivalent of cancer’ and talking about ‘fundamental’ civil liberties. He said he was prepared to take the fight all the way to the Supreme Court.” It was clear that Apple’s leader wouldn’t back down from his beliefs, even when things got tough.

The interview went well, and back at Apple’s HQ, staffers in the war room felt it was a pivotal point. They thought Cook did a great job not only explaining Apple’s point of view but also showing the world that he was a compassionate, ethical leader whom users could trust to maintain their privacy. “This is not a rapacious corporate executive who’s out to make a bunch of money,” said Sewell. “This is somebody who you could trust. Somebody who does what he says he’s going to do. And doesn’t do things that are malicious or that are ill-intentioned but tries to be fair, tries to be a good steward of the company and means what he says and does things that he believes in.”

Apple employees had known this side of Tim Cook for many years, but the public was getting a glimpse for the first time. This was a victory for Apple, since many members of the public did not initially approve of Apple’s decision to keep iPhone information away from the FBI. Apple won another victory at the end of February, when a court in New York rejected an FBI request to order Apple to open the phone of a minor drug dealer. Judge James Orenstein agreed with Apple’s position that the All Writs Act could not be used to order the company to open its products. “The implications of the government’s position are so far-reaching—both in terms of what it would allow today and what it implies about Congressional intent in 1789,” he said.

Although this particular case wasn’t binding on the court in San Bernardino, Sewell said it gave the company much-needed ammunition with the press. “For us it was very, very important,” he said. “It enabled us to then go back to the press and go back to people who had generally been detractors and say, ‘This isn’t about Apple commercialism. This isn’t about Apple being a bad actor. This is a principled position and the only judge in the country that’s looked at this agreed with us.’” Cook and Sewell felt confident that with Judge Orenstein on their side, others would soon be, too.

As the battle raged on, support from privacy advocates grew, but public opinion on Apple’s decision was still largely divided. An NBC survey of 1,000 Americans conducted in March 2016 found that 47 percent of respondents believed the company should not cooperate with the FBI, while 42 percent believed it should. Forty-four percent of respondents said they feared the government would go too far and violate the privacy of its citizens if Apple were to meet its demands.

The United Nations voiced its support for Apple, with special rapporteur David Kaye arguing that encryption is “fundamental to the exercise of freedom of opinion and expression in the digital age.” Kaye continued by stating that the FBI’s “order implicates the security, and thus the freedom of expression, of unknown but likely vast numbers of people, those who rely on secure communications.” But the FBI continued its PR offensive, with then director James Comey telling attendees at a Boston College conference on cybersecurity in March that “there is no place outside of judicial reach. . . . There is no such thing as absolute privacy in America.”

The lowest point for Apple was when Attorney General Loretta Lynch criticized the company during a keynote speech at the security-oriented RSA Conference in San Francisco. Lynch essentially accused Apple of defying the law and the courts. Her comments were widely reported and featured on the evening news. “Nothing could be further from the truth,” Sewell said. “For the attorney general to go on public television and say, ‘Apple is in breach of a court order and is therefore acting unlawfully,’ is inflammatory. . . . A lot of media picked up this as the attorney general saying that Apple is . . . disregarding a court order. But there was no court order.” The judge’s writ requested Apple’s help in the case; it did not compel the company to do so, a distinction that was lost—or ignored—by many critics. Apple wasn’t breaking any laws, and it was determined to fight for user privacy, despite lots of pressure from the government.

Six weeks after the judge filed the motion against Apple, on March 28, Sewell and the legal team flew down to San Bernardino to argue their case before the judge. Cook was preparing to fly down the next day to testify.

But that evening, the FBI backed down, asking the court to indefinitely suspend the proceedings against Apple. The FBI said it had successfully accessed the data stored on the phone, though it didn’t explain how. It was later widely reported that the FBI had gained access to Farook’s iPhone with the help of Israeli phone forensics company Cellebrite, but the company denied its involvement. The identities of the professional hackers who ultimately broke into the phone have yet to become public. At a Senate Judiciary hearing in May, Senator Dianne Feinstein revealed that it had cost the FBI $900,000. Officials had previously admitted that the FBI didn’t find any information they didn’t already have, and no evidence of contacts with ISIS or other supporters. The FBI had to drop the fight with Apple, Sewell explained, because its entire position was that it couldn’t access the iPhone without Apple’s help. When it turned out that it could in fact access the phone, the case collapsed.

Privacy advocates celebrated the end of the case and Apple’s apparent victory. “The FBI’s credibility just hit a new low,” said Evan Greer, campaign director for Fight for the Future, an activist group that promotes online privacy. “They repeatedly lied to the court and the public in pursuit of a dangerous precedent that would have made all of us less safe. Fortunately, internet users mobilized quickly and powerfully to educate the public about the dangers of backdoors, and together we forced the government to back down.”

But Cook was personally disappointed that the case didn’t come to trial. Even though Apple had “won” and wouldn’t be forced to create the backdoor, nothing had really been resolved. “Tim was a little disappointed that we didn’t get a resolution,” said Sewell. He “really felt it would have been fair and it would have been appropriate for us to have tested these theories in court. . . . [Though] the situation that was left at the end of that was not a bad one for us, he would have preferred to go ahead and try the case.” The issue remains unresolved to this day. It could be reawakened at any time, and under the Trump administration it may well be. It was just another skirmish in the war for privacy and security, and as technology evolves, the battle is likely to erupt again in the future.

From Tim Cook: The Genius Who Took Apple to the Next Level by Leander Kahney, to be published on April 16th by Portfolio, an imprint of Penguin Publishing Group, a division of Penguin Random House, LLC. © 2019

Correction appended, 4/17/2019, 5:00 pm EDT: This story has been updated to correct the spelling of Cellebrite and to clarify that the identity of the hackers who accessed Farook’s iPhone has not been publicly disclosed.


Apple vs. FBI: Here's everything you need to know (FAQ)


A polarizing legal debate that's engulfed the nation has almost everyone talking.

Should Apple be forced to help the FBI unlock a phone belonging to a terrorist? The arguments are simple enough, but the ramifications and precedent that they set could undermine trust at the foundations of Silicon Valley, one of the largest industries in the world.


US judge Sheri Pym ruled Tuesday that the iPhone and iPad maker must provide a tool that would allow federal agents to beat a security feature that erases the phone's data after a number of failed unlocking attempts, according to the AP.

The court ruling did not order Apple to break the encryption, but said it should offer "reasonable technical assistance" to law enforcement.

The iPhone 5c was a work phone used by Syed Farook, who along with his wife, Tashfeen Malik, murdered 14 people in San Bernardino, California in December 2015.

Federal agents don't know the passcode to the phone, and risk erasing all the data if they guess wrong too many times. But Apple doesn't have access to the passcode either. The company began locking itself out of the security chain to prevent law enforcement from demanding that it hand customer data over.

Apple's bid to shut itself out of the encryption loop was precisely to avoid the kind of ethical dilemma that would force it into handing over customer data to the authorities.

More than 94 percent of all iPhones and iPads run iOS 8 or later, and can therefore be encrypted.

Apple chief executive Tim Cook said in an open letter hours after the ruling that it "opposes" the order because it has "implications far beyond the legal case at hand."

Simply put: if Apple can be forced to hack one iPhone, where will it end?

The case is still developing. We've collated as many questions as we can, and will update over the next few hours. If you have a specific question, send an email or leave a comment below.

Here's what you need to know.

What is Apple specifically being asked to do?

Apple can't break the encryption on the iPhone (or its other products), so the FBI has instead asked the company to disable certain security features, which would help its agents to unlock the iPhone.

The FBI wants Apple to create a special version of the iPhone's software that works only on the recovered device. Apple would have to sign it with its secret keys in order to install it on the subject's iPhone. This custom version would "bypass or disable the auto-erase function" so that the phone is not wiped after a number of failed passcode guesses.

Apple must also modify the software on the subject's iPhone so that it will not "purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware." That hardware delay is currently about 80 milliseconds, which limits the FBI to roughly 12 passcode guesses each second. Farook reportedly used a four-digit passcode, says the BBC, which could take just minutes to crack. Finally, instead of forcing someone to type in passcodes manually, Apple must "enable the FBI to submit passcodes" to the subject's iPhone through an FBI device.
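The arithmetic behind those figures is easy to check. A back-of-the-envelope sketch in Python, assuming the reported ~80 ms per-guess hardware delay and no other limits (auto-erase disabled, no escalating software delays):

```python
# Back-of-the-envelope estimate of brute-forcing an iPhone passcode,
# assuming only the ~80 ms per-attempt hardware delay remains.

DELAY_PER_GUESS_S = 0.080                    # ~80 ms incurred by Apple hardware
GUESSES_PER_SECOND = 1 / DELAY_PER_GUESS_S   # ~12.5 attempts per second

four_digit_space = 10 ** 4   # 10,000 possible 4-digit passcodes
six_digit_space = 10 ** 6    # 1,000,000 possible 6-digit passcodes

worst_case_4 = four_digit_space * DELAY_PER_GUESS_S  # seconds to try them all
worst_case_6 = six_digit_space * DELAY_PER_GUESS_S

print(f"{GUESSES_PER_SECOND:.1f} guesses/second")
print(f"4-digit worst case: {worst_case_4 / 60:.1f} minutes")
print(f"6-digit worst case: {worst_case_6 / 3600:.1f} hours")
```

A four-digit space falls in minutes at this rate, which is why the auto-erase and delay features, not the encryption itself, are the real obstacles.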

The FBI will ship the iPhone to Apple, so that the company's proprietary code and secret keys never leave its campus.

What kind of iPhone is subject to this order?

Farook's phone was an iPhone 5c, running the latest version of the mobile software, iOS 9. The phone belonged to the county he worked for, San Bernardino Dept. of Public Health, which has given the government permission to search the phone.

The problem is that, because the phone is encrypted, the government can't.

What is the legal basis for the FBI's court order? What law was used?

Apple is essentially being forced to punch a hole in the security of its own product.

The judge invoked a little-known law dating back almost 230 years. The All Writs Act gives a court the "authority to issue [orders] that are not otherwise covered by statute," so long as the request is not impossible.


The government's position is that forcing Apple to reverse its encryption would be "substantially burdensome," but that asking it to remove the feature that erases the phone after ten failed passcode attempts is not.

The government's invocation of the All Writs Act could set, in Cook's words, a "dangerous precedent" down the line. That's because the government contends that "coding is not burdensome," according to Andrew Crocker, a staff attorney at the Electronic Frontier Foundation.

"The scope of authority under the [All Writs Act] is just very unclear as applied to the Apple case," said Orin Kerr, professor of law, in the Washington Post. "This case is like a crazy-hard law school exam hypothetical in which a professor gives students an unanswerable problem just to see how they do."

Kerr has written an in-depth analysis of the case.

Surely the NSA can crack the iPhone. Why hasn't it? Is there some ulterior motive behind this legal move?

Some believe that the National Security Agency (NSA) can probably crack the iPhone. The agency, embroiled in mass surveillance programs in recent years, has reportedly hacked into companies' networks to steal secret codes in order for its spies to get access to people's phone calls, messages, and even their smartphones.

What's stopping the NSA from stealing Apple's secret codes to help the FBI get access to the phone? It may have done so already -- some have hypothesized as much.

Apple said the FBI's demands would set a "dangerous precedent." That's the key: the argument is that the FBI could do this itself if it really wanted to, but the government is "desperate to establish" the legal case, said Christopher Soghoian, principal technologist at the American Civil Liberties Union.

The ramifications and precedent that they set could undermine trust at the foundations of Silicon Valley, hamper growth, and force foreign companies to look elsewhere.

Is Apple being asked to bypass or break the iPhone's encryption?

It comes down to semantics. Technically, no: at no point has there been any suggestion that Apple's encryption, or the crypto it uses, is in any way insecure.

The court order does not demand that Apple bypass the encryption, because Apple can't. Rather, Apple has been asked to remove a feature that limits how many passcode entries can be attempted. But the fact that the FBI could then forcibly enter as many passcodes as it wants could itself be considered a significant security flaw.

How does the iPhone's passcode-protected encryption work?

It's relatively simple: if you have a passcode on your iPhone running iOS 8 or later, the contents of your phone are scrambled. When you enter your four- or six-digit passcode, it immediately unlocks your phone.

The passcode is coupled with a key that's embedded in the phone's hardware called the "secure enclave." Because it's part of the actual hardware, it can't be modified.
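That coupling can be pictured as a slow key-derivation function mixing the passcode with a device-unique hardware secret, so the passcode alone is useless off the device. A minimal sketch -- the device key, names, and iteration count here are all hypothetical, and Apple's real derivation is considerably more involved:

```python
# Illustrative sketch of passcode "tangling" with a hardware key.
# DEVICE_UID stands in for a per-device secret fused into the silicon;
# the value, names, and PBKDF2 parameters are hypothetical.

import hashlib

DEVICE_UID = b"\x13" * 32  # hypothetical hardware-bound secret

def derive_unlock_key(passcode: str) -> bytes:
    # A slow KDF mixes the passcode with the device secret, so the
    # derived key can only be computed with access to this device.
    return hashlib.pbkdf2_hmac(
        "sha256", passcode.encode(), DEVICE_UID, iterations=100_000
    )
```

The point of the sketch: an attacker who learns the passcode but not the hardware secret (or vice versa) derives nothing useful, which is the "2-key" property behind the nuclear-launch analogy quoted below.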

Security researcher Dan Guido, who has been extensively cited on this case, explained this in a bit more detail on his blog:

"When you enter a passcode on your iOS device, this passcode is 'tangled' with a key embedded in the [secure enclave] to unlock the phone. Think of this like the 2-key system used to launch a nuclear weapon: the passcode alone gets you nowhere. Therefore, you must cooperate with the secure enclave to break the encryption. The secure enclave keeps its own counter of incorrect passcode attempts and gets slower and slower at responding with each failed attempt, all the way up to 1 hour between requests."

He said that even a customized version of iOS "cannot influence the behavior of the Secure Enclave," meaning any iPhone that has a secure enclave can't just be modified by Apple.
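Guido's description of the enclave's escalating delays amounts to a counter-keyed schedule that the OS cannot reset. An illustrative model in Python -- the delay values are a sketch based on Apple's published iOS security figures, not the enclave's actual firmware:

```python
# Illustrative model of the secure enclave's escalating passcode delays.
# The schedule mirrors figures Apple has published for iOS (no delay for
# the first few attempts, then escalating to a 1-hour cap); it is a
# sketch, not the enclave's firmware logic.

DELAY_SCHEDULE_S = {
    # failed_attempts: enforced delay before the next attempt (seconds)
    1: 0, 2: 0, 3: 0, 4: 0,
    5: 60,          # 1 minute
    6: 5 * 60,      # 5 minutes
    7: 15 * 60,     # 15 minutes
    8: 15 * 60,
}

def delay_after_failures(failed_attempts: int) -> int:
    """Delay the enclave enforces once `failed_attempts` guesses have failed."""
    if failed_attempts >= 9:
        return 60 * 60  # the 1-hour cap Guido mentions
    return DELAY_SCHEDULE_S.get(failed_attempts, 0)

# The key point: the enclave tracks this counter independently of iOS,
# so even a modified operating system cannot reset or bypass it.
```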

The FBI wants to unlock an iPhone 5c, which doesn't have a "secure enclave." Can Apple comply with this court order?

The FBI's requests are said to be "technically feasible" in this case, because Apple is able to modify the iPhone's software to remove the security features.

Guido noted on his blog:

"On the iPhone 5C, the passcode delay and device erasure are implemented in software and Apple can add support for peripheral devices that facilitate PIN code entry. In order to limit the risk of abuse, Apple can lock the customized version of iOS to only work on the specific recovered iPhone and perform all recovery on their own, without sharing the firmware image with the FBI."

Apple has not claimed that it lacks the technical means to comply.

What about other iPhones? Is it possible to unlock other, newer iPhones?

A senior Apple executive, speaking to the media on background (reporters were asked not to name executives or quote them directly), said Apple is fighting for all its iPhones, not just the terrorist's phone.

"The custom software tool the FBI has ordered it to develop in order to crack into a dead terrorist's iPhone 5c would be effective on every type of iPhone currently being sold," reports Motherboard, one of the news outlets on the call.

Apple executives said that the request was "unduly burdensome" -- its main argument against the order -- and that complying could take weeks or months.

It's worth noting that Apple can bypass the passcode on devices running software prior to iOS 8, with or without a court order.

If this sets a legal precedent, other companies could be forced to perform similar actions. Who else in the tech industry supports Apple?

At first, Silicon Valley was muted. It wasn't clear why; some may have worried about making themselves targets, or losing government contracts down the line.

Sundar Pichai, chief executive of Google, called in a series of tweets on Wednesday for "a thoughtful and open discussion on this important issue." Pichai stopped short of demanding an end to the FBI's offensive, but did say that hacking devices could set a "troubling precedent."

Some saw it as a voice of support, whereas others thought it was a weak statement.

Jan Koum, chief executive of WhatsApp, published a post on Facebook (which owns WhatsApp) in support of Apple's stance. "We must not allow this dangerous precedent to be set. Today our freedom and our liberty is at stake," he said.

Twitter boss Jack Dorsey said on Twitter that he supported Cook's decision, tweeting: "We stand with @tim_cook and Apple (and thank him for his leadership)!"

Firefox browser maker Mozilla also lent its support, as did billionaire investor Mark Cuban.

Other companies associated with the Reform Government Surveillance coalition, which includes Microsoft and Yahoo -- two firms also implicated in the PRISM surveillance program -- offered tepid support.

"RGS companies remain committed to providing law enforcement with the help it needs while protecting the security of their customers and their customers' information," the statement read.

Republican presidential frontrunner Donald Trump called for "common sense" to prevail and for Apple to work with the FBI. Trump said he "100 percent" agreed with the courts. "But to think that Apple won't allow us to get into her cell phone, who do they think they are? No, we have to open it up," he said.

No presidential candidate has yet spoken out in favor of Apple's move.

The FBI says it's not impossible, and the court has issued an order. So why is Apple refusing to comply with the court order?

Cook said in an open letter published on Apple's website that the court's demands "would undeniably create a backdoor" for the FBI.

Apple argues that introducing a backdoor into the iPhone wouldn't just make Farook's phone insecure; it would make every iPhone weaker. As pointed out by The Guardian, the argument that Apple is somehow "helping" the terrorists isn't fair. Because encryption (like other technologies) is inherently agnostic, Apple cannot pick and choose whom it protects. It either mandates privacy for everyone, or for no one.

Cook said the FBI had "asked us for something we simply do not have, and something we consider too dangerous to create." It would open a Pandora's box of security problems.

Why did Apple begin to roll out passcode-protected encryption in the first place?

Some argue it was the US government itself that sparked Apple to begin encrypting its devices in the first place.

The move to add encryption was in part a response to accusations that the company was complicit in the PRISM surveillance program leaked by whistleblower Edward Snowden -- a claim the company strenuously denies. Apple set itself apart from the rest of the crowd by bolstering its encryption in such a way that even Apple cannot decrypt the data.

Cook said in an interview with PBS' Charlie Rose at the time that if the government laid a warrant at its door, "We don't have a key. The door is closed."

Apple switched on this encryption with the release of iOS 8 in September 2014, likely to preempt any government pushback.

Edward Snowden said in a tweet following the court ruling that the FBI was "creating a world where citizens rely on Apple to defend their rights, rather than the other way around".

What's stopping other countries and repressive regimes, like Russia and China, from making similar demands?

The US won't be the only country wanting this power. If the US can have it, why can't Russia, or China, or any other major global powerhouse? Because Apple is headquartered in the US, it has to abide by US law. But it also has to adhere to the laws of every country in which it operates. That can get tricky very quickly.

Sen. Ron Wyden (D-OR), a member of the Senate Intelligence Committee and staunch privacy advocate, said the move could easily "snowball" around the world. "Why in the world would our government want to give repressive regimes in Russia and China a blueprint for forcing American companies to create a backdoor?" he added.

China could impose rules forcing Apple to hand over encryption keys -- or whatever backdoor technology the US has demanded -- or it could stop the company from operating in China. That could be a massive blow to the company, which derives a substantial share of its global revenue from mainland China.

Apple told reporters that "no other country in the world has asked them to do what DOJ seeks."

But it's not just oppressive nations. The UK has a draft surveillance bill before its parliament which, if it passes, could demand the same "secret backdoors" that the FBI sought. (Vice's Motherboard has more on this.)

Can I read the court order and the DOJ's 40-page request for myself?

Sister-site CNET posted the two documents: the three-page court order and the Justice Dept.'s 40-page request from February 16.

Can Apple appeal this case?

Apple has until February 26 to respond to the court order, and a hearing is expected on March 22, according to Reuters. If Apple challenges the order, as expected, it will appeal to the Ninth Circuit appeals court.

It's possible this case may go all the way to the Supreme Court, but only if the government "loses big" at the appeals court, EFF staff attorney Nate Cardozo said in a tweet.


Apple's Plan to "Think Different" About Encryption Opens a Backdoor to Your Private Life


Apple has announced impending changes to its operating systems that include new “protections for children” features in iCloud and iMessage. If you’ve spent any time following the Crypto Wars, you know what this means: Apple is planning to build a backdoor into its data storage system and its messaging system.

Child exploitation is a serious problem, and Apple isn't the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy. Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.


To say that we are disappointed by Apple’s plans is an understatement. Apple has historically been a champion of end-to-end encryption, for all of the same reasons that EFF has articulated time and time again. Apple’s compromise on end-to-end encryption may appease government agencies in the U.S. and abroad, but it is a shocking about-face for users who have relied on the company’s leadership in privacy and security.

There are two main features that the company is planning to install in every Apple device. The first will scan all photos as they get uploaded into iCloud Photos to see if they match a photo in the database of known child sexual abuse material (CSAM) maintained by the National Center for Missing & Exploited Children (NCMEC). The second scans all iMessage images sent or received by child accounts—that is, accounts designated as owned by a minor—for sexually explicit material and, if the child is young enough, notifies the parent when these images are sent or received. Parents can turn this second feature on or off.

When Apple releases these “client-side scanning” functionalities, users of iCloud Photos, child users of iMessage, and anyone who talks to a minor through iMessage will have to carefully consider their privacy and security priorities in light of the changes, and possibly be unable to safely use what was, until this development, one of the preeminent encrypted messengers.

Apple Is Opening the Door to Broader Abuses

We’ve said it before, and we’ll say it again now: it’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.


All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change. Take the example of India, where recently passed rules include dangerous requirements for platforms to identify the origins of messages and pre-screen content. New laws in Ethiopia requiring content takedowns of “misinformation” in 24 hours may apply to messaging services. And many other countries—often those with authoritarian governments—have passed similar laws. Apple’s changes would enable such screening, takedown, and reporting in its end-to-end messaging. The abuse cases are easy to imagine: governments that outlaw homosexuality might require the classifier to be trained to restrict apparent LGBTQ+ content, or an authoritarian regime might demand the classifier be able to spot popular satirical images or protest flyers.

We’ve already seen this mission creep in action. One of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of “terrorist” content that companies can contribute to and access for the purpose of banning such content. The database, managed by the Global Internet Forum to Counter Terrorism (GIFCT), is troublingly without external oversight, despite calls from civil society. While it’s therefore impossible to know whether the database has overreached, we do know that platforms regularly flag critical content as “terrorism,” including documentation of violence and repression, counterspeech, art, and satire.

Image Scanning on iCloud Photos: A Decrease in Privacy

Apple’s plan for scanning photos that get uploaded into iCloud Photos is similar in some ways to Microsoft’s PhotoDNA. The main product difference is that Apple’s scanning will happen on-device. The (unauditable) database of processed CSAM images will be distributed in the operating system (OS), the processed images transformed so that users cannot see what the image is, and matching done on those transformed images using private set intersection, so the device will not know whether a match has been found. This means that when the features are rolled out, a version of the NCMEC CSAM database will be uploaded onto every single iPhone. The result of the matching will be sent up to Apple, but Apple can only tell that matches were found once a sufficient number of photos have matched a preset threshold.

Once a certain number of photos are detected, the photos in question will be sent to human reviewers within Apple, who determine whether the photos are in fact part of the CSAM database. If confirmed by the human reviewer, those photos will be sent to NCMEC, and the user’s account disabled. Again, the bottom line here is that whatever privacy and security aspects are in the technical details, all photos uploaded to iCloud will be scanned.
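Stripped of the cryptography (the real design uses perceptual hashes, a blinded on-device database, and private set intersection, so neither side learns about sub-threshold matches), the reporting logic described above reduces to a match count gated by a threshold. A deliberately simplified sketch; the threshold value and all names here are hypothetical:

```python
# Highly simplified model of threshold-gated photo matching. Plain
# SHA-256 stands in for Apple's perceptual hashing so the control flow
# is visible; all names and the threshold value are hypothetical.

import hashlib

MATCH_THRESHOLD = 30  # hypothetical; Apple said only "a preset threshold"

def photo_hash(photo_bytes: bytes) -> str:
    # Stand-in for a perceptual hash of the image contents.
    return hashlib.sha256(photo_bytes).hexdigest()

def matches_for_upload(photos: list[bytes], known_hashes: set[str]) -> int:
    # Count how many uploaded photos match the known database.
    return sum(1 for p in photos if photo_hash(p) in known_hashes)

def should_escalate_to_review(photos: list[bytes], known_hashes: set[str]) -> bool:
    # Below the threshold, the server learns nothing in the real scheme;
    # at or above it, the matching photos go to human review.
    return matches_for_upload(photos, known_hashes) >= MATCH_THRESHOLD
```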

Make no mistake: this is a decrease in privacy for all iCloud Photos users, not an improvement.

Currently, although Apple holds the keys to view Photos stored in iCloud Photos, it does not scan these images. Civil liberties organizations have asked the company to remove its ability to do so. But Apple is choosing the opposite approach and giving itself more knowledge of users’ content.

Machine Learning and Parental Notifications in iMessage: A Shift Away From Strong Encryption

Apple’s second main new feature is two kinds of notifications based on scanning photos sent or received by iMessage. To implement these notifications, Apple will be rolling out an on-device machine learning classifier designed to detect “sexually explicit images.” According to Apple, these features will be limited (at launch) to U.S. users under 18 who have been enrolled in a Family Account. In these new processes, if an account held by a child under 13 wishes to send an image that the on-device machine learning classifier determines is a sexually explicit image, a notification will pop up, telling the under-13 child that their parent will be notified of this content. If the under-13 child still chooses to send the content, they have to accept that the “parent” will be notified, and the image will be irrevocably saved to the parental controls section of their phone for the parent to view later. For users between the ages of 13 and 17, a similar warning notification will pop up, though without the parental notification.

Similarly, if the under-13 child receives an image that iMessage deems to be “sexually explicit”, before being allowed to view the photo, a notification will pop up that tells the under-13 child that their parent will be notified that they are receiving a sexually explicit image. Again, if the under-13 user accepts the image, the parent is notified and the image is saved to the phone. Users between 13 and 17 years old will similarly receive a warning notification, but a notification about this action will not be sent to their parent’s device.

This means that if, for instance, a minor using an iPhone without these features turned on sends a photo to another minor who does have the features enabled, the sender does not receive a notification that iMessage considers their image to be “explicit” or that the recipient’s parent will be notified. The recipient’s parents will be informed of the content without the sender consenting to their involvement. Additionally, once sent or received, the “sexually explicit image” cannot be deleted from the under-13 user’s device.

Whether sending or receiving such content, the under-13 user has the option to decline without the parent being notified. Nevertheless, these notifications give the sense that Apple is watching over the user’s shoulder—and in the case of under-13s, that’s essentially what Apple has given parents the ability to do.
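Boiled down, the rules described above amount to a small decision function over three inputs: the user’s age bracket, the classifier’s verdict, and whether the user proceeds past the warning. The sketch below is a restatement of this article’s description for illustration only; the function name, coded age cutoffs, and return strings are invented, not Apple’s API, and the same rules apply whether the photo is being sent or received.

```python
# Hypothetical restatement of the iMessage notification rules described
# above; applies to both sending and receiving. Not Apple's code.

def photo_policy(age, classifier_flags_explicit, user_proceeds):
    """Return the outcome for one scanned iMessage photo."""
    if not classifier_flags_explicit or age >= 18:
        # Unflagged photos, and accounts outside the feature, pass through.
        return "delivered normally"
    if not user_proceeds:
        # Declining never notifies the parent, at any age.
        return "blocked; parent not notified"
    if age < 13:
        return "delivered; parent notified; image saved to parental controls"
    # Ages 13-17 get the warning but no parental notification.
    return "delivered after warning; parent not notified"
```

Writing the rules out this way makes the asymmetry visible: the consequential branch (parental notification plus irrevocable saving) is gated only on the under-13 account holder tapping through a dialog.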


It is also important to note that Apple has chosen to use the notoriously difficult-to-audit technology of machine learning classifiers to determine what constitutes a sexually explicit image. We know from years of documentation and research that machine-learning technologies, used without human oversight, have a habit of wrongfully classifying content, including supposedly “sexually explicit” content. When blogging platform Tumblr instituted a filter for sexual content in 2018, it famously caught all sorts of other imagery in the net, including pictures of Pomeranian puppies, selfies of fully-clothed individuals, and more. Facebook’s attempts to police nudity have resulted in the removal of pictures of famous statues such as Copenhagen’s Little Mermaid. These filters have a history of chilling expression, and there’s plenty of reason to believe that Apple’s will do the same.

Since the detection of a “sexually explicit image” will use on-device machine learning to scan the contents of messages, Apple will no longer be able to honestly call iMessage “end-to-end encrypted.” Apple and its proponents may argue that scanning before or after a message is encrypted or decrypted keeps the “end-to-end” promise intact, but that would be semantic maneuvering to cover up a tectonic shift in the company’s stance toward strong encryption.

Whatever Apple Calls It, It’s No Longer Secure Messaging

As a reminder, a secure messaging system is a system where no one but the user and their intended recipients can read the messages or otherwise analyze their contents to infer what they are talking about. Despite messages passing through a server, an end-to-end encrypted message will not allow the server to know the contents of a message. When that same server has a channel for revealing information about the contents of a significant portion of messages, that’s not end-to-end encryption. In this case, while Apple will never see the images sent or received by the user, it has still created the classifier that scans the images that would provide the notifications to the parent. Therefore, it would now be possible for Apple to add new training data to the classifier sent to users’ devices or send notifications to a wider audience, easily censoring and chilling speech.
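That definition can be made concrete with a toy relay: only the endpoints hold the key, so the server can forward ciphertext but cannot read it or run a classifier over its contents. The sketch below is a stdlib-only illustration using a one-time-pad XOR; real messengers use authenticated protocols such as the Signal protocol, and all names here are invented.

```python
import secrets

def xor_cipher(key, data):
    """One-time-pad XOR; encryption and decryption are the same operation."""
    return bytes(k ^ b for k, b in zip(key, data))

def relay_server(ciphertext):
    """An end-to-end encrypted relay: forwards bytes it cannot interpret."""
    return ciphertext  # no key, no plaintext, no classifier output

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # shared only by the two endpoints

wire = relay_server(xor_cipher(key, message))  # what the server sees
assert wire != message                         # ciphertext, not plaintext
assert xor_cipher(key, wire) == message        # recipient recovers it
```

The contrast with the scanning design is that there, code chosen by the server operator runs over the plaintext on the endpoint and opens a channel reporting on it, which is exactly what this relay cannot do.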

But even without such expansions, this system will give parents who do not have the best interests of their children in mind one more way to monitor and control them, limiting the internet’s potential for expanding the world of those whose lives would otherwise be restricted. And because family sharing plans may be organized by abusive partners, it's not a stretch to imagine using this feature as a form of stalkerware.

People have the right to communicate privately without backdoors or censorship, including when those people are minors. Apple should make the right decision: keep these backdoors off of users’ devices.


Read further on this topic: 

  • If You Build It, They Will Come: Apple Has Opened the Backdoor to Increased Surveillance and Censorship Around the World 
  • How LGBTQ Content is Censored Under the Guise of "Sexually Explicit"
  • EFF Joins Global Coalition Asking Apple CEO Tim Cook to Stop Phone-Scanning
  • Apple’s Plan to Scan Photos in Messages Turns Young People Into Privacy Pawns
  • 25,000 EFF Supporters Have Told Apple Not To Scan Their Phones
  • Delays Aren't Good Enough—Apple Must Abandon Its Surveillance Plans 
  • Don’t Stop Now: Join EFF, Fight for the Future at Apple Protests Nationwide
  • Protestors Nationwide Rally to Tell Apple: "Don't Break Your Promise!"
  • Why EFF Flew a Plane Over Apple's Headquarters



Who’s Right In Apple’s Fight with the FBI?

A U.S. magistrate judge has ordered Apple to help the FBI break into an iPhone used by one of the gunmen in the mass shooting in San Bernardino, Calif.


The legal standoff playing out between Apple and the FBI has reignited the debate over privacy and national security.

If you haven’t followed the fight, here are the highlights: In December, the FBI seized the iPhone of one of the two suspects behind the shooting in San Bernardino, Calif., an attack that left 14 people dead. However, encryption technology is blocking the government from accessing the phone’s contents. A federal magistrate judge has ordered Apple to write a custom version of its software that would help investigators unlock the phone, but CEO Tim Cook is balking.

“We feel we must speak up in the face of what we see as an overreach by the U.S. government,” wrote Cook in an open letter to customers. “Ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.”

The stance has its roots in the backlash faced by Silicon Valley in the wake of the Snowden revelations. But could valuable intelligence be locked on that phone? One of the shooters, after all, had previously expressed support for ISIS.

For more on the case’s implications for privacy and national security, we invited two experts to join us for a debate. James Andrew Lewis, director of the strategic technologies program at the Center for Strategic and International Studies, believes Apple should comply with the order. Nate Cardozo, a staff attorney for the Electronic Frontier Foundation, disagrees. Here’s what they had to say:

Dear James,

Earlier this week, a U.S. federal magistrate judge issued an unprecedented order to Apple, wholly adopting the FBI’s position without engaging in any independent legal analysis. And that’s a shame, because the FBI didn’t apply for this order as an end in itself; they did it to create precedent.

In isolation, the order might seem reasonable. The FBI says it needs access to a suspect’s phone, and it appears that Apple is capable of creating a custom software package, a literal master key, that would let anyone break into an iPhone they had in their possession. The government has assured the court that it only wants into that one iPhone, and that Apple’s master key would be programmed to unlock just that one device. “They are simply asking for something that would have an impact on this one device,” echoed White House Press Secretary Josh Earnest.

But that explanation is hogwash. It’s not about this one iPhone, this one attack, or this one investigation. The FBI is asking the court to create a rule going forward that would permit it to obtain orders in future cases requiring companies to create backdoors in anything the FBI feels it needs. This is not a power that the FBI has ever had in the physical world; to the best of our knowledge, no court ever ordered Brinks to make a master key to every safe.

The government chose to have this fight on this particular case, with these particular facts, very carefully. The crime at issue was absolutely heinous and the nation’s sympathies are rightly with the victims and their families. The phone at issue is the suspect’s work phone, owned by the county which has, of course, consented to the FBI’s plan. The FBI was confident that Apple would be technically able to provide it with the master key.

But if the legal theory the government is using here has the power to force Apple to subvert its security infrastructure in this case, there is no obvious limiting principle. Once that master key is ordered created, we can be certain that our government will ask for it again and again, for other phones, from other companies. Nothing would prevent the FBI from turning this power — potentially in secret — against any software or device that has the audacity to offer strong security. What’s worse, once the FBI has the authority to force American companies to subvert the security of their own products, those companies will be unable to resist demands from other governments. Apple has successfully resisted Chinese and Russian demands for a backdoor only because they’re able to argue that it wouldn’t do so for any government. If the FBI wins here, we’ll all lose.

No American court has ever ordered anything like this before. This is not a door we want opened.

This isn’t really a discussion about encryption. It’s a discussion about whether people trust their government. If you don’t trust the government, and many don’t, then you’ll like Apple’s narrow-minded refusal to help.

There is growing distrust of governments around the world, not just in the United States. In the U.S., that distrust is accelerated by a pervasive and worrisome narrative about corrupt officials and agencies run amok, as if “House of Cards” were real and not some screenwriters’ fantasy.

But there’s a fundamental flaw in the logic of those with such deep distrust of government. Making government weaker will not make them safer or freer. In fact, countries with weak governments are places where the average citizen has fewer rights and less safety.

If Apple had been smart, it would have quietly cooperated with the FBI’s request. But Apple has a problem, and that problem is distrust in the global market for American products brought on by the Snowden revelations. Apple must show its global customer base that American agencies do not have easy access to their data, and the company’s position over the San Bernardino case is intended mainly to reassure those markets.

Frankly, I don’t think this will work. Besides, it is not just the FBI that wants in. China, France, India, Russia and the U.K. all want recoverable encryption and are putting in place laws that require it. What happens in the U.S. may ultimately be irrelevant. The tide is turning against Apple.

There is a way to provide strong encryption that is “law enforcement friendly” that doesn’t involve any back door, but the encryption debate has been too trivial to get there yet.

In this case, the San Bernardino Department of Public Health (it’s a work phone, and the county owns it) has provided consent to the FBI for a search and has asked Apple to unlock the device. Apple has the technical means to gain access, and the court order requests a technical solution for this specific device. The best solution would have been for Apple to help out and take the credit. That opportunity is gone. Apple should comply with the warrant. It won’t hurt their foreign market and it might make the rest of us a little safer.

James —

First, a correction on the technology involved here: there is no such thing as “law enforcement friendly” strong encryption. It just doesn’t exist. There is no way to keep our data secure from identity thieves, criminals, spies, corporate espionage, stalkers, or any of the myriad bad actors who want to get into our devices, while at the same time giving American law enforcement the access they’re demanding.

It used to be that when an iPhone was stolen, any run-of-the-mill criminal could break into the phone and read the owner’s secrets. Apple responded to that threat by hardening the devices it sells. It would be great if it were possible for Apple to keep us secure from the bad guys, while at the same time letting the good guys in when they need access. But there simply is no compromise position.

Don’t take my word for it. Last summer, an all-star group of experts published a paper titled “Keys Under Doormats” that utterly refuted the possibility that a so-called “exceptional access” regime, such as the one you’ve endorsed, could keep us secure. The group, which included the inventors of modern cryptography, computer scientists, mathematicians, and engineers from MIT, Columbia, Microsoft, Penn, Johns Hopkins, SRI, Worcester Polytechnic and Harvard, concluded that what the FBI is asking for is “unworkable in practice … and would undo progress on security at a time when Internet vulnerabilities are causing extreme economic harm.”

This “debate” over strong encryption is eerily similar to the “debate” over climate change. On one side, there are entrenched political elements, dogmatically advocating for their position. And on the other is a literally unanimous chorus of scientists, telling politicians that they’re wrong. And the position of scientists who’ve looked at the FBI’s proposal is unanimous. So far as I know, there is not a single cryptographer, mathematician or computer scientist who has published anything contesting the conclusion of the “Keys Under Doormats” paper. It is simply impossible to do what you’re asking Apple to do without endangering us all.

But as you say, we live in a world where repressive governments like China and Russia are salivating at the prospect of “recoverable encryption.” But is that fact any reason at all for our government to force an American company to give those governments the tools of repression? I’m of the firm opinion that we, as a society, should not stoop to that level.

I fully agree with you — it would be great if Apple could comply with the FBI’s request without endangering ordinary Americans. But that’s a fantasy.

Nate —

Let’s get a little context. This isn’t about privacy. You don’t have any privacy. There are more than a dozen tracking programs on any website you visit. Companies take your data and commercialize it. This is why companies want you to log in before using a service or buying from their app store — so they can associate your actions with a profile they’ve collected about you and will sell. You have as much privacy as a goldfish in a bowl. It’s fair to ask what the privacy watchdogs were protecting while all this happened. Talk about the dogs that didn’t bark. Big companies, big intelligence agencies and the occasional random hacker group all have access to personal data — it’s for sale in online black markets.

Apple is doing this to protect its foreign markets, but refusing a court order will only slow the damage. Most countries use communications surveillance for domestic security (and to spy on tourists), and most citizens of foreign countries don’t object to surveillance by their own governments. They object to surveillance by the American government and by giant American Internet companies, including Apple. When Angela Merkel said she didn’t want to be a “data colony” of the United States, she wasn’t talking about the FBI.

This would have happened even without Snowden. The rest of the world wants an Internet that meets their preferences, not those of Silicon Valley. You’ve seen a whole string of actions — the European Court of Justice’s decision to cancel the 2000 Safe Harbor agreement, the “right to be forgotten” requirements imposed on Google, laws requiring data localization, all in reaction to the privacy pillaging Internet business model. Most countries have — or are moving to — a requirement that encryption be recoverable when a court order is served. It will be interesting to see how Apple responds when they get a similar request from the Chinese government.

There is real risk. There has been a major terrorist incident attempted against the U.S. every year since 2001. Apple isn’t protecting us from these things, nor does ISIS care about your privacy. My guess is that life will get harder for American tech companies if they refuse to comply.

A final note. This is a law enforcement problem. The FBI needs a clean chain of custody so that evidence can be used in court. It’s not an intelligence problem. Yes, this kind of encryption makes the work of intelligence agencies like Russia’s FSB or the NSA harder and more expensive, but not impossible. If a spy agency wants in, they will get in. The tricks an intelligence agency uses to break into your phone are not the tricks that produce court-worthy evidence, however.

There is a way to let people use strong encryption that can only be accessed with permission, either from the owner or from a court — products like those used by Google. Their encryption is very difficult to crack, but Google can recover the plain text — they need it for advertising and data analytics. This recoverable encryption is what companies use and what most people want.

The Apple case is the third time I’ve seen this movie. In the early 1990s, there was a fight to make phone companies build surveillance into their switches. The result was the Communications Assistance for Law Enforcement Act. In the late 1990s, in the crypto wars (I was deeply involved), the U.S. decided that it was better for Americans to have access to strong encryption to protect themselves online. That’s still the right decision, but it is no longer 1999, and the delusion that war had ended and every country would be a democracy doesn’t describe the world we live in today. Every time technology changes, the law has to change with it. The Internet is changing, the danger to public safety has changed, and encryption policy needs to change with it.

Privacy nihilism is seductive, but deeply misguided. Privacy is not dead, and only those who wish to kill it claim otherwise.

As you well know, privacy advocates, including me and my colleagues at the Electronic Frontier Foundation, are fighting just as hard against corporate data collection as we are against illegal government surveillance. We developed a browser plug-in called Privacy Badger that blocks trackers based on observed behavior, so we don’t need to rely on a hard-to-maintain blacklist. In December of last year, I filed a formal complaint against Google with the Federal Trade Commission, detailing how they’re misleading the public by illegally tracking our students’ classroom behavior despite repeated claims to the contrary. And those are just two examples of how we’re fighting back against the surveillance of ordinary people worldwide. Finally, to point out the obvious: while Google and Facebook are insidious trackers of our online behavior, they can’t throw you in jail. They’re not even legally permitted to turn over the content of your communications to anyone without a warrant based on probable cause.

Privacy is a prerequisite to democracy. That you have nothing to hide is irrelevant; it’s not about you, or about me. Just as freedom of speech benefits those with nothing to say, privacy benefits those with nothing to hide. Without privacy, social change is impossible. The civil rights movement, the LGBT rights movement, and essentially every other agent of progress depended just as much on privacy as on the freedoms of speech and assembly.

Our nation was founded on the premise of limited government. Before the Revolution, agents of the Crown used so-called general warrants as authority to conduct untargeted sweeps for the terrorists of their day, the men who would become our Founding Fathers. As a nation, we agreed that would never happen again and the Fourth Amendment was designed to limit the power of law enforcement.

What the FBI is asking for in this case isn’t quite analogous to a general warrant, but it’s one small step removed. Privacy is not dead, but if the FBI’s legal argument wins the day in San Bernardino, the government will gain a vast new power to compel companies to deliberately weaken the tools that ordinary, law-abiding citizens use.

You point out that it will be interesting to see how Apple responds when they get a similar request from the Chinese government. Indeed it will. But that’s the point. We’re not China and the FBI needs to stop trying to build a police state. You say you still support strong encryption, but you’re advocating in favor of a legal regime that’s trying desperately to ban it.

The Framers of the Constitution and the Bill of Rights were big fans of encryption (Jefferson himself invented a number of strong cyphers), of anonymous speech (think, the Federalist Papers), and of course liberty. Ordering Apple to create a master key would be a betrayal of the values our nation was founded on.

To recap: Apple has been ordered by a court to help the FBI gain access to content on a phone used by jihadists who carried out a mass shooting. The owner of the phone (it’s a work phone) has given permission. Apple may already have the ability to do this. The request would apply only to this phone, not to all Apple products, since the technology requires physical access (i.e., you have to possess the phone). After refusing initial FBI requests for assistance, Apple was served a court order and has refused to comply with it.

Apple’s actions occur in a period of heightened threats of jihadist actions against U.S. citizens and the citizens of other nations, and at a time of widespread global outrage over NSA surveillance and the lax privacy practices of leading technology companies — most of whom are American. Apple is trying to distance itself from these concerns by taking a stance against the FBI.

This is not a good story. Let’s not pretend that there is something noble about this refusal. The motives are commercial.

The FBI can be histrionic in its efforts to sway public opinion on encryption, but in this case, the government has been measured in its actions. Privacy advocates, who have objected to every move to accommodate technology to law enforcement’s needs for the last 30 years, are displeased with the FBI’s requests.

The current tendency in American politics is to go to extremes and to make up facts (like encryption “backdoors”). The encryption debate requires balance and objectivity, however. We need to balance concerns over privacy with concerns over public safety — neither should predominate. We need a factual basis for decisions on this balance — and that includes understanding what other countries want to do on encryption, how the technology actually works, and how little privacy people now have online.

We have three questions to think about: How do we resuscitate privacy in this country without stifling innovation or security? How do we keep Americans safe when any plan that doesn’t involve magical thinking will require lawful access to communications (with congressional and court oversight)? How do we build international agreement on data flows and lawful access when there is so much distrust, warranted or not, of both American agencies and companies? None of these are easy, and the Apple case hasn’t helped move us towards a serious solution.

I understand that Apple is worried about slowing growth, but this case should not have been a problem. Saying yes to the FBI would not create risk to privacy and might reduce risk to citizens. The same document that says Americans are protected from “unreasonable” searches also makes clear that it is Congress and the courts that decide what is reasonable. Apple has received a reasonable request from a court for assistance. It should comply.

Jason M. Breslow, Former Digital Editor


eDiscovery Daily Blog

Here’s Why Whether Apple Provides a Backdoor to iPhones May Not Matter: Data Privacy Trends

January 20, 2020

Last week, we covered the government’s latest attempt (and Apple’s resistance) to get Apple to assist in unlocking the iPhones of a mass shooter – this time, with regard to password-protected iPhones used by Mohammed Saeed Alshamrani, who is suspected of killing three people last month in a shooting at a Navy base in Pensacola, Florida. Ultimately, however, it may not matter whether Apple helps the government or not.

According to Business Insider (The Justice Department is demanding that Apple make it easier to unlock suspects’ iPhones, but experts say it can do that without Apple’s cooperation. Here’s how., written by Aaron Holmes), cybersecurity experts say new technologies have made it even easier for investigators to crack locked iPhones, even without help from Apple.

During a press conference on Monday, Attorney General William Barr said that Apple had not helped the FBI crack into the password-protected iPhones used by Alshamrani.

“We have asked Apple for their help in unlocking the shooter’s iPhones. So far Apple has not given us any substantive assistance,” Barr said, next to a poster with a picture of the iPhones. “This situation perfectly illustrates why it is critical that investigators be able to get access to digital evidence once they have obtained a court order based on probable cause.”

For its part, Apple disputed Barr’s assessment that it has failed to provide law enforcement with “substantive assistance” in unlocking the password-protected iPhones used by the shooting suspect at a Navy base in Pensacola, Florida, last month, but still refused his main request to provide a backdoor. Apple stated it “produced a wide variety of information associated with the investigation” after the FBI’s initial request on Dec. 6. The company said it provided “gigabytes of information” including “iCloud backups, account information and transactional data for multiple accounts” in response to further requests that month.

“We have always maintained there is no such thing as a backdoor just for the good guys,” Apple said in a statement. “Backdoors can also be exploited by those who threaten our national security and the data security of our customers. Today, law enforcement has access to more data than ever before in history, so Americans do not have to choose between weakening encryption and solving investigations. We feel strongly encryption is vital to protecting our country and our users’ data.”

In an interview with Business Insider, Chris Howell, CTO of Wickr, said he understood why Apple wouldn’t intentionally build a backdoor into the iPhone as the FBI has requested.

“As a technologist I can tell you that there is no security mechanism that can discriminate between a hacker trying to crack it and a law enforcement officer trying to do the same thing. Either we secure it or we don’t, it’s that simple.”

However, according to The Wall Street Journal, the cybersecurity company Grayshift sells an iPhone hacking device for $15,000, and Israel’s Cellebrite sells a similar device. Tech companies are constantly trying to develop more secure devices and platforms to win customers’ trust, and are therefore reticent to build backdoors that would easily crack encrypted services. Similarly, companies like Grayshift and Cellebrite are constantly honing methods of cracking devices, which are kept secret.

The iPhone was long seen as uncrackable, but recent advances have changed that — one county in Georgia that purchased a Grayshift device was able to crack 300 phones in one year, The Wall Street Journal reported.

One commenter to our post last week stated “if I was a terrorist I’d throw away my iPhoneX and get an iPhone 11”. Staying ahead of crackers and hackers seems to be a continual battle that device managers and website providers face daily. And, if we think this issue only applies to discovery of devices in cases involving mass shooters, it could easily apply to discovery in any type of case today where a custodian of a device has something to hide. Like this Fifth Amendment case that we covered last year and will discuss in our webcast on January 29.

So, what do you think?  Should companies like Apple and Facebook provide backdoor access to their encrypted technology to investigators?  Or are there bigger privacy concerns at play here?  Please share any comments you might have or if you’d like to know more about a particular topic.


Apple vs. FBI Case Study

Markkula Center for Applied Ethics

Business & government struggle over encryption’s proper place.


In the wake of the December 2015 terrorist attack in San Bernardino, attention turned to the perpetrator’s iPhone. A federal judge asked Apple, maker of the iPhone, to provide “reasonable technical assistance” to the FBI in accessing the information on the phone, with the hope of discovering additional threats to national security.

Apple provided the FBI with data it had in its possession and sent Apple engineers to advise the FBI, but refused to comply with the court order to bypass the phone’s security measures: specifically the 4-digit login code and a feature that erases all data after ten incorrect attempts. The FBI argued that the bypass could only be used for this phone, this one time. The agency also cited national security concerns, given that the phone might lead to a better understanding of the attack and help prevent further incidents.
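The weight of those two protections can be sketched with some back-of-the-envelope arithmetic. This is a simplified model, not Apple's actual implementation; the per-attempt delay is an illustrative assumption:

```python
# Simplified model of why a 4-digit passcode needs extra protection.
# The guess delay is an assumed figure, not Apple's real parameter.

PASSCODE_SPACE = 10 ** 4        # 0000-9999: only 10,000 candidates
GUESS_DELAY_S = 0.08            # assume ~80 ms per hardware-bound attempt

# Without any rate limiting, exhausting the whole space is trivial:
worst_case_hours = PASSCODE_SPACE * GUESS_DELAY_S / 3600
print(f"brute force without limits: ~{worst_case_hours:.2f} hours")

# With an erase-after-10-failures policy, an attacker gets at most
# 10 guesses before the data is gone:
max_attempts = 10
success_probability = max_attempts / PASSCODE_SPACE
print(f"chance of guessing before wipe: {success_probability:.1%}")
```

On these assumptions, the erase feature, rather than the passcode length, is what makes a short code defensible, which is why the bypass the FBI requested targeted exactly those protections.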

Apple CEO Tim Cook issued a public letter reiterating Apple’s refusal to cooperate. Cook advocated for the benefits of encryption in society to keep personal information safe. He stated that creating a backdoor entry into the iPhone would be akin to creating a master key capable of accessing the tens of millions of iPhones in the U.S. alone. Cook also had concerns that the FBI was overstepping its bounds by using the court system to expand its authority, and believed the case should instead be settled through public debate and legislative action in Congress.

Public opinion polls on the issue were split. A number of major tech firms filed amicus briefs in support of Apple. The White House and Bill Gates stood behind the FBI. In anticlimactic fashion, the FBI withdrew its request a day before the hearing, claiming it no longer needed Apple’s help to access the phone. It is speculated that an Israeli tech firm, Cellebrite, helped the FBI gain access.

  • Was Apple wrong for not complying with the FBI’s request? If so, why?  If not, why not?
  • What ethical issues are involved in this case? Please consult our Framework for Ethical Decision Making for an overview of modes of moral reasoning.
  • Who are the stakeholders in this situation?
  • Apple’s values are listed on the bottom of its home page at apple.com. Is the company’s decision consistent with its values? Is that important?

Apple vs. Feds: Is iPhone Privacy a Basic Human Right?

Apple CEO Tim Cook didn’t come to his post with an activist agenda, yet when law enforcement officials began pressuring the company to hand over iPhone users’ data without their permission, Cook took what he believed was a moral stance to protect consumers’ privacy.

He knew taking this position would embroil the company in an ugly fight—one that risked alienating some shareholders—but he felt strongly that Apple should champion its customers’ basic human right to privacy.

“We believe that a company that has values and acts on them can really change the world,” Cook said in 2015, a year after Apple debuted new privacy measures that blocked law enforcement from accessing its customers’ data. “There is an opportunity to do work that is infused with moral purpose.” He said shareholders who were only looking for a return on investment “should get out of the stock.”

"What is new is the expectation that a company will have a position on social and political issues."

A Harvard Business School case study and its revision, Apple: Privacy vs. Safety (A) and (B), illustrate the complex ramifications that companies should consider when putting their stake in the ground on challenging societal issues like privacy. The authors of the case offer a suggestion for CEOs: few corporations can expect to steer clear of the lightning-rod issues of the day, so perhaps it’s best to meet them head on as part of the job.

“What is new is the expectation that a company will have a position on social and political issues,” says Nien-hê Hsieh, the Kim B. Clark Professor of Business Administration at HBS, who coauthored the case. Staking out a clear social position can actually help a company’s bottom line, boosting employee morale, making workers more productive, and attracting customers who feel they can trust the company, say the authors.

Hsieh wrote the original case and its 2021 revision and expansion with Henry McGee, senior lecturer of business administration at HBS; Christian Godwin, a researcher in the HBS case writing group; and former HBS case researcher Sarah McAra.

Customer privacy comes under fire

An industrial engineer known for his practical work style and deep manufacturing expertise, Cook has used his position to take on several hot-button topics, including fighting discrimination against people who identify as gay, lesbian, bisexual, and transgender.

Cook began championing privacy as controversy was swirling around an extensive US government surveillance program that had been disclosed by former National Security Agency contractor Edward Snowden. Snowden had leaked information indicating that the government was collecting private consumer data stored by internet and telecom corporations. The shocking revelations caused many consumers, businesses, and governments to take their business away from US tech companies; A New America analysis estimated the loss to US cloud computing over the Snowden disclosures at $35 billion.

Meanwhile, an influx of cybercrime was propelling government officials to develop tougher security measures. And law enforcement officials were increasingly asking for access to iPhones, powerful handheld devices that stored information that may help them solve crimes.

With the iOS 8 operating system, unveiled in 2014, Cook and Apple responded to these brewing forces with an improved encryption system that put privacy in the hands of the consumer. The company no longer possessed a master key to devices that government officials could request. Instead, when a user created a passcode, that passcode was combined with a unique key that was encrypted—and it could not be accessed by Apple.
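The entanglement idea described above can be sketched in a few lines. This is a toy stand-in for Apple's actual Secure Enclave design; the PBKDF2 call, iteration count, and simulated device key are all illustrative assumptions:

```python
import hashlib
import os

# Toy sketch of passcode "entanglement": the data-encryption key is derived
# from BOTH the user's passcode and a secret unique to the device, so the
# key cannot be recomputed from the passcode alone, off the device.
# This mimics the concept only, not Apple's real key hierarchy.

device_unique_key = os.urandom(32)   # stands in for a secret fused into hardware

def derive_data_key(passcode: str, device_key: bytes) -> bytes:
    # PBKDF2 with the device secret as salt: deliberately slow, and only
    # computable by whoever holds device_key (i.e., the device itself).
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_key, 100_000)

key = derive_data_key("1234", device_unique_key)
assert key == derive_data_key("1234", device_unique_key)   # deterministic on-device
assert key != derive_data_key("1235", device_unique_key)   # wrong passcode fails
assert key != derive_data_key("1234", os.urandom(32))      # wrong device fails
```

The design choice is the point of the case: because the derivation requires the device-unique secret and Apple keeps no copy of either input, there is no master key for Apple to hand over.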

Cook hoped the move would remove Apple from the middle. “If law enforcement wants something, they should go to the user and get it. It’s not for me to do that,” Cook explained at the time.

Law enforcement officials pushed back, making the case that technology companies should provide a backdoor into suspects’ phones. “Sophisticated criminals will come to count on these means of evading detection. It’s the equivalent of a closet that can’t be opened,” said James Comey, director of the Federal Bureau of Investigation, in 2014.

The issue came to a head on December 2, 2015, when a husband and wife claiming to be affiliated with the Islamic State opened fire at a party in San Bernardino, California, killing 14 people. Police found one of the shooters’ phones, an iPhone 5C running on the iOS 9 operating system. When a judge ordered Apple to help build decryption software to unlock the phone, the company refused. Cook vowed to take the issue all the way to the Supreme Court. While that standoff never happened, because the government dropped the case, it showed how seriously Cook took the situation.

International pressures test Cook’s resolve

Although Cook stood by his convictions in the US, other countries posed different challenges.

China, for instance, is an important market for Apple. Officials there wanted assurances from the company that it wasn’t sharing its users’ data with the US government. Cook ultimately allowed some security audits, inflaming critics who claimed the deal violated Apple’s commitment to privacy.

And, in 2017, the Chinese government cracked down on unlicensed virtual private network apps that allowed consumers to get around its censors. The Chinese government requires these apps to operate with a license that is expensive and difficult to obtain, the case explains, and the government asked Apple to take them off its Chinese App Store. Apple complied, eliciting blowback from free-speech advocates and showing how difficult it was for Cook to hold firm on privacy in a country where rights are valued differently.

"Even if you are not taking a stand on human rights, as Cook has done, you are going to wade into these debates."

“We follow the law wherever we do business … we strongly believe participating in markets and bringing benefits to customers is in the best interest of the folks there,” said Cook at the time, in defense of Apple’s decisions.

A follow-on case written last year as the COVID-19 pandemic raged explores how Cook was presented with another privacy quandary: Public health officials wanted to use Apple’s products to conduct coronavirus contact tracing. Some governments wanted more control than Apple allowed, including location-tracking data that was protected as part of Apple’s encryption. Apple again refused.

In France, where the government had petitioned Apple to exempt contact tracing from the phones’ typical privacy protections, Digital Minister Cedric O said, “A company that has never been in better economic shape is not helping the government fight the crisis. We will remember that.”

Lessons for leaders

The case underscores an important point: Good leadership should be about more than the bottom line, the authors say.

“Even if you are not taking a stand on human rights, as Cook has done, you are going to wade into these debates, and not just if you are at a technology company,” says Hsieh.

After all, it’s difficult for leaders of most companies to completely avoid all political and social issues, particularly since consumers increasingly expect companies to take a position. “It’s very hard not to get involved and take a stand. There is a growing expectation that companies will do something,” Hsieh says.

The researchers suggest companies develop “ethical capacity” by hiring ethics experts to help shape the organization’s decision-making processes. Many companies already elevate an ethics focus to the board level, creating committees on corporate responsibility and accountability. These structures can help create trust in a company’s products among the public, they say.

In addition, leaders need to continually articulate their values to stakeholders—and be willing to change their perspectives when challenged. “You have to constantly re-evaluate your position in life,” McGee says.

Feedback or ideas to share? Email the Working Knowledge team at [email protected] .



caseism

Building a “Backdoor” to the iPhone: An Ethical Dilemma Case Solution & Answer


Building a “Backdoor” to the iPhone: An Ethical Dilemma Case Solution

In February 2016, Tim Cook, Apple’s CEO, challenged a U.S. Federal Court order for Apple to assist the Federal Bureau of Investigation (FBI) in a case involving suspected international terrorism. The government wanted Apple to provide the FBI with access to encrypted data on an Apple product, the iPhone.

Cook’s refusal to give in to the government’s demands drew strong public debate, pitting the supporters of national security against those in favour of consumers’ digital privacy and security. The case invoked an ethical dilemma faced by management in issues involving right-versus-right choices. How should Cook resolve the dilemma?

This case is about SECURITY & PRIVACY

PUBLICATION DATE: April 28, 2016



Building a Backdoor to the iPhone: An Ethical Dilemma Harvard Case Solution & Analysis


Q1: Which course of action will do the most good and the least harm?

ANS: The iPhone is Apple’s leading product. Apple and other rights groups promote the protection of customers’ digital privacy, while on the other side, the Federal Bureau of Investigation (FBI) expects Apple and other technology companies to help protect national security.

The Federal Bureau of Investigation and the US government demanded that Apple develop backdoor software for the iPhone that would help them bypass a terrorist’s security protocols and give them access to the data on the device.

But on the other side, the CEO of Apple, Tim Cook, refused the Federal Bureau of Investigation’s demand, because Apple has made a commitment to its customers that their information is private, and the company bears responsibility for the security of that information. (Jayakumar, 2017)

If Tim Cook had accepted the request to provide the Federal Bureau of Investigation with a backdoor to the iPhone, it could have helped society in both the short and the long run, since a terrorist’s data could be captured and further incidents prevented. However, it could also harm the company in some situations, damaging its reputation and, in turn, causing its stock price to fall.


Some advantages of accepting the request are as follows:

  • A backdoor would play an important role in accessing a terrorist’s data, helping to save people’s lives so that they can live without fear of terrorist attacks.
  • A backdoor would help authorities such as the Federal Bureau of Investigation obtain information related to terrorism and prevent terrorist attacks; the information obtained from a terrorist’s phone would help reduce the losses from attacks.
  • A backdoor would help give direction to an ongoing investigation and assist the authorities in apprehending terrorists and anticipating their direction of attack.
  • It would also help investigators access a terrorist’s phone and obtain information about planned future attacks, making it possible to prevent them.

There are also some disadvantages to accepting the request of the Federal Bureau of Investigation and the US government. These are given below:

  • A backdoor to the iPhone would first damage customer satisfaction, because consumers’ privacy would no longer be secure.
  • It would contribute to a fall in the organization’s stock price.
  • Once a backdoor exists, users’ information security is at risk, because the operating system (iOS) software would give the FBI and other intelligence agencies complete access for investigation, even to data regarded as private.
  • A backdoor is a clear violation of privacy, which reduces customers’ trust in the company and in turn damages the company’s reputation.
  • A backdoor to the iPhone leaves users more vulnerable to hackers. (The pros and cons of the backdoor of iPhone, 2018)...

This is just a sample of partial work. Please place an order on the website to get your own originally done case solution.


The hack that almost broke the internet


Last month, the world narrowly avoided a cyberattack of stunning ambition. The targets were some of the most important computers on the planet. Computers that power the internet. Computers used by banks and airlines and even the military.

What these computers had in common was that they all relied on open source software.

A strange fact about modern life is that most of the computers responsible for it are running open source software. That is, software mostly written by unpaid, sometimes even anonymous volunteers. Some crucial open source programs are managed by just a single overworked programmer. And as the world learned last month, these programs can become attractive targets for hackers.

In this case, the hackers had infiltrated a popular open source program called XZ. Slowly, over the course of two years, they transformed XZ into a secret backdoor. And if they hadn't been caught, they could have taken control of large swaths of the internet.

On today's show, we get the story behind the XZ hack and what made it possible. How the hackers took advantage of the strange way we make modern software. And what that tells us about the economics of one of the most important industries in the world.

This episode was hosted by Jeff Guo and Nick Fountain. It was produced by Emma Peaslee and edited by Jess Jiang. It was engineered by Cena Loffredo and fact checked by Sierra Juarez. Alex Goldmark is Planet Money's executive producer.


Music: NPR Source Audio - "Strange Tango," "Warped Worlds," and "Detective Dan"


COMMENTS

  1. Building a "Backdoor" to the iPhone: An Ethical Dilemma

    The government wanted Apple to provide the FBI with access to encrypted data on an Apple product, the iPhone. Cook's refusal to acquiesce to the government's demands drew strong public debate, pitting the proponents of national security against those in favour of customers' digital privacy and security. The case invoked an ethical dilemma faced ...

  2. The FBI & Apple Security vs. Privacy

    Case Study UT Star Icon. ... Second, such a backdoor, once created, could fall into the wrong hands and threaten the privacy of all iPhone owners. Finally, it would set a dangerous precedent; law enforcement could repeatedly require businesses such as Apple to assist in criminal investigations, effectively making technology companies an agent ...

  3. Apple-FBI encryption dispute

    James Comey, former FBI director Tim Cook, chief executive officer of Apple Inc. Cook and former FBI Director Comey have both spoken publicly about the case.. In 1993, the National Security Agency (NSA) introduced the Clipper chip, an encryption device with an acknowledged backdoor for government access, that NSA proposed be used for phone encryption. The proposal touched off a public debate ...

  4. (PDF) Building A Backdoor to the iPhone: What Dilemmas ...

    felt Apple should comply with the FBI request for a backdoor while 38% felt Apple should not . ... In this case study, students use four questions developed by Badaracco (1992) as a framework for ...

  5. Balancing Security and Privacy in the Age of Encryption: Apple v. FBI

    According to Apple, the answer is a big, fat, thespian no. Apple not only refused to comply but also published an open letter to the public, advising people of the 'chilling' implications of a security backdoor, writing that, "this demand would undermine the very freedoms and liberty our government is meant to protect." Apple warned ...

  6. The FBI Wanted a Backdoor to the iPhone. Tim Cook Said No

    Apr 16, 2019 12:43 PM. The FBI Wanted a Back Door to the iPhone. Tim Cook Said No. The agency wanted to crack the iPhone of Syed Farook, a suspect in the 2015 San Bernardino shooting. The Apple ...

  7. Apple, The FBI And iPhone Encryption: A Look At What's At Stake

    Apple has described the FBI's request as amounting to asking for "a backdoor to the iPhone" — a flaw in a security system purposefully designed to help law enforcement break in for investigations.

  8. Apple vs. FBI: Here's everything you need to know (FAQ)

    Apple argues that introducing a backdoor into the iPhone wouldn't just make Farook's phone insecure, it would make every iPhone weaker. As pointed out by The Guardian , the argument that Apple is ...

  9. Apple's Plan to "Think Different" About Encryption Opens a Backdoor to

    Apple has announced impending changes to its operating systems that include new "protections for children" features in iCloud and iMessage. If you've spent any time following the Crypto Wars, you know what this means: Apple is planning to build a backdoor into its data storage system and its messaging system.

  10. Building a "Backdoor" to the iPhone: An Ethical Dilemma

    The government wanted Apple to provide the FBI with access to encrypted data on an Apple product, the iPhone. Cook's refusal to acquiesce to the government's demands drew strong public debate, pitting the proponents of national security against those in favour of customers' digital privacy and security. The case invoked an ethical dilemma faced ...

  11. Who's Right In Apple's Fight with the FBI?

    The Apple case is the third time I've seen this movie. In the early 1990s, there was a fight to make phone companies build in surveillance into their switches. The result was the Communications ...

  12. Building a 'Backdoor' to the iPhone: An Ethical Dilemma

    In February 2016, Tim Cook, Apple's chief executive officer, challenged a US Federal Court order for Apple to assist the Federal Bureau of Investigation (FBI) in a case involving suspected international terrorism. The government wanted Apple to provide the FBI with access to encrypted data on an Apple product, the iPhone.

  13. Here's Why Whether Apple Provides a Backdoor to iPhones May Not Matter

    For their part, Apple disputed Barr's assessment that it has failed to provide law enforcement with "substantive assistance" in unlocking the password-protected iPhones used by the shooting suspect at a Navy base in Pensacola, Florida, last month, but still refused his main request to provide a backdoor. Apple stated it "produced a wide ...

  14. Apple vs. FBI Case Study

    Apple CEO Tim Cook issued a public letter reiterating Apple's refusal to cooperate. Cook advocated for the benefits of encryption in society to keep personal information safe. He stated that creating the backdoor entry into the iPhone would be akin to creating a master key capable of accessing the tens of millions of iPhones in the U.S. alone.

  15. How did Americans Really Think About the Apple/FBI Dispute? A Mixed

    ABSTRACT. Second-level agenda-setting suggests that news media influence how we think. As a case study examining the nature and effects of mainstream news media's coverage of the 2015 Apple/FBI dispute about data privacy versus national security, this study found via content analysis that a majority of articles covering the dispute (73.7%) made the same potentially misleading claim about how ...

  16. Apple vs. Feds: Is iPhone Privacy a Basic Human Right?

    A case study by Nien-hê Hsieh and Henry McGee examines how Apple CEO Tim Cook turned calls for data access into a rallying cry for privacy, and the complexities that followed. ... making the case that technology companies should provide a backdoor into suspects' phones. ... in defense of Apple's decisions. A follow-on case written last ...

  17. Building a "Backdoor" to the iPhone: An Ethical Dilemma

    It sought Apple's assistance in unlocking the encrypted iPhone used by Farook. The FBI wanted Apple to build what became known as "a backdoor" to the iPhone. Talks between lawyers of the Obama administration and Apple went on for two months,34 but in the end, Apple refused to acquiesce to the FBI's demand.

  18. Case

    Case - Building a Backdoor to the Apple iPhone - An Ethical Dilemma. In February 2016, Tim Cook, CEO of Apple, refused the FBI's demand to build a "backdoor" into the iPhone operating system (iOS) that would help access data on the San Bernardino shooter's iPhone.

  19. Building a "Backdoor" to the iPhone: An Ethical Dilemma Case Study

    In February 2016, Tim Cook, Apple's CEO, challenged a U.S. Federal Court order for Apple to help the Federal Bureau of Investigation (FBI) in a case involving suspected international terrorism. The federal government wanted Apple to provide the FBI with access to encrypted data on an Apple product, the iPhone.

  20. Building a 'Backdoor' to the iPhone: An Ethical Dilemma

    Product details. Building a 'Backdoor' to the iPhone: An Ethical Dilemma. Teaching note. Reference no. 8B16M077. Subject category: Strategy and General Management. Authors: Tulsi Jayakumar (S P Jain Institute of Management & Research); Surya Tahora (S P Jain Institute of Management & Research). Published by: Ivey Publishing.

  21. Apple Backdoor

    Executive Summary: The purpose of this case study is to analyze the ethical problems and respond to the issue faced by Apple. It highlights the tension between privacy and security, and asks how far digital security measures should go when terrorism concerns affect the whole nation.

  22. Case Study No 1 Apple and the Backdoor to the iPhone.pdf

    The U.S. government asked Apple to create something (i.e., a backdoor to the iPhone) the company simply did not have and something it considered too dangerous to create. The software (a version of iOS) that the FBI wanted Apple to create for one iPhone (the San Bernardino terrorist's phone) would have the potential to unlock any iPhone in anyone's physical possession.

  23. Building A Backdoor To The Iphone: An Ethical Dilemma Harvard Case

    Building a "Backdoor" to the iPhone: An Ethical Dilemma case solution and analysis. Q1: Which course of action will do the most good and the least harm? Ans: The iPhone is Apple's leading product. Apple and other rights groups promote the

  24. The hack that almost broke the internet

    In this case, the hackers had infiltrated a popular open source program called XZ. Slowly, over the course of two years, they transformed XZ into a secret backdoor. And if they hadn't been caught ...