Government Surveillance and Internet Search Behavior

53 Pages. Posted: 23 Mar 2014. Last revised: 15 Mar 2017.

Alex Marthews

Digital Fourth / Restore The Fourth

Catherine E. Tucker

Massachusetts Institute of Technology (MIT) - Management Science (MS)

Date Written: February 17, 2017

This paper uses data from the US and its top 40 trading partners on the search volume of select keywords from before and after the surveillance revelations of June 2013 to analyze whether Google users' search behavior changed as a result. The surveillance revelations are treated as an exogenous shock in information about how closely users' internet searches were being monitored by the US government. Each search term was independently rated for its degree of privacy sensitivity along multiple dimensions. Using panel data, our results suggest that search terms that were deemed both personally sensitive and government-sensitive were most negatively affected by the PRISM revelations, highlighting the interplay between privacy concerns relating to both the government and the private individual. Perhaps surprisingly, the largest “chilling effects” were not found in countries conventionally treated as intelligence targets by the US, but instead in countries that were more likely to be considered allies of the US. We show that this was driven in part by a fall in searches on health-related terms. Suppressing health information searches potentially harms the health of search engine users and, by reducing traffic on easy-to-monetize queries, also harms search engines' bottom line. In general, our results suggest that there is a chilling effect on search behavior from government surveillance on the Internet, and that government surveillance programs may damage the profitability of US-based internet firms relative to non-US-based internet firms.
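The panel approach the abstract describes can be illustrated with a toy difference-in-differences calculation: compare the pre/post change in search volume for privacy-sensitive terms against the same change for neutral terms, so that common time trends cancel out. All numbers below are invented for illustration and are not the paper's data.

```python
# Toy difference-in-differences sketch in the spirit of the paper's panel
# approach. The search-volume figures are synthetic, NOT the study's data.

def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical normalized weekly search volumes (Google Trends-style index).
sensitive_pre  = [100, 98, 102, 101]  # privacy-sensitive terms, before June 2013
sensitive_post = [92, 90, 93, 91]     # privacy-sensitive terms, after June 2013
neutral_pre    = [100, 99, 101, 100]  # neutral terms, before June 2013
neutral_post   = [99, 98, 100, 99]    # neutral terms, after June 2013

# The control group's change absorbs common time trends; what remains in the
# treated group is the estimated chilling effect.
time_trend     = mean(neutral_post) - mean(neutral_pre)      # -1.0
treated_change = mean(sensitive_post) - mean(sensitive_pre)  # -8.75
did_estimate   = treated_change - time_trend

print(did_estimate)  # → -7.75
```

In this sketch, a raw drop of 8.75 index points in sensitive-term searches shrinks to an estimated effect of 7.75 points once the 1-point general decline is netted out.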

Keywords: surveillance, Snowden, PRISM, chilling effects, international trade

JEL Classification: D12, D78, E65, F14, H56, M38



The Dangers of Surveillance

  • Neil M. Richards

From the Fourth Amendment to George Orwell’s Nineteen Eighty-Four, and from the Electronic Communications Privacy Act to films like Minority Report and The Lives of Others, our law and culture are full of warnings about state scrutiny of our lives. These warnings are commonplace, but they are rarely very specific. Other than the vague threat of an Orwellian dystopia, as a society we don’t really know why surveillance is bad and why we should be wary of it. To the extent that the answer has something to do with “privacy,” we lack an understanding of what “privacy” means in this context and why it matters. We’ve been able to live with this state of affairs largely because the threat of constant surveillance has been relegated to the realms of science fiction and failed totalitarian states.

But these warnings are no longer science fiction. The digital technologies that have revolutionized our daily lives have also created minutely detailed records of those lives. In an age of terror, our government has shown a keen willingness to acquire this data and use it for unknown purposes. We know that governments have been buying and borrowing private-sector databases, and we recently learned that the National Security Agency (NSA) has been building a massive data and supercomputing center in Utah, apparently with the goal of intercepting and storing much of the world’s Internet communications for decryption and analysis.

Although we have laws that protect us against government surveillance, secret government programs cannot be challenged until they are discovered. And even when they are, our law of surveillance provides only minimal protections. Courts frequently dismiss challenges to such programs for lack of standing, under the theory that mere surveillance creates no harms. The Supreme Court recently reversed the only major case to hold to the contrary, in Clapper v. Amnesty International USA, finding that the respondents’ claim that their communications were likely being monitored was “too speculative.”

But the important point is that our society lacks an understanding of why (and when) government surveillance is harmful. Existing attempts to identify the dangers of surveillance are often unconvincing, and they generally fail to speak in terms that are likely to influence the law. In this Article, I try to explain the harms of government surveillance. Drawing on law, history, literature, and the work of scholars in the emerging interdisciplinary field of “surveillance studies,” I offer an account of what those harms are and why they matter. I will move beyond the vagueness of current theories of surveillance to articulate a more coherent understanding and a more workable approach.

At the level of theory, I will explain why and when surveillance is particularly dangerous and when it is not. First, surveillance is harmful because it can chill the exercise of our civil liberties. With respect to civil liberties, consider surveillance of people when they are thinking, reading, and communicating with others in order to make up their minds about political and social issues. Such intellectual surveillance is especially dangerous because it can cause people not to experiment with new, controversial, or deviant ideas. To protect our intellectual freedom to think without state oversight or interference, we need what I have elsewhere called “intellectual privacy.” A second special harm that surveillance poses is its effect on the power dynamic between the watcher and the watched. This disparity creates the risk of a variety of harms, such as discrimination, coercion, and the threat of selective enforcement, where critics of the government can be prosecuted or blackmailed for wrongdoing unrelated to the purpose of the surveillance.

At a practical level, I propose a set of four principles that should guide the future development of surveillance law, allowing for a more appropriate balance between the costs and benefits of government surveillance. First, we must recognize that surveillance transcends the public/private divide . Public and private surveillance are simply related parts of the same problem, rather than wholly discrete. Even if we are ultimately more concerned with government surveillance, any solution must grapple with the complex relationships between government and corporate watchers. Second, we must recognize that secret surveillance is illegitimate and prohibit the creation of any domestic-surveillance programs whose existence is secret. Third, we should recognize that total surveillance is illegitimate and reject the idea that it is acceptable for the government to record all Internet activity without authorization. Government surveillance of the Internet is a power with the potential for massive abuse. Like its precursor of telephone wiretapping, it must be subjected to meaningful judicial process before it is authorized. We should carefully scrutinize any surveillance that threatens our intellectual privacy. Fourth, we must recognize that surveillance is harmful . Surveillance menaces intellectual privacy and increases the risk of blackmail, coercion, and discrimination; accordingly, we must recognize surveillance as a harm in constitutional standing doctrine. Explaining the harms of surveillance in a doctrinally sensitive way is essential if we want to avoid sacrificing our vital civil liberties.

I develop this argument in four steps. In Part I, I show the scope of the problem of modern “surveillance societies,” in which individuals are increasingly monitored by an overlapping and entangled assemblage of government and corporate watchers. I then develop an account of why this kind of watching is problematic. Part II shows how surveillance menaces our intellectual privacy and threatens the development of individual beliefs in ways that are inconsistent with the basic commitments of democratic societies. Part III explores how surveillance distorts the power relationships between the watcher and the watched, enhancing the watcher’s ability to blackmail, coerce, and discriminate against the people under its scrutiny. Part IV explores the four principles that I argue should guide the development of surveillance law, to protect us from the substantial harms of surveillance.

May 20, 2013


Police surveillance and facial recognition: Why data privacy is imperative for communities of color

By Nicol Turner Lee, Senior Fellow, Governance Studies; Director, Center for Technology Innovation; and Caitlin Chin-Rothmann, Fellow, Center for Strategic and International Studies; former Research Analyst, The Brookings Institution

Tuesday April 12, 2022


This paper was originally presented at the American Bar Association’s Antitrust Spring Meeting on April 8, 2022, in Washington, D.C.

Introduction

Governments and private companies have a long history of collecting data from civilians, often justifying the resulting loss of privacy in the name of national security, economic stability, or other societal benefits. But these trade-offs do not affect all individuals equally. In fact, surveillance and data collection have disproportionately affected communities of color under past and present political regimes alike.

From the historical surveillance of civil rights leaders by the Federal Bureau of Investigation (FBI) to the current misuse of facial recognition technologies, surveillance patterns often reflect existing societal biases and reinforce harmful, self-perpetuating cycles. Facial recognition and other surveillance technologies also enable more precise discrimination, especially as law enforcement agencies continue to make misinformed predictive decisions around arrest and detainment that disproportionately impact marginalized populations.

In this paper, we present the case for stronger federal privacy protections with proscriptive guardrails for the public and private sectors to mitigate the high risks that are associated with the development and procurement of surveillance technologies. We also discuss the role of federal agencies in addressing the purposes and uses of facial recognition and other monitoring tools under their jurisdiction, as well as increased training for state and local law enforcement agencies to prevent the unfair or inaccurate profiling of people of color. We conclude the paper with a series of proposals that lean either toward clear restrictions on the use of surveillance technologies in certain contexts, or greater accountability and oversight mechanisms, including audits, policy interventions, and more inclusive technical designs.

The history of race and surveillance in the United States

The oversurveillance of communities of color dates back decades to the civil rights movement and beyond. During the 1950s and 1960s, the FBI tracked Martin Luther King, Jr., Malcolm X, and other civil rights activists through its Racial Matters and COINTELPRO programs, without clear guardrails to prevent the agency from collecting intimate details about home life and relationships that were unrelated to law enforcement. 1 More recently, the Black Lives Matter (BLM) movement, initially sparked in 2013 after the murder of 17-year-old Trayvon Martin by a local vigilante, has highlighted racial biases in policing that disproportionately lead to unwarranted deaths, improper arrests, and the excessive use of force against Black individuals. 2 Over the years, the government’s response to public protests over egregious policing patterns has raised various concerns over the appropriate use of surveillance, especially when primarily focused on communities of color. In 2015, the Baltimore Police Department reportedly used aerial surveillance, location tracking, and facial recognition to identify individuals who publicly protested the death of Freddie Gray. 3 Similarly, after George Floyd was murdered in 2020, the U.S. Department of Homeland Security (DHS) deployed drones and helicopters to survey the subsequent protests in at least 15 cities. 4

But African Americans are not the only population that has been subjected to overt tracking and profiling. The consequences of mass government surveillance were evident in programs like the China Initiative, which the Department of Justice (DOJ) launched in 2018 to prevent espionage and intellectual property theft and formally ceased in February 2022. 5 Although the China Initiative aimed to address national security threats from the Chinese government, it manufactured wider distrust and racial profiling of Chinese American academics, including those who were U.S. citizens or who lacked ties with the Chinese Communist Party. It led to several false arrests, including those of Temple University professor Xi Xiaoxing, UCLA graduate student Guan Lei, University of Tennessee professor Anming Hu, and National Weather Service scientist Sherry Chen. 6 As with other historically disadvantaged populations, government surveillance of Asian Americans is not a new phenomenon. For example, the U.S. government monitored the broader Japanese American community for years even prior to World War II, including by accessing private communications and bank accounts, and eventually used census data after 1941 to locate and detain 120,000 people in internment camps. 7

Demonstrating similar profiling of an entire community, the New York Police Department (NYPD) and Central Intelligence Agency (CIA) surveilled Muslim neighborhoods, restaurants, mosques, stores, and student groups for over six years after September 11, 2001, listening in on conversations, recording license plates, and taking videos. 8 Over a decade after 9/11, a 2017 Pew Research Center survey found that 18% of Muslim American respondents still experienced being “singled out by airport security.” 9 From 2015 to 2020, Freedom of Information Act (FOIA) records exposed over 75 complaints sparked by intrusive airport searches or Islamophobic comments from Transportation Security Administration (TSA) officers toward people who were perceived to be of Middle Eastern descent. 10 Both the NYPD’s “Demographic Unit” surveillance and the TSA’s profiling of Muslim travelers are widely considered to have been inaccurate and ineffective in preventing violent crime. 11 Moreover, Customs and Border Protection (CBP) has deployed planes, boats, and radios to track and identify people along the U.S.-Mexico border—continuing a long tradition of hostility toward immigrants, especially those from Latino communities. Immigrant-focused surveillance extends far beyond a physical border; during the Obama and Trump administrations, Immigration and Customs Enforcement (ICE) purchased surveillance technology from private companies like Palantir and Thomson Reuters and used vehicle, insurance, tax, social media, and phone records to track undocumented immigrants throughout the country. 12 As early as 1992, the Drug Enforcement Administration collected phone call records to over 100 countries in bulk, a program that, over the years, may have gathered a significant amount of information from immigrants who called home to Mexico and countries in Central or South America. 13 In these and other cases, government entities directed surveillance with the stated goals of maintaining public order, preventing cyber theft, and protecting Americans more broadly—but the indiscriminate deployment of these tools, and the public vigilantism they invite, have contributed to and been fueled by deep-rooted discrimination that affects communities of color in the United States. Stopping this ongoing injustice requires greater attention to the issue and concrete steps to protect personal privacy.

How law enforcement officers use facial recognition and other surveillance technologies

Although suspicion toward communities of color has historical roots that span decades, new developments like facial recognition technologies (FRT) and machine learning algorithms have drastically enlarged the precision and scope of potential surveillance. 14 Federal, state, and local law enforcement agencies often rely upon tools developed within the private sector, and, in certain cases, can access massive amounts of data either stored on private cloud servers or hardware (e.g., smartphones or hard drives) or available in public places like social media or online forums. 15 In particular, several government agencies have purchased access to precise geolocation history from data aggregators that compile information from smartphone apps or wearable devices. In the general absence of stronger privacy protections at the federal or state levels to account for such advancements in technology, enhanced forms of surveillance used by police officers pose significant risks to civilians already targeted in the criminal justice system and further the historical biases affecting communities of color. Next, we present tangible examples of how the private and public sectors both play a critical role in amplifying the reach of law enforcement through facial recognition and other surveillance technologies.

(A) Facial recognition

Facial recognition has become a commonplace tool for law enforcement officers at both the federal and municipal levels. In 2021, the Government Accountability Office (GAO) found that about 20, roughly half, of the approximately 42 federal agencies that employ law enforcement officers used facial recognition. In 2016, Georgetown Law researchers estimated that approximately one out of four state and local law enforcement agencies had access to the technology. 16

On the procurement side, Clearview AI is one of the more prominent commercial providers of FRT to law enforcement agencies. Since 2017, it has scraped billions of publicly available images from websites like YouTube and Facebook, and enables customers to upload photos of individuals and automatically match them with other images and sources in the database. 17 As of 2021, the private startup had partnered with over 3,100 federal and local law enforcement agencies to identify people outside the scope of government databases. To put this tracking in perspective, the FBI only has about 640 million photos in its databases, compared to Clearview AI’s approximately 10 billion. 18

But Clearview AI is only one of numerous private companies that U.S. government agencies partner with to collect and process personal information. 19 Another example is Vigilant Solutions, which captures image and location information of license plates from billions of cars parked outside homes, stores, and office buildings, and which had sold access to its databases to approximately 3,000 local law enforcement agencies as of 2016. 20 Vigilant also markets various facial recognition products like FaceSearch to federal, state, and local law enforcement agencies; its customer base includes the DOJ and DHS, among others. 21 A third company, ODIN Intelligence, partners with police departments and local government agencies to maintain a database of individuals experiencing homelessness, using facial recognition to identify them and search for sensitive personal information such as age, arrest history, temporary housing history, and known associates. 22

In response to privacy and ethical concerns, and after the protests over George Floyd’s murder in 2020, some technology companies, including Amazon, Microsoft, and IBM, pledged to either temporarily or permanently stop selling facial recognition technologies to law enforcement agencies. 23 But voluntary and highly selective corporate moratoriums are insufficient to protect privacy, since they do not stop government agencies from procuring facial recognition software from other private companies. Moreover, a number of prominent companies have notably not taken this pledge or continue to enable or allow scraping of their photos for third-party use in facial recognition databases. Furthermore, government agencies can still access industry-held data with varying degrees of due process—for example, although they would require a warrant with probable cause to compel precise geolocation data from first-party service providers in many cases, they might be able to access a person’s movement history without probable cause through other means, including by purchasing it from a data broker. 24

(B) Data aggregators and private sector information

The enormous scale of information that the private sector collects can feed into broader law enforcement efforts, since federal, state, and local government agencies have multiple channels by which to access corporate data. From January to June 2020 alone, federal, state, and local law enforcement agencies issued over 112,000 legal requests for data to Apple, Google, Facebook, and Microsoft—three times the number they submitted five years prior—of which approximately 85% were accommodated, including some subpoenas or court orders that did not require probable cause. 25 In 2020, reports surfaced that federal law enforcement agencies like the FBI, ICE, CBP, the Drug Enforcement Administration, and the U.S. Special Operations Command purchased smartphone app geolocation data—without a warrant or binding court order—from analytics companies like Venntel, X-Mode, and Babel Street. 26 ICE and CBP used this data to enable potential deportations or arrests, which demonstrates how geolocation can have singular consequences for immigrant communities, especially among populations of color. 27

Although geolocation tracking is almost ubiquitous among smartphone apps, it also poses unique potential for harm—both since it enables the physical pursuit of an individual and because it allows entities to deduce extraneous details like sexual orientation, religion, health, or personal relationships from their whereabouts.

Law enforcement has also worked with commercial data aggregators to scan social media websites for photos and posts. In 2018, ICE used photos and status updates posted on Facebook to locate and arrest an immigrant using the pseudonym “Sid” in California—only one of thousands of individuals whom the agency reportedly tracks at any given point, aided by private data miners such as Giant Oak and Palantir. 28 On a local level, the Los Angeles Police Department reportedly pilot tested ABTShield, an algorithm developed by a Polish company, to scan millions of tweets from October to November 2020 for terms that included “protest,” “solidarity,” and “lives matter,” despite concerns that such bulk surveillance could pose privacy harms to BLM activists without presenting a clear benefit to public safety. 29

(C) Public-oriented and civilian surveillance

Technological advances have expanded government surveillance in traditionally “public” places, prompting legal questions over the boundaries between permissible and non-permissible data collection. For instance, the Electronic Frontier Foundation and the University of Nevada estimate that over 1,000 local police departments fly drones over their communities. 30 The Chula Vista Police Department had dispatched drones for over 5,000 civilian calls as of March 2021, capturing images of individuals within public areas like sidewalks and parking lots. 31 Body-worn cameras, another common police resource adopted in part as a response to BLM activism, can function as an accountability safeguard but also pose privacy concerns—particularly when videos of civilians in sensitive scenarios are retained for lengthy periods, used for facial recognition purposes, or even publicly posted online, or when bystanders in public areas are incidentally caught on camera. 32 Lastly, the everyday use of store-bought devices or apps by residents complicates the curtailment of excessive surveillance. Private sector apps such as Neighbors (an Amazon subsidiary, integrated with Amazon’s Ring video doorbell), NextDoor, and Citizen allow people to livestream, watch, and exchange opinions about potential crimes with other users in real time, generating concerns over unconscious bias and privacy. 33 Surveillance cameras are becoming increasingly prevalent within private homes, restaurants, entertainment venues, and stores; hundreds of millions of smart security devices are estimated to be in operation worldwide, some of which—such as Google Nest’s Doorbell and the Arlo Essential Wired Video Doorbell—include built-in facial recognition capabilities. 34 Simultaneously, Amazon’s Ring has partnered with almost 2,000 local law enforcement agencies to facilitate a process for officers to ask Ring users to voluntarily turn over their video recordings without the explicit use of a warrant. 35

Facial recognition is perhaps the most daunting of them all

Mass surveillance affects all Americans through a wide suite of technologies—but facial recognition, which has become one of the most critical and commonly-used technologies, poses special risks of disparate impact for historically marginalized communities. In December 2020, the New York Times reported that Nijeer Parks, Robert Williams, and Michael Oliver—all Black men—were wrongfully arrested due to erroneous matches by facial recognition programs. 36 Recent studies demonstrate that these technical inaccuracies are systemic: in February 2018, MIT and then-Microsoft researchers Joy Buolamwini and Timnit Gebru published an analysis of three commercial algorithms developed by Microsoft, Face++, and IBM, finding that images of women with darker skin had misclassification rates of 20.8%-34.7%, compared to error rates of 0.0%-0.8% for men with lighter skin. 37 Buolamwini and Gebru also discovered bias in benchmark datasets: 53.6%, 79.6%, and 86.2% of the images in the PPB, IJB-A, and Adience datasets, respectively, contained lighter-skinned individuals. In December 2019, the National Institute of Standards and Technology (NIST) published a study of 189 commercial facial recognition programs, finding that algorithms developed in the United States were significantly more likely to return false positives or negatives for Black, Asian, and Native American individuals compared to white individuals. 38 When disparate accuracy rates in facial recognition technology intersect with the effects of bias in certain policing practices, Black and other people of color are at greater risk of misidentification for a crime with which they have no affiliation.
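A per-subgroup misclassification rate of the kind cited above is simply the share of a subgroup's evaluation images that the classifier got wrong. The sketch below uses synthetic counts chosen only to produce rates of a similar magnitude; they are not the study's actual data.

```python
# How per-subgroup misclassification rates are computed.
# Counts are synthetic and illustrative, NOT the study's data.

def misclassification_pct(errors, total):
    """Percentage of a subgroup's evaluation images classified incorrectly."""
    return round(100 * errors / total, 1)

# Hypothetical (misclassified, total images) counts for two subgroups.
evaluation = {
    "darker-skinned women": (94, 271),
    "lighter-skinned men":  (2, 618),
}

rates = {group: misclassification_pct(e, n) for group, (e, n) in evaluation.items()}
print(rates)  # → {'darker-skinned women': 34.7, 'lighter-skinned men': 0.3}
```

Even with identical software, wildly different per-group rates like these are what an aggregate accuracy figure can hide, which is why disaggregated audits matter.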

Some companies have publicly announced unilateral actions to improve the accuracy of their facial recognition algorithms and the diversity of their training datasets—but the scope and effectiveness of such efforts fluctuate across the enormous quantity and breadth of facial recognition vendors. 39 The question of accuracy is magnified by the general lack of transparency across the industry: companies are not legally required to allow third-party audits of their algorithms, and many either do not publish their processes and results or do so only selectively. For example, Amazon chose not to submit its Rekognition algorithm for testing in NIST’s 2018 report—even though, at the time, it was still licensing the algorithm for use by law enforcement agencies and in other highly-sensitive contexts. 40 Clearview AI has not publicly disclosed its rates of false positives or negatives, and similarly has not voluntarily submitted its algorithm for testing by NIST or another third party. 41


Adding to the problem of errors in private sector facial recognition software, law enforcement databases are often built on faulty data collection practices. Because historically biased policing patterns have produced higher rates of interrogation and arrest, communities of color are often overrepresented in law enforcement databases relative to the overall U.S. population. 42 The National Association for the Advancement of Colored People (NAACP) reports that Black individuals are five times more likely than white individuals to be stopped by police officers in the United States, and that Black and Latino individuals comprise 56% of the U.S. incarcerated population but only 32% of the overall U.S. population. 43 This means not only that police officers are more likely to employ surveillance or facial recognition programs to compare images of Black and Latino individuals, but also that mugshot images or arrest records of Black and Latino individuals are more likely to be stored in these databases in the first place—two distinct problems that, when combined, exacerbate existing patterns of racial inequity in policing. 44

Apart from the dual challenges of accuracy and transparency, there remains an ethical question of whether, or when, it is appropriate to use facial recognition to address legitimate security concerns, regardless of its accuracy. Even if facial recognition hypothetically could improve to a point where the technology itself has near-perfect accuracy rates across all demographic groups, it would still be possible for law enforcement officers to apply it in ways that replicate existing racial disparities in their outcomes. When the European Parliament voted in October 2021 in favor of a non-binding resolution to prevent the mass police use of facial recognition in public places within the European Union (EU), it acknowledged this dilemma: “AI applications may offer great opportunities in the field of law enforcement…thereby contributing to the safety and security of EU citizens, while at the same time they may entail significant risks for the fundamental rights of people.” 45 Even if facial recognition is not fully banned from use in criminal justice, the institution of guardrails is a positive step toward more equitable use of enhanced surveillance technologies. Any guardrails will need to consider the contexts in which the technology is appropriate, such as the European Commission’s draft Artificial Intelligence Act, which would restrict law enforcement’s use of “real-time” facial recognition surveillance in public places to more “serious” situations like threats to physical safety, missing victims, or certain “criminal” offenses, and would direct law enforcement officers to take into account the nature and potential consequences of the crime before using facial recognition within the EU. 46 Weighing the need for both privacy and public safety, we now examine the existing legal guardrails that govern surveillance in law enforcement—and where gaps in privacy protections remain.

The application of existing privacy and surveillance safeguards in the context of law enforcement

The U.S. government has long acknowledged that surveillance cannot be unlimited. There must be some safeguards to prevent any privacy abuses by the government or private entities, as a matter of fundamental rights. To that end, federal, state, and local governments have enshrined privacy values into law—in certain contexts—through layers of constitutional principles, limited statutes, and court cases. However, new technology significantly shifts the traditional balance between surveillance and civil liberties, and the existing patchwork of laws may not be enough to prevent the risks stemming from facial recognition and other technologies. 47 As such, it is necessary to take stock of existing privacy safeguards and identify areas of improvement. Samuel Warren and Louis Brandeis described this phenomenon in their famous 1890 Harvard Law Review article: “That the individual shall have full protection in person and in property is a principle as old as the common law; but it has been found necessary from time to time to define anew the exact nature and extent of such protection.” 48

(A) How the law addresses government surveillance

In the United States, privacy principles can trace their roots to the Constitution. 49 Although the Fourth Amendment prohibits the government from conducting “unreasonable” searches and generally requires probable cause to obtain a warrant, law enforcement officers can still collect data through other means, such as by purchasing personal information from data brokers or gathering it in public places where people do not possess a “reasonable expectation of privacy.” 50 Yet, even the Supreme Court has acknowledged, in certain cases, that the amplifying effect of technology on surveillance may require a reexamination of Fourth Amendment limitations in public places. 51 Although police officers can physically search people’s vehicles incident to an arrest, the Court ruled in Riley v. California (2014) that they cannot search a person’s smartphone without a warrant—acknowledging that smartphones are “a pervasive and insistent part of daily life … unheard of ten years ago” and that the modern scope of data collection “calls for a new balancing of law enforcement and privacy interests.” 52 Citing Riley, the Court held in Carpenter v. United States (2018) that the government also requires a warrant to compel cell phone service providers to turn over geolocation records, pointing to the “seismic shifts in digital technology that made possible the tracking of not only Carpenter’s location but also everyone else’s.” 53 Despite the majority opinions in Riley and Carpenter, there are limitations to the Supreme Court’s ability to preserve privacy principles through judicial interpretation alone. In his dissent in Carpenter, Justice Anthony Kennedy wrote that the government’s access to cell phone location records does not constitute a search under the Fourth Amendment, and that individuals do not have a reasonable expectation of privacy in records controlled by a cell phone company. In another case, Florida v. Riley (1989), the Supreme Court held that police officers could fly a helicopter 400 feet above a greenhouse without a search warrant—even though the interior of the building would not have been visible without aerial surveillance—because people do not have a reasonable expectation of privacy in activity that other helicopters could legally observe from public airspace at that height. 54 While the Supreme Court has heard several major cases on geolocation technologies, there is still legal and social uncertainty around surveillance technologies like facial recognition and drones, where judicial history is extremely limited, especially at the highest court. 55 One of the earliest court cases on facial recognition was Lynch v. State (2018), in which the First District Court of Appeal in Florida decided that a Black man named Willie Allen Lynch, who was identified by police through a facial recognition program, was not legally entitled to view the other four erroneous matches that the program returned. 56 The Michigan Court of Appeals recently decided one of the few cases related to drones, Long Lake Township v. Todd Maxon (2021), reversing a lower court and ruling that the government would require a warrant to surveil an individual’s property with a drone. 57 In short, the judicial branch alone cannot manufacture privacy expectations—courts interpret existing law based on the Constitution, statutes, and regulations, but their interpretations depend on the judges or justices who sit on the bench, and it falls on Congress to resolve uncertainties.

In 1986, Congress enacted the Electronic Communications Privacy Act (ECPA)—which includes the Wiretap Act and the Stored Communications Act—to protect Americans against government privacy intrusions into their electronic communications (e.g., stored emails or live telephone conversations). However, the ECPA contains provisions that allow law enforcement to access emails and customer records without a warrant in certain contexts. 58 For example, law enforcement requires a warrant to access an unopened email that has been remotely stored for under 180 days—but after 180 days, it can access that same email with only a subpoena. It can also issue a subpoena to compel companies to turn over non-content user records such as name, address, and payment information. Apart from the ECPA, Executive Order 12333 and Section 702 of the Foreign Intelligence Surveillance Act allow the federal government to engage in warrantless “incidental collection” of the communications of U.S. residents who contact people located outside the United States, contrary to Fourth Amendment protections. 59 Together, these statutes and this executive order grant the U.S. government broad authority to access the electronic communications of Americans, tapping into the massive troves of data that private communications companies store.

Although facial recognition faces few enacted legal restrictions at the federal level, at least seven states and roughly 20 municipalities—including Virginia, Boston, and San Francisco—have established some limitations on government use of facial recognition in certain contexts. 60 For instance, Maine enacted a law in 2021 that generally prohibits government use of facial recognition except in certain cases (e.g., “serious” crimes, identification of missing or deceased individuals, and fraud prevention). 61 The same year, Minneapolis passed an ordinance to prevent the city government from procuring facial recognition technology from third parties (e.g., Clearview AI) or knowingly using information collected through facial recognition, citing the technology’s higher misidentification rates for communities of color and the disproportionate burden of policing that communities of color face. 62 Yet, state and local regulations lack uniformity throughout the country, and the majority of jurisdictions do not have specific legal restrictions on government use of facial recognition.

(B) Protections from private companies

As we describe earlier, the private sector is integral to law enforcement operations; companies like Clearview AI often develop and test the facial recognition tools that are available to law enforcement, or amass large databases that the government may access. Yet, in the absence of a nationwide comprehensive data privacy law, many companies face few legal limitations on how they collect, process, and transfer personal information—allowing Clearview and other companies to gather data from millions of people without clear controls for accessing or deleting their images, and with few safeguards for security, algorithmic bias, and transparency. 63 The Federal Trade Commission (FTC) primarily investigates and enforces data protection on a national level, relying on its authority under Section 5 of the FTC Act to act against entities that engage in “unfair or deceptive acts or practices.” Using this authority, the FTC has entered into consent agreements with companies like Sears (2009), Facebook (2011), Snapchat (2014), and Nomi Technologies (2015) for misrepresenting their privacy policies to their users. 64 However, this enforcement model largely emphasizes user transparency, which has led to a system of “notice and choice,” where companies display a lengthy privacy policy and require users to consent to it before accessing their service. Notice-and-choice does not effectively preserve privacy; companies like Clearview or Amazon’s Ring can still set their own privacy policies—choosing what data they collect, store, and share, and for how long—and given the FTC’s limited authority, the agency has brought only approximately 80 data privacy cases since 2002. 65 Privacy regulations are also disjointed at the state level: only California, Colorado, and Virginia have so far enacted comprehensive data privacy laws that give residents the rights to access and delete the personal information that many companies store.
In addition, five states—Arkansas, California, Illinois, Texas, and Washington—have adopted laws that regulate how private companies treat biometric information, including facial recognition. 66 Companies have approached compliance with diverging state privacy laws in two primary ways: some, like Microsoft, have pledged to voluntarily offer single-state protections (e.g., the right to access personal information) nationwide, while others, such as Clearview AI, offer different privacy settings depending on where a person lives. 67 Clearview’s website currently allows only California residents to access and delete their personal information, while Illinois residents may choose to opt out of search results. 68 Residents of the other 48 states do not receive these same privacy protections; they may submit a request for Clearview to remove search results associated with URLs that have already been deleted from other websites, but they may not delete photos or opt out of search results for links that are still available elsewhere on the internet. Since Clearview does not advertise these controls, however, it is unclear how many individuals are aware of them or have submitted a data request.

Compounding its limited privacy controls, Clearview—along with many other facial recognition companies—does not ask individuals for permission before scraping their images from public sources (e.g., CCTV surveillance cameras, social media platforms, other websites). This problem is widespread: a 2020 GAO report describes a study of 30 datasets used to train facial recognition algorithms since 2006, which found that approximately 24 million photos had been scraped from websites without the consent of the one million individuals photographed. 69

In the end, it is virtually impossible for an individual to fully opt out of facial recognition identification or control the use of their images without abstaining from public areas, the internet, or society altogether.

Since voluntary privacy protections do not apply across the entire industry—some companies offer privacy settings, while others do not—government intervention is necessary to set privacy protections for all U.S. residents, especially those communities most vulnerable to the harmful effects of surveillance.

Proposals to prevent privacy risks of facial recognition and other technologies

As both the government and private corporations feed into the problem of surveillance, gaps in current federal and state privacy laws mean that their actions to collect, use, or share data often go unchallenged. In other words, existing laws do not adequately protect user privacy amid the rising ubiquity of facial recognition and other emerging technologies, fundamentally omitting the needs of the communities of color that disproportionately bear the consequences of surveillance. To reduce the potential for emerging technologies to replicate historical biases in law enforcement, we summarize recent proposals that address racial bias and unequal applications of technology in the public sector. We also explain why U.S. federal privacy legislation is necessary to govern how private sector companies implement fairness in the technical development process, limit their data collection and third-party sharing, and grant more agency to the individuals they surveil.

(A) Direct measures for federal, state, and local law enforcement agencies

Although the executive branch is taking some steps to evaluate its use of artificial intelligence and the equitable distribution of public services, it still lacks government-wide scrutiny of its facial recognition programs and relationships with geolocation data brokers. In October 2021, the White House announced plans to develop an AI Bill of Rights to assert basic principles of civil liberties in technology, referencing the role that facial recognition plays in discriminatory arrests as well as the privacy concerns stemming from data collection. 70 In January 2021, the Biden administration issued an executive order that directed federal agencies to conduct equity assessments to review any obstacles that marginalized communities, including individuals of color, encounter in accessing government services and resources. 71 These are important steps, but equity assessments should be extended to the approximately 42 federal agencies that employ law enforcement officers in some function, appraising the appropriateness of facial recognition, access to geolocation information from data brokers, and the related privacy and civil rights implications for marginalized communities. Short of White House guidance, federal agency review of facial recognition technologies might remain piecemeal; for example, the Internal Revenue Service announced in early February 2022 that it would stop using the facial recognition tool ID.me for identity verification following public outcry, but it is unclear whether other federal agencies that use the software—such as the United States Patent and Trademark Office and Social Security Administration—will choose to do so as well. 72 Federal law enforcement reform could also occur through an act of Congress, and legislators have introduced several bills that propose new guardrails for executive agencies that conduct surveillance.
In March 2021, the House of Representatives passed the George Floyd Justice in Policing Act which, among other provisions, would prohibit federal law enforcement officers from deploying facial recognition in their body cameras or patrol vehicle cameras. 73 The Facial Recognition and Biometric Technology Moratorium Act, which Sen. Ed Markey (D-Mass.) and Rep. Pramila Jayapal (D-Wash.) introduced in June 2021, aims to ban the federal government’s use of biometric surveillance systems unless otherwise authorized by law. 74 The Facial Recognition Technology Warrant Act, which Sens. Chris Coons (D-Del.) and Mike Lee (R-Utah) proposed in 2019 during the previous Congress, included a warrant requirement for federal law enforcement officers to conduct “ongoing” surveillance of an individual in public areas with facial recognition for over 72 hours. 75 In April 2021, Rep. Jerrold Nadler (D-N.Y.) and Sen. Ron Wyden (D-Ore.) introduced the Fourth Amendment Is Not For Sale Act to restrict federal law enforcement’s access to information from “electronic communication services” or “remote computing services” obtained in a way that violates privacy policy agreements or is otherwise deceptive, primarily targeting concerns over the government’s warrantless purchase of geolocation information from data brokers like Venntel or X-Mode. 76 These proposed bills outline some of the existing problems with surveillance oversight: a lack of guardrails and transparency to prevent law enforcement’s abuse of facial recognition and access to geolocation and communications data. Yet, they are not complete fixes. If enacted into law, the Fourth Amendment Is Not For Sale Act could prevent attempts by law enforcement agencies to bypass due process or a probable cause warrant by purchasing communications or location data from private companies—but such a moratorium would be largely conditional on a website’s terms of service or privacy policies. 77 Similarly, the George Floyd Justice in Policing Act, Facial Recognition Technology Warrant Act, and Facial Recognition and Biometric Technology Moratorium Act could address federal law enforcement agencies’ use of facial recognition, but would not affect state and local police officers’ use of the technology. 78

Because state and local governments have jurisdiction over policing in their areas, Congress and the federal executive branch have limited means to improve policing practices everywhere in the United States. 79 Still, as privacy concerns over facial recognition and surveillance grow, more state and local governments and police departments can individually consider measures that specify the contexts in which it is appropriate to use facial recognition and the necessary processes for doing so (e.g., with a probable cause warrant). 80 In 2016, Georgetown Law researchers Clare Garvie, Alvaro Bedoya, and Jonathan Frankle proposed one possible framework for “acceptable uses of facial recognition” by law enforcement; for example, an individual with special training in facial recognition would be permitted to use the software to identify somebody in surveillance camera footage if officers have a “reasonable suspicion” that the person committed a felony. 81 Beyond how to use the technology, such training would promote awareness of the “limitations of facial recognition” and the “appropriateness [of images] for face recognition searches.” 82 Ideally, this would also include an educational foundation in racial bias and the ethics of surveillance for law enforcement officers at the federal, state, and local levels. Brookings researcher Rashawn Ray has also supported training opportunities for state and local law enforcement as part of a holistic approach to increasing accountability around racial profiling. Ray recently testified on this issue before the Virginia Advisory Committee to the U.S. Commission on Civil Rights, describing how police departments can host implicit bias and mental health trainings for officers, invite community members to sit on police oversight or misconduct trial boards, and provide housing stipends to help officers reside in their local communities. 83 Georgetown Law professor Laura Moy has also put forward a comprehensive list of questions that police departments might use to assess their use of surveillance technology, modeled after the racial equity impact assessments used by the Minneapolis Board of Education and others. 84 The proposals by Garvie, Bedoya, Frankle, Ray, and Moy are a valuable starting point for federal, state, and local law enforcement agencies to consider in application—and moreover, they demonstrate a need for police departments to actively work with civil society, academic researchers, and advocacy groups, who can provide input on prioritizing racial equity in police technology.

(B) The role of federal privacy legislation

Although Congress does not oversee state and local police departments, there is one clear-cut action it could take that would have an indirect—yet significant—impact on government surveillance across the nation: pass a comprehensive federal privacy law that regulates the data practices of private companies. Government agencies often purchase or license facial recognition software from private companies, and businesses can either voluntarily share or be legally compelled to disclose large amounts of personal information to law enforcement. 85 Given the general lack of comprehensive privacy regulations in the United States, the private sector provides unprecedented resources that immensely enhance the surveillance capabilities of law enforcement agencies. 86 Should Congress pass a federal privacy law to govern how private companies collect and use data, it would not only increase privacy protections for all Americans but also reduce the possibility of surveillance abuse against communities of color in the law enforcement context. First, Congress could require businesses to allow individuals to access and delete the personal information they hold—allowing anybody to become aware of, and erase, their images in facial recognition databases like Clearview’s, and meaningfully increasing the transparency of data collection. 87 Next, Congress could enshrine common-sense limitations on data collection, storage, and retention for private companies into law—this, in turn, would limit the amount of data that law enforcement agencies could access either voluntarily or through subpoenas or warrants. It should establish baseline principles like data minimization—only allowing private companies to collect, use, and share data in ways that are necessary to the original business purpose—to reduce extraneous data collection and the potential for surveillance.
These principles are not inconceivable in practice: residents of California, Virginia, Colorado, and the European Union already possess similar protections, and pending bills such as Sen. Maria Cantwell’s (D-Wash.) Consumer Online Privacy Rights Act and Sen. Roger Wicker’s (R-Miss.) SAFE DATA Act would extend these provisions to all Americans. 88

But Congress needs to go further than general privacy provisions and adopt additional measures to address facial recognition and biometric information, given their outsized potential to result in disparate impact in the law enforcement context. Federal privacy legislation could advance this objective as well; Congress could direct the Federal Trade Commission to study the impact of biometric information, including algorithmic outcomes, on civil rights in highly sensitive scenarios such as law enforcement. Current federal privacy bills or proposals take different approaches to biometric information—some, such as Sen. Sherrod Brown’s (D-Ohio) draft Data Accountability and Transparency Act of 2021, would ban “data aggregators” from using facial recognition technology altogether, while on the other end of the spectrum, Wicker’s SAFE DATA Act would simply require companies to obtain consent from individuals before processing or sharing biometric information with third parties. 89 A workable solution likely lies somewhere in the middle: clear guardrails on how private companies collect, process, and transfer biometric information, allowing them to use and improve the technology in appropriate contexts while preventing misuse. Congress could direct the FTC to create these regulations, based on the findings of its study and input from civil society.

Legislation could also require businesses that use personal information to develop or deploy algorithms to audit both their products and their outcomes to prevent disparate impact. A number of researchers, such as Dillon Reisman, Jason Schultz, Kate Crawford, and Meredith Whittaker of New York University’s AI Now Institute, have conceptualized “algorithmic impact assessments” to help government agencies or companies evaluate the accuracy, potential community harms or benefits, and risk of bias or discrimination before deploying automated tools. 90 Bills like the Algorithmic Accountability Act, which Rep. Yvette Clarke (D-N.Y.) and Sen. Ron Wyden (D-Ore.) reintroduced in February 2022, would also require companies that deploy AI for critical decisions to document the representativeness of their input datasets, the sources of data collection, any alternatives or considerations to the input data, and their overall methodology. 91 In any framework to evaluate the use of facial recognition or other surveillance tools, impact assessments will be critical to help users and developers audit algorithms for accuracy and racial equity both in development and in the context of application. More importantly, the private sector cannot be the sole arbiter of truth when it comes to the performance of these systems; law enforcement agencies must evaluate products and services to anticipate potential privacy risks and actively examine the inclusivity of datasets and the potential risks of replicating patterns of marginalization.

From this review, it is clear that facial recognition and surveillance technologies have shifted the balance of power toward law enforcement agencies. That is why privacy protections are more important than ever for all Americans—and they are especially so for the communities of color that may suffer the greatest consequences from their absence.

The authors would like to thank Samantha Lai for editing assistance, Emily Skahill for research support, and Cameron Kerry and Darrell West for feedback and comments.

The Brookings Institution is a nonprofit organization devoted to independent research and policy solutions. Its mission is to conduct high-quality, independent research and, based on that research, to provide innovative, practical recommendations for policymakers and the public. The conclusions and recommendations of any Brookings publication are solely those of its author(s), and do not reflect the views of the Institution, its management, or its other scholars.

Amazon, Apple, Facebook, Google, IBM, and Microsoft provide general, unrestricted support to the Institution. The findings, interpretations, and conclusions in this report are not influenced by any donation. Brookings recognizes that the value it provides is in its absolute commitment to quality, independence, and impact. Activities supported by its donors reflect this commitment.


  • Joseph Cox, “​​Tech Firm Offers Cops Facial Recognition to ID Homeless People,” Vice, February 8, 2022, https://www.vice.com/en/article/wxdp7x/tech-firm-facial-recognition-homeless-people-odin . ( Back to top)
  • Jeffrey Dastin, “Amazon Extends Moratorium on Police Use of Facial Recognition Software,” Reuters, May 18, 2021, https://www.reuters.com/technology/exclusive-amazon-extends-moratorium-police-use-facial-recognition-software-2021-05-18/ . ( Back to top)
  • Sara Morrison, “Here’s How Police Can Get Your Data — Even If You Aren’t Suspected of a Crime,” Vox, July 31, 2021, https://www.vox.com/recode/22565926/police-law-enforcement-data-warrant . ( Back to top)
  • Matt O’Brien and Michael Liedtke, “How Big Tech Created a Data ‘treasure Trove’ for Police,” AP News, June 22, 2021, https://apnews.com/article/how-big-tech-created-data-treasure-trove-for-police-e8a664c7814cc6dd560ba0e0c435bf90 . ( Back to top)
  • Sara Morrison, “A Surprising Number of Government Agencies Buy Cellphone Location Data. Lawmakers Want to Know Why,” Vox, December 2, 2020, https://www.vox.com/recode/22038383/dhs-cbp-investigation-cellphone-data-brokers-venntel ; Joseph Cox, “How the U.S. Military Buys Location Data from Ordinary Apps,” Vice, November 16, 2020, https://www.vice.com/en/article/jgqm5x/us-military-location-data-xmode-locate-x . ( Back to top)
  • Jon Keegan and Alfred Ng, “There’s a Multibillion-Dollar Market for Your Phone’s Location Data,” The Markup, September 30, 2021, https://themarkup.org/privacy/2021/09/30/theres-a-multibillion-dollar-market-for-your-phones-location-data ; Byron Tau and Michelle Hackman, “Federal Agencies Use Cellphone Location Data for Immigrant Enforcement,” The Wall Street Journal, February 7, 2020, https://www.wsj.com/articles/federal-agencies-use-cellphone-location-data-for-immigration-enforcement-11581078600 . ( Back to top)
  • Max Rivlin-Nadler, “How ICE uses social media to surveil and arrest immigrants,” The Intercept, December 22, 2019, https://theintercept.com/2019/12/22/ice-social-media-surveillance/ ; “Social media surveillance by Homeland Security Investigations: A threat to immigrant communities and free expression,” Brennan Center for Justice, November 15, 2019, https://www.brennancenter.org/our-work/research-reports/social-media-surveillance-homeland-security-investigations-threat . ( Back to top)
  • Max Rivlin-Nadler, “How ICE Uses Social Media to Surveil and Arrest Immigrants,” The Intercept, December 22, 2019, https://theintercept.com/2019/12/22/ice-social-media-surveillance/ ; Mary Pat Dwyer and José Guillermo Gutiérrez, “Documents Reveal LAPD Collected Millions of Tweets from Users Nationwide,” Brennan Center for Justice, December 15, 2021, https://www.brennancenter.org/our-work/analysis-opinion/documents-reveal-lapd-collected-millions-tweets-users-nationwide . ( Back to top)
  • Matthew Guariglia, “How Are Police Using Drones?” Electronic Frontier Foundation, January 6, 2022, https://www.eff.org/deeplinks/2022/01/how-are-police-using-drones . ( Back to top)
  • Faine Greenwood, “The Chula Vista, California, Police Department’s One-of-a-Kind Drone Program,” Slate Magazine, May 17, 2021, https://slate.com/technology/2021/05/chula-vista-police-drone-program.html . ( Back to top) /li>
  • Dawn Kawamoto, “Cops Wearing Cameras: What Happens When Privacy and Accountability Collide?” GovTech, accessed February 24, 2022, https://www.govtech.com/biz/Cops-Wearing-Cameras-What-Happens-When-Privacy-and-Accountability-Collide.html ; Bryce C. Newell, “Body Cameras Help Monitor Police but Can Invade People’s Privacy,” The Conversation, May 25, 2021, http://theconversation.com/body-cameras-help-monitor-police-but-can-invade-peoples-privacy-160846 ; Jennifer Lee, “Will Body Cameras Help End Police Violence?” ACLU of Washington, June 7, 2021, https://www.aclu-wa.org/story/%C2%A0will-body-cameras-help-end-police-violence%C2%A0 ; German Lopez, “The Failure of Policy Body Cameras,” Vox, July 21, 2017, https://www.vox.com/policy-and-politics/2017/7/21/15983842/police-body-cameras-failures . ( Back to top)
  • Rani Molla, “The Rise of Fear-Based Social Media like Nextdoor, Citizen, and Now Amazon’s Neighbors,” Vox, May 7, 2019, https://www.vox.com/recode/2019/5/7/18528014/fear-social-media-nextdoor-citizen-amazon-ring-neighbors ; Jessi Hempel, “For Nextdoor, Eliminating Racism Is No Quick Fix,” Wired, February 16, 2017, https://www.wired.com/2017/02/for-nextdoor-eliminating-racism-is-no-quick-fix/ . ( Back to top)
  • Rani Molla, “Amazon Ring Sales Nearly Tripled in December despite Hacks,” Vox, January 21, 2020, https://www.vox.com/recode/2020/1/21/21070402/amazon-ring-sales-jumpshot-data ; Thorin Klosowski, “Facial Recognition Is Everywhere. Here’s What We Can Do About It,” The New York Times Wirecutter (blog), July 15, 2020, https://www.nytimes.com/wirecutter/blog/how-facial-recognition-works/ . ( Back to top)
  • Lauren Bridges, “Amazon’s Ring Is the Largest Civilian Surveillance Network the US Has Ever Seen,” The Guardian, May 18, 2021, http://www.theguardian.com/commentisfree/2021/may/18/amazon-ring-largest-civilian-surveillance-network-us ; Rani Molla, “How Amazon’s Ring Is Creating a Surveillance Network with Video Doorbells,” Vox, September 5, 2019, https://www.vox.com/2019/9/5/20849846/amazon-ring-explainer-video-doorbell-hacks . ( Back to top)
  • Kashmir Hill, “Another Arrest, and Jail Time, Due to a Bad Facial Recognition Match,” The New York Times, December 29, 2020, https://www.nytimes.com/2020/12/29/technology/facial-recognition-misidentify-jail.html . ( Back to top)
  • Joy Buolamwini and Timnit Gebru, “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification,” Conference on fairness, accountability and transparency: PMLR, 2018, https://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf . ( Back to top)
  • “NIST Study Evaluates Effects of Race, Age, Sex on Face Recognition Software,” U.S. National Institute of Standards and Technology, December 19, 2019, https://www.nist.gov/news-events/news/2019/12/nist-study-evaluates-effects-race-age-sex-face-recognition-software ; Natasha Singer and Cade Metz, “Many Facial-Recognition Systems Are Biased, Says U.S. Study,” The New York Times, December 19, 2019, https://www.nytimes.com/2019/12/19/technology/facial-recognition-bias.html ; Drew Harwell, “Federal Study Confirms Racial Bias of Many Facial-Recognition Systems, Casts Doubt on Their Expanding Use,” The Washington Post, December 19, 2019, https://www.washingtonpost.com/technology/2019/12/19/federal-study-confirms-racial-bias-many-facial-recognition-systems-casts-doubt-their-expanding-use/ . ( Back to top)
  • “Amazon Rekognition Improves Accuracy of Real-Time Face Recognition and Verification,” Amazon Web Services, April 2, 2018, https://aws.amazon.com/about-aws/whats-new/2018/04/amazon-rekognition-improves-accuracy-of-real-time-face-recognition-and-verification/ ; Brad Smith, “Facial Recognition: It’s Time for Action,” Microsoft On the Issues, December 6, 2018, https://blogs.microsoft.com/on-the-issues/2018/12/06/facial-recognition-its-time-for-action/ . ( Back to top)
  • Jon Porter, “Federal Study of Top Facial Recognition Algorithms Finds ‘Empirical Evidence’ of Bias,” The Verge, December 20, 2019, https://www.theverge.com/2019/12/20/21031255/facial-recognition-algorithm-bias-gender-race-age-federal-nest-investigation-analysis-amazon . ( Back to top)
  • Jennifer Lynch, “Face Off: Law Enforcement Use of Face Recognition Technology,” Electronic Frontier Foundation, February 12, 2018, https://www.eff.org/wp/law-enforcement-use-face-recognition . ( Back to top)
  • “Criminal Justice Fact Sheet,” NAACP, May 24, 2021, https://naacp.org/resources/criminal-justice-fact-sheet . ( Back to top)
  • Laura Moy, “A Taxonomy of Police Technology’s Racial Inequity Problems,” U. Ill. L. Rev. 139 (2021), http://dx.doi.org/10.2139/ssrn.3340898 . ( Back to top)
  • Motion for a European Parliament resolution on artificial intelligence in criminal law and its use by the police and judicial authorities in criminal matters, 2020/2016(INI), European Parliament (adopted 2021), https://www.europarl.europa.eu/doceo/document/A-9-2021-0232_EN.html ?. ( Back to top)
  • The AI Act, COM/2021/206, European Commission (2021), https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:52021PC0206&from=EN . ( Back to top)
  • Anita L. Allen, “Dismantling the ‘Black Opticon’: Privacy, Race, Equity, and Online Data-Protection Reform,” The Yale Law Journal 131, November 16, 2021, https://www.yalelawjournal.org/forum/dismantling-the-black-opticon . ( Back to top)
  • Samuel D. Warren and Louis D. Brandeis, “Right to privacy,” Harv. L. Rev. 4 (1890): 193, https://www.cs.cornell.edu/~shmat/courses/cs5436/warren-brandeis.pdf . ( Back to top)
  • Nicandro Iannacci, “Recalling the Supreme Court’s Historic Statement on Contraception and Privacy,” National Constitution Center, June 7, 2019, https://constitutioncenter.org/blog/contraception-marriage-and-the-right-to-privacy . ( Back to top)
  • Elizabeth Goitein, “The government can’t seize your digital data. Except by buying it,” The Washington Post, April 26, 2021, https://www.washingtonpost.com/outlook/2021/04/26/constitution-digital-privacy-loopholes-purchases/ . ( Back to top)
  • Caitlin Chin, “Highlights: Setting Guidelines for Facial Recognition and Law Enforcement,” The Brookings Institution (blog), December 9, 2019, https://www.brookings.edu/blog/techtank/2019/12/09/highlights-setting-guidelines-for-facial-recognition-and-law-enforcement/ . ( Back to top)
  • Riley v. California, 573 U.S. 373 (2014). https://www.supremecourt.gov/opinions/13pdf/13-132_8l9c.pdf . ( Back to top)
  • Carpenter v. United States, 585 U.S. __ (2018). https://www.supremecourt.gov/opinions/17pdf/16-402_h315.pdf . ( Back to top)
  • Florida v. Riley, 488 U.S. 445 (1989). https://supreme.justia.com/cases/federal/us/488/445/ . ( Back to top)
  • Rebecca Darin Goldberg, “You Can See My Face, Why Can’t I? Facial Recognition and Brady,” Columbia Human Rights Law Review, April 12, 2021, http://hrlr.law.columbia.edu/hrlr-online/you-can-see-my-face-why-cant-i-facial-recognition-and-brady/ . ( Back to top)
  • Willie Allen Lynch v. State of Florida (2018). https://cases.justia.com/florida/first-district-court-of-appeal/2018-16-3290.pdf?ts=1545938765 ; Aaron Mak, “Facing Facts,” Slate, January 25, 2019, https://slate.com/technology/2019/01/facial-recognition-arrest-transparency-willie-allen-lynch.html . ( Back to top)
  • Long Lake Township v. Todd Maxon and Heather Maxon (2021). https://www.courts.michigan.gov/siteassets/case-documents/uploads/OPINIONS/FINAL/COA/20210318_C349230_47_349230.OPN.PDF ; Matthew Feeney, “Does the 4 th Amendment Prohibit Warrantless Drone Surveillance?” Cato Institute, March 24, 2021, https://www.cato.org/blog/does-4th-amendment-prohibit-warrantless-drone-surveillance . ( Back to top)
  • “Electronic Communications Privacy Act (ECPA),” Electronic Privacy Information Center, accessed February 24, 2022, https://epic.org/ecpa/ . ( Back to top)
  • Elizabeth Goitein, “How the CIA Is Acting Outside the Law to Spy on Americans,” Brennan Center for Justice, February 15, 2022, https://www.brennancenter.org/our-work/analysis-opinion/how-cia-acting-outside-law-spy-americans ; “‘Incidental,’ Not Accidental, Collection,” Electronic Frontier Foundation, October 2, 2017, https://www.eff.org/pages/Incidental-collection . ( Back to top)
  • “States Push Back Against Use of Facial Recognition by Police,” US News, May 5, 2021, https://www.usnews.com/news/politics/articles/2021-05-05/states-push-back-against-use-of-facial-recognition-by-police ; “General FR / Surveillance Regulation,” NYU School of Law, Policing Project, accessed September 24, 2022, https://www.policingproject.org/general-regulations . ( Back to top)
  • “Maine Enacts Strongest Statewide Facial Recognition Regulations in the Country,” American Civil Liberties Union, June 30, 2021, https://www.aclu.org/press-releases/maine-enacts-strongest-statewide-facial-recognition-regulations-country . ( Back to top)
  • Kim Lyons, “Minneapolis Prohibits Use of Facial Recognition Software by Its Police Department,” The Verge, February 13, 2021, https://www.theverge.com/2021/2/13/22281523/minneapolis-prohibits-facial-recognition-software-police-privacy . ( Back to top)
  • Cameron F. Kerry, “Why Protecting Privacy Is a Losing Game Today—and How to Change the Game,” The Brookings Institution (blog), July 12, 2018, https://www.brookings.edu/research/why-protecting-privacy-is-a-losing-game-today-and-how-to-change-the-game/ . ( Back to top)
  • “Sears Settles FTC Charges Regarding Tracking Software,” Federal Trade Commission, June 4, 2009, https://www.ftc.gov/news-events/press-releases/2009/06/sears-settles-ftc-charges-regarding-tracking-software ; “Facebook Settles FTC Charges That It Deceived Consumers By Failing To Keep Privacy Promises,” Federal Trade Commission, November 29, 2011, https://www.ftc.gov/news-events/press-releases/2011/11/facebook-settles-ftc-charges-it-deceived-consumers-failing-keep ; “FTC Approves Final Order Settling Charges Against Snapchat,” Federal Trade Commission, December 31, 2014, https://www.ftc.gov/news-events/press-releases/2014/12/ftc-approves-final-order-settling-charges-against-snapchat ; “Retail Tracking Firm Settles FTC Charges It Misled Consumers About Opt Out Choices,” Federal Trade Commission, April 23, 2015, https://www.ftc.gov/news-events/press-releases/2015/04/retail-tracking-firm-settles-ftc-charges-it-misled-consumers . ( Back to top)
  • Cameron F. Kerry and Caitlin Chin, “Hitting Refresh on Privacy Policies: Recommendations for Notice and Transparency,” The Brookings Institution (blog), January 6, 2020, https://www.brookings.edu/blog/techtank/2020/01/06/hitting-refresh-on-privacy-policies-recommendations-for-notice-and-transparency/ ; “Federal Trade Commission 2020 Privacy and Data Security Update,” Federal Trade Commission, 2020, https://www.ftc.gov/system/files/documents/reports/federal-trade-commission-2020-privacy-data-security-update/20210524_privacy_and_data_security_annual_update.pdf . ( Back to top )
  • Christopher Ward and Kelsey C. Boehm, “Developments in Biometric Information Privacy Laws,” Foley & Lardner LLP (blog), June 17, 2021, https://www.foley.com/en/insights/publications/2021/06/developments-biometric-information-privacy-laws . ( Back to top )
  • Julie Brill, “Microsoft Will Honor California’s New Privacy Rights throughout the United States,” Microsoft On the Issues (blog), November 11, 2019, https://blogs.microsoft.com/on-the-issues/2019/11/11/microsoft-california-privacy-rights/ . ( Back to top )
  • “Privacy & Requests,” Clearview AI, accessed February 24, 2022, https://www.clearview.ai/privacy-and-requests . ( Back to top )
  • “Facial Recognition Technology: Privacy and Accuracy Issues Related to Commercial Uses, U.S. Government Accountability Office, July 13, 2020, https://www.gao.gov/products/gao-20-522 . ( Back to top )
  • Eric Lander and Alondra Nelson, “ICYMI: WIRED (Opinion): Americans Need a Bill of Rights for an AI-Powered World,” The White House Office of Science and Technology (blog), October 22, 2021, https://www.whitehouse.gov/ostp/news-updates/2021/10/22/icymi-wired-opinion-americans-need-a-bill-of-rights-for-an-ai-powered-world/ . ( Back to top )
  • “Executive Order On Advancing Racial Equity and Support for Underserved Communities Through the Federal Government,” The White House, January 20, 2021, https://www.whitehouse.gov/briefing-room/presidential-actions/2021/01/20/executive-order-advancing-racial-equity-and-support-for-underserved-communities-through-the-federal-government/ . ( Back to top )
  • “IRS announces transition away from use of third-party verification involving facial recognition,” Internal Revenue Service, February 7, 2022, https://www.irs.gov/newsroom/irs-announces-transition-away-from-use-of-third-party-verification-involving-facial-recognition ; Alan Rappeport, “I.R.S. Will Allow Taxpayers to Forgo Facial Recognition Amid Blowback,” The New York Times, February 21, 2022, https://www.nytimes.com/2022/02/21/us/politics/irs-facial-recognition.html ; Rachel Metz, “IRS Halts Plans to Require Facial Recognition For Logging In To User Accounts,” CNN Business, February 7, 2022, https://www.cnn.com/2022/02/07/tech/irs-facial-recognition-idme/index.html . ( Back to top )
  • George Floyd Justice in Policing Act of 2021, H.R. 1280, 117 th Congress (2021-2022), https://www.congress.gov/bill/117th-congress/house-bill/1280/text . ( Back to top )
  • Facial Recognition and Biometric Technology Moratorium Act of 2021, S. 2052, 117 th Congress (2021-2022), https://www.congress.gov/bill/117th-congress/senate-bill/2052/text . ( Back to top )
  • Facial Recognition Technology Warrant Act of 2019, S. 2878, 116 th Congress (2019-2020), https://www.congress.gov/bill/116th-congress/senate-bill/2878/text . ( Back to top )
  • Fourth Amendment Is Not For Sale Act, S. 1265, 117 th Congress (2021-2022), https://www.congress.gov/bill/117th-congress/senate-bill/1265/text . ( Back to top )
  • Sara Morrison, “Here’s How Police Can Get Your Data — Even If You Aren’t Suspected of a Crime,” Vox, July 31, 2021, https://www.vox.com/recode/22565926/police-law-enforcement-data-warrant . ( Back to top )
  • Daniel E. Bromberg and Étienne Charbonneau, “Americans Want Police to Release Body-Cam Footage. But There’s a Bigger Worry,” The Washington Post, May 5, 2021, https://www.washingtonpost.com/politics/2021/05/05/americans-want-police-release-bodycam-footage-theres-bigger-worry/ . ( Back to top )
  • “State and Local Government,” The White House, accessed February 24, 2022, https://www.whitehouse.gov/about-the-white-house/our-government/state-local-government/ ; Alexis Karteron, “Congress Can’t Do Much about Fixing Local Police – but It Can Tie Strings to Federal Grants,” The Conversation, June 1, 2021, http://theconversation.com/congress-cant-do-much-about-fixing-local-police-but-it-can-tie-strings-to-federal-grants-159881 . ( Back to top )
  • Caitlin Chin, “Highlights: Setting Guidelines for Facial Recognition and Law Enforcement,” The Brookings Institution (blog), December 9, 2019, https://www.brookings.edu/blog/techtank/2019/12/09/highlights-setting-guidelines-for-facial-recognition-and-law-enforcement/ . ( Back to top )
  • Clare Garvie, Alvaro Bedoya, and Jonathan Frankle, “The Perpetual Line-Up: Unregulated Police Face Recognition in America,” Georgetown Law, Center on Privacy & Technology, October 18, 2016, https://www.perpetuallineup.org/appendix/model-police-use-policy . ( Back to top )
  • Ibid. ( Back to top)
  • Rashawn Ray, “Policy Steps for Racially-Equitable Policing,” Testimony before the Virginia Advisory Committee to the U.S. Commission on Civil Rights, July 16, 2021, https://www.brookings.edu/testimonies/policy-steps-for-racially-equitable-policing/ . ( Back to top )
  • Laura Moy, “A Taxonomy of Police Technology’s Racial Inequity Problems,” U. Ill. L. Rev. 139 (2021), http://dx.doi.org/10.2139/ssrn.3340898 . ( Back to top )
  • ”Cooperation or Resistance?: The Role of Tech Companies in Government Surveillance,” 131 Harv. L. Rev. 1715, 1722 (2018), https://harvardlawreview.org/2018/04/cooperation-or-resistance-the-role-of-tech-companies-in-government-surveillance/ . ( Back to top )
  • Angel Diaz, “Law Enforcement Access to Smart Devices,” Brennan Center for Justice, December 21, 2020, https://www.brennancenter.org/our-work/research-reports/law-enforcement-access-smart-devices . ( Back to top )
  • Cameron F. Kerry, John B. Morris, Jr., Caitlin Chin, and Nicol Turner Lee, “Bridging the gaps: A path forward to federal privacy legislation,” The Brookings Institution, June 3, 2020, https://www.brookings.edu/research/bridging-the-gaps-a-path-forward-to-federal-privacy-legislation/ . ( Back to top )
  • Cathy Cosgrove and Sarah Rippy, “Comparison of Comprehensive Data Privacy Laws in Virginia, California and Colorado,” International Association of Privacy Professionals, July 2021, https://iapp.org/media/pdf/resource_center/comparison_chart_comprehensive_data_privacy_laws_virginia_california_colorado.pdf ; General Data Protection Regulation (2016) https://gdpr-info.eu/ ; Consumer Online Privacy Rights Act, S. 3195, 117 th Congress (2021-2022), https://www.congress.gov/bill/117th-congress/senate-bill/3195 ; SAFE DATA Act, S. 2499, 117 th Congress (2021-2022), https://www.congress.gov/bill/117th-congress/senate-bill/2499 . ( Back to top )
  • “Brown Releases New Proposal That Would Protect Consumers’ Privacy from Bad Actors,” Sherrod Brown, U.S. Senator for Ohio, June 18, 2020, https://www.brown.senate.gov/newsroom/press/release/brown-proposal-protect-consumers-privacy ; SAFE DATA Act, S. 2499, 117 th Congress (2021-2022), https://www.congress.gov/bill/117th-congress/senate-bill/2499 . ( Back to top )
  • Dillon Reisman, Jason Schultz, Kate Crawford, and Meredith Whittaker, “Algorithmic impact assessments: A practical framework for public agency accountability,” AI Now Institute, 2018, https://ainowinstitute.org/aiareport2018.pdf . ( Back to top )
  • “Wyden, Booker and Clarke Introduce Algorithmic Accountability Act of 2022 To Require New Transparency And Accountability For Automated Decision Systems,” Ron Wyden, U.S. Senator for Oregon, February 3, 2022, https://www.wyden.senate.gov/news/press-releases/wyden-booker-and-clarke-introduce-algorithmic-accountability-act-of-2022-to-require-new-transparency-and-accountability-for-automated-decision-systems . ( Back to top )



Delaware Journal of Public Health, Vol. 6, No. 2 (July 2020)


The Power of Public Health Surveillance

No emergency has battered Delaware to such public health, economic, social, and emotional extremes as the one presented by coronavirus disease 2019 (COVID-19). Strict disease mitigation strategies were led by Governor John Carney’s March 22, 2020 State of Emergency declaration, which closed non-essential businesses and schools and included a Stay-at-Home order. As of June 11, 2020, the state is experiencing fewer hospitalizations and deaths due to COVID-19. The decreasing trends in the percentage of positive COVID-19 cases and hospitalizations 1 resulted from many statewide infection control measures, such as closures of non-essential businesses, use of face coverings, social distancing, general hand hygiene, and community testing. As Delaware reopens in phases, the Delaware Department of Health and Social Services, Division of Public Health (DPH) – the state’s lead health agency – is conducting public health surveillance. Case investigations and contact tracing have reduced disease transmission by identifying those needing isolation or quarantine. These measures will continue as our society moves toward normalcy.

Public Health Approach

Public health issues are diverse and dynamic, spanning significant threats such as infectious diseases, chronic diseases, emergencies, injuries, and environmental health problems. 2 A public health concern should be addressed with one consistent approach, regardless of the type of event, much as disaster management uses an all-hazards response (Figure 1).

Figure 1. A Public Health Approach 2

A potential public health problem can be identified using surveillance systems to monitor health events and behaviors within communities and populations. Once a problem is identified, the risk factors leading to it – human behaviors, environmental factors, medical conditions, and social determinants – are evaluated. Interventions are then considered that address the problem directly or target its associated risk factors indirectly. For example, during COVID-19, risk factors for increased transmission and complications from disease include unknown personal status of infection or exposure, non-compliance with isolation or quarantine, inability to social distance in the home environment, chronic medical conditions, and barriers to accessing testing. Interventions include near-real-time notification of positive cases, identification of and outreach to their close contacts, active monitoring of those isolated or quarantined, hotel accommodations for those who cannot comply with social distancing at home, focused public messaging urging those with chronic medical conditions to follow stay-at-home orders, and community testing sites that accommodate vulnerable populations. The final step is to implement interventions and evaluate their effectiveness.

Public Health Core Sciences

Public health requires expertise and resources to successfully address public health problems using scientific methods. Public health surveillance methods monitor a public health situation. Epidemiology is the study of the distribution and determinants of health-related states among specified populations, and the application of that study to the control of health problems. Epidemiologists work closely with laboratories to identify cases through testing. Given the vast amount of data involved in public health surveillance and investigations, public health informatics is critical, extending beyond timely data management to the conceptualization, design, development, deployment, refinement, maintenance, and evaluation of communication, surveillance, information, and learning systems relevant to public health. 3 Prevention effectiveness studies provide information to support decision-making about intervention options. These five elements collaboratively guide DPH in its approach to public health issues.

Public Health Surveillance

Public health surveillance is the ongoing, systematic collection, analysis, and interpretation of health-related data essential to the planning, implementation, and evaluation of public health practice, closely integrated with the timely dissemination of these data to those responsible for prevention and control. The effectiveness of surveillance has been documented as far back as 1854, when Dr. John Snow, referred to by many as the “father of field epidemiology,” collected information from hospital and public records to determine that contaminated water from the Broad Street pump was the cause of the cholera outbreak in Soho, London. The goal of public health surveillance is to provide information for public health personnel, government leaders, and the public to guide public health policy and programs. Uses of public health surveillance include identifying patients and their contacts for treatment and intervention of infectious diseases; detecting epidemics, health problems, and changes in health behaviors; estimating the magnitude and scope of health problems; measuring trends and characterizing disease; monitoring changes in infectious and environmental agents; assessing the effectiveness of programs and control measures; developing hypotheses; and stimulating research.

The Delaware Electronic Reporting and Surveillance System is the state-based electronic surveillance system that receives information about significant public health concerns from community partners such as hospitals, health care providers, and laboratories. DPH is directly responsible for all case investigation and contact tracing for infectious disease cases of significant public health concern. Although public health surveillance may conflict with individual liberties, public welfare must be balanced against individual needs through laws and regulations that allow the state health officer to mandate the reporting of specific diseases or conditions. It is important that the surveillance system be effective, with attributes such as usefulness, data quality, timeliness, flexibility, simplicity, stability, sensitivity, predictive value positive, representativeness, and acceptability.
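Two of the attributes above, sensitivity and predictive value positive, are quantitative: once a surveillance system is evaluated against a gold standard (e.g., confirmed case counts), they follow from standard epidemiologic definitions. The sketch below is illustrative only; the counts are hypothetical, not DPH evaluation data.

```python
# Illustrative calculation of two standard surveillance-system attributes
# from a gold-standard evaluation. All counts are hypothetical.

def sensitivity(true_positives: int, false_negatives: int) -> float:
    """Fraction of actual cases that the surveillance system detected."""
    return true_positives / (true_positives + false_negatives)

def predictive_value_positive(true_positives: int, false_positives: int) -> float:
    """Fraction of reported cases that were actual cases."""
    return true_positives / (true_positives + false_positives)

# Hypothetical evaluation: 90 true cases detected, 10 true cases missed,
# and 30 reports that turned out not to be cases.
print(sensitivity(90, 10))                # 0.9
print(predictive_value_positive(90, 30))  # 0.75
```

A system can score high on one attribute and low on the other: casting a wider case definition raises sensitivity but typically lowers predictive value positive, which is why both are assessed together.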

There are two main categories of surveillance: passive and active. Passive surveillance relies on health care partners to report diseases and conditions to DPH. Although this method is simple and inexpensive, it can be limited by incompleteness of reporting based on participation and variability in data quality. Active surveillance ensures more complete reporting of diseases and conditions, as DPH directly contacts health care providers and/or patients for case information. This method is used in conjunction with specific epidemiologic investigations for an identified disease or event.

To target a specific geographic area or population, DPH partners with specific health professionals to conduct sentinel surveillance. This type of public health surveillance collects data from a smaller selected group of health care providers, known as sentinel providers. Data collected and reported by sentinel providers are used to identify and quantify health events that may occur among high-risk populations and provide situational awareness regarding a health event in the larger population or geographic area. Delaware’s COVID-19 sentinel surveillance serves as a tool to describe and monitor the spread of the virus in vulnerable populations across the state, with an emphasis on mitigating the spread of the virus by identifying individuals with mild or asymptomatic infection. Sentinel surveillance of COVID-19 is an integral component of Delaware’s Reopening Plan. The COVID-19 sentinel provider network consists primarily of Federally Qualified Health Centers and other health care providers serving vulnerable populations, as well as long-term care facilities.

Surveillance may monitor for symptoms rather than provider-diagnosed or laboratory-confirmed cases for more timely data collection to detect, understand, and monitor health events. Known as syndromic surveillance, an example of this approach is using the Influenza-like Illness Surveillance Network (ILINet) to track cases of influenza-like illness to guide public health activity. Delaware’s COVID-19 sentinel surveillance builds on ILINet, a program conducted by the U.S. Centers for Disease Control and Prevention (CDC) and state health departments to collect influenza surveillance data from volunteer sentinel health care providers. Providers who participate in the ILINet program collect and report information about the level of influenza-like illness (ILI) currently seen in their practices. Data reported by ILINet providers, in combination with other influenza surveillance data, provide a national picture of influenza and ILI activity in the U.S. There are more than 2,900 ILINet sentinel providers in all 50 states, Puerto Rico, the District of Columbia, and the U.S. Virgin Islands. The advantages of using syndromic surveillance are reduced reporting burden, more timely and complete information, consistently applied criteria (e.g., the CDC case definition), and year-round monitoring. Using symptoms for early detection allows DPH to quickly initiate public health investigations and infection control measures. For example, certain diseases such as influenza or those associated with bioterrorism may not require a laboratory-confirmed diagnosis for initial treatment.

Overall, the surveillance process involves data collection, data analysis, data interpretation, data dissemination, and link to action. However, before committing to data collection, the surveillance goal must be determined. There are many data sources for public health surveillance, including provider reports of laboratory-confirmed cases or suspected syndromic cases, electronic health records such as the DHIN health information exchange platform, vital statistics records such as death certificates, health registries such as the Delaware Immunization Registry (DelVAX), and surveys. Data analysis and interpretation are closely linked; interpreting investigative information such as the person, place, and time of the case can more easily determine how and why the health event happened. Data dissemination is directed by the target audiences. For instance, health alerts inform clinicians and other health care providers, whereas press releases and social media are for the general public. Surveillance efforts must lead to an action or response, including a description of the disease burden or potential; the monitoring of trends and patterns in disease, risk factors, and agents; the detection of sudden changes in disease occurrence and distribution; the provision of data for programs, policies, and priorities; and an evaluation of prevention and control efforts. Data without a plan of action do not justify the resources invested into the initial data collection.

Public Health Laboratory Role in Surveillance

DPH’s Delaware Public Health Laboratory (DPHL) has a critical role in disease surveillance programs that focus on identifying diseases in state populations. DPHL tests collected samples to identify newly emerging or recurring disease outbreaks, delivering results through shared networks used by the CDC and other state public health laboratories. Historically, DPHL has developed and implemented systems that can be quickly activated in response to critical needs related to public health surveillance. Generally, this is done by facilitating data production (test results) to assess high risk groups without causing laboratory system overloads.

Developing testing methods, validating methodology for reliability, and setting up sensitive laboratory instrumentation takes anywhere from a few weeks to over 12 months, depending on the complexity and level of testing needed. Once the methodology is validated and determined reliable, efforts turn to the automation of results and the production of data. The ability to optimize turn-around times (TAT) and produce accurate data is critical to public health community response efforts. Throughout the COVID-19 pandemic, DPHL has served as a primary testing laboratory for hospitals and clinics that identified COVID-19 patients. Once DPHL developed methods and established reliability, it achieved a turnaround time for results within 24 hours of receiving the specimen. DPHL was the first laboratory in Delaware to verify CDC’s diagnostic method for detecting SARS-CoV-2 (SC2), the virus that causes COVID-19. This test calls for a highly complex polymerase chain reaction (PCR) assay that can be performed only by federally certified laboratorians. This highly sensitive process involves amplifying (making copies of) targeted viral RNA strands to identify SC2. The amplification process is repeated until enough product is present to allow for a detectable fluorescent response. Once the response is detectable, laboratory instruments measure the intensity of fluorescence to produce the final test results.
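The amplification-to-fluorescence logic described above can be sketched numerically. The function below illustrates only the exponential math; it is not DPHL's actual assay, and the copy counts, detection threshold, and efficiency parameter are hypothetical.

```python
def cycles_to_threshold(initial_copies, threshold_copies, efficiency=1.0):
    """PCR cycles needed before the copy count crosses the detection
    threshold, assuming each cycle multiplies copies by (1 + efficiency);
    an efficiency of 1.0 means perfect doubling per cycle."""
    copies = float(initial_copies)
    cycles = 0
    while copies < threshold_copies:
        copies *= 1.0 + efficiency
        cycles += 1
    return cycles

# A specimen with little viral RNA needs more amplification cycles before
# fluorescence becomes detectable than one with a high viral load.
low_load = cycles_to_threshold(10, 1e9)     # few starting copies
high_load = cycles_to_threshold(1e5, 1e9)   # many starting copies
print(low_load, high_load)
```

This exponential relationship is why real-time PCR instruments report a cycle-threshold (Ct) value: fewer starting copies mean more doublings, and therefore more cycles, before the fluorescent signal crosses the detection line.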

Over the last few months, the need for high-throughput automated systems became more apparent based on the projected demand for testing. Also, DPHL’s ability to re-designate instrumentation to alternative methods when needed was critical to the surveillance response as the demand for testing increased. Within this year, DPHL plans to transition to data production using sophisticated instrumentation such as the Illumina MiSeqs for “Next Generation Sequencing” to provide for more comprehensive and retrospective data centered on the identity and behavior of epidemic and pandemic organisms. The goal of this initiative is to provide information that can be utilized to better target epidemiological surveillance investigations.

Contact Tracing

Case investigation and contact tracing are critical components to prevent further spread of infectious diseases such as COVID-19. These methods support patients with suspected or confirmed infection and potential contacts: those who have been exposed to a case or a case’s environment such that they had an opportunity to acquire the infection. Certain high-risk subpopulations, segments of the population with characteristics that increase the risk of infection or severe disease, need to be identified quickly to prevent further spread of disease. As part of the case investigation, contact tracing identifies those with close contact to positive individuals during the infectious period, the period of time during which a case is able to transmit a disease to others, as they are at higher risk of becoming infected and potentially infecting others. Since those exposed may not present with evidence of infection during the incubation period, the interval between invasion by an infectious agent and the appearance of the first signs or symptoms of disease, quarantine is an effective option to limit the spread of disease when implemented prior to the infectious period. DPH’s contact tracers give close contacts of COVID-19-positive persons information about the disease, education about risks and transmission, and recommendations to reduce further spread of disease, including separation from others, self-monitoring of symptoms, and other infection control measures. Identifying contacts early so they do not expose others is vital to limiting community spread, especially given the concern for asymptomatic or pre-symptomatic spread. Decreasing the reproduction number (R0), the average number of people who will contract a disease from one infected case, causes the disease to burn out once each infected case produces fewer than one new infection (see Figure 2). The contact tracing process should also include monitoring for symptoms throughout the quarantine period (14 days for COVID-19).

[Figure 2: djph-62-016-f2.jpg]
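The reproduction-number threshold can be illustrated with a minimal branching-process sketch. The numbers below are hypothetical, not Delaware data: each generation, the expected number of new cases is simply the current case count multiplied by R.

```python
def expected_cases(initial_cases, r, generations):
    """Expected case counts, generation by generation, in a simple
    deterministic branching process with reproduction number r."""
    cases = float(initial_cases)
    history = []
    for _ in range(generations):
        cases *= r  # each case produces r new cases on average
        history.append(cases)
    return history

# If interventions such as contact tracing and quarantine push the average
# number of onward infections below 1, the outbreak shrinks each generation;
# above 1, it grows.
contained = expected_cases(100, 0.8, 5)   # R < 1: cases dwindle
unchecked = expected_cases(100, 2.5, 5)   # R > 1: cases explode
print(contained[-1] < 100, unchecked[-1] > 100)
```

The sketch makes the threshold behavior explicit: the same starting outbreak either fades or grows depending only on whether R is below or above 1.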

A Public Health Approach

As Delaware progresses through its reopening phases, DPH’s surveillance ensures that the public remains safe and healthy. Surveillance allows DPH to provide informed recommendations around a phased reopening approach to best mitigate the risk of re-introducing spread of the virus throughout the community. DPH remains vigilant for any resurgence of cases that could lead to the re-implementation of strict mitigation strategies to contain the infection once again, including the closure of businesses. All efforts ultimately depend on how well Delawareans follow the COVID-19 guidance to prevent disease transmission.


Government Surveillance: A Question Wording Experiment

Court Approval a Big Factor in Public Support


In the wake of leaked information about the government’s telephone and digital surveillance programs last month, public opinion surveys reported a wide range of reactions. For example, a Pew Research Center/Washington Post survey conducted immediately after the revelations found broad support for the program, while a Gallup survey conducted just days later found more disapproval than approval. These, along with a number of other surveys during that period, all made an effort to describe the program as accurately and neutrally as possible, yet different question wording clearly produced different responses.

Four Experimental Treatments of Government Surveillance

To better understand how the manner in which the government’s surveillance program is described affects public evaluations, the Pew Research Center conducted a question wording experiment in a national telephone survey fielded between July 11 and 21, 2013 among 2,002 adults. The survey respondents were asked whether they would favor or oppose a government data collection program, but the wording of four elements of the program were described differently to different groups of respondents. These are: whether metadata or content is being collected; whether phone calls or emails are being monitored; whether the program has court approval; and whether the program is part of anti-terrorism efforts.

Mentioning the role of courts and describing the program as part of anti-terrorism efforts each had a substantial effect on public sentiment. Among the roughly 1,000 respondents who heard the government surveillance program described as occurring “with court approval,” support was 12 points higher than among the other 1,000 who heard no mention of courts. This is consistent with the findings of a separate Pew Research Center survey, which found that people’s impressions of whether or not there is adequate court oversight of the program are more strongly linked to overall support and opposition than are other perceptions.

Mentioning Court Approval, Terrorism Increases NSA Support

Mentioning the goal of fighting terrorism also affects the level of public support. When the surveillance was described as “part of anti-terrorism efforts,” it garnered 9 points more support than when this goal was not mentioned.

Describing the government as collecting metadata, such as the date, time, phone numbers and email addresses, drew more approval than when the program was described as collecting the actual recordings of phone calls or the text of emails.

By contrast, the distinction between a program targeting phone communications and email communications appears to have no effect. There was almost the same reaction when a program was described as monitoring phone calls as when it was described as monitoring email communication.

The combination of these four wording tests produces a total of 16 possible program descriptions, 1 which can be ranked in terms of public support. For simplicity’s sake, since the telephone/email distinction did not elicit different responses, we have removed it here to narrow the range to eight possible descriptions, with roughly 250 respondents (one-eighth of the overall sample) in each group.
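The 2 × 2 × 2 × 2 design, and its collapse to eight descriptions once the phone/email element is dropped, can be enumerated directly. The labels below are paraphrases of the wording elements, not the survey's verbatim text.

```python
from itertools import product

# Paraphrased labels for the four wording elements (hypothetical phrasing,
# not Pew's exact question text).
elements = {
    "data":    ("metadata only", "recordings or text"),
    "channel": ("phone calls", "emails"),
    "courts":  ("with court approval", "no mention of courts"),
    "goal":    ("anti-terrorism efforts", "no mention of terrorism"),
}

# Every combination of one option per element.
all_versions = list(product(*elements.values()))
print(len(all_versions))  # 2 x 2 x 2 x 2 = 16

# Dropping the channel element (it produced no measurable difference)
# leaves eight distinct descriptions.
collapsed = {(data, courts, goal) for data, _, courts, goal in all_versions}
print(len(collapsed))  # 8
```

With roughly 250 respondents per collapsed description out of 2,002 total, each cell is large enough to compare support levels across the eight wordings.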

Views of Government Data Collection

Combined in this way, respondents who heard the program described as collecting only “data such as the date, time, phone numbers and e-mails… with court approval as part of anti-terrorism efforts” were the most supportive: 41% said they would favor this kind of program. By contrast, only 16% favored a program they heard described as collecting recordings of phone calls or the text of emails with no mention of either courts or the goal of fighting terrorism – fully 25 points lower than support when these other considerations are mentioned.

Under every condition in this experiment more respondents oppose than favor the program. This stands in contrast to our new survey in which slightly more approve (50%) than disapprove (44%) of “the government’s collection of telephone and internet data as part of anti-terrorism efforts.” This disparity picks up on another element of question wording that can affect people’s evaluations: whether they are being asked to evaluate a program that is already in place or a program that might be put in place. Because this experiment included varying descriptions of the program that may or may not apply to the existing NSA program, we asked the question in the conditional tense: “Would you favor or oppose the government collecting data…” The other survey, which was intended to test support for the actual program, uses the present tense: “Do you approve or disapprove of the government’s collection of data…” While this distinction may seem minor, it raises a fundamentally different consideration in people’s minds: the difference between something the government might do as opposed to something it is doing.

The experiment found little difference in how partisan and age groups reacted to the different program descriptions. For example, the distinction between monitoring phone vs. email communications had no effect on any age group, while mentioning court approval raised support by similar levels across all age groups.

Similarly, the wording experiments drew similar reactions across partisan lines. Mentioning anti-terrorism efforts drew comparably greater support for the program from Republicans and Democrats alike.

  • There were four concepts tested, each with two different descriptions, combining to form 16 possible combinations (2 x 2 x 2 x 2 = 16).


The Ethics of Surveillance

Introduction to Surveillance

Surveillance is, simply put, the observation and/or monitoring of a person. Coming from the French word for "looking upon," the term encompasses not only visual observation but also the scrutiny of all behavior, speech, and actions. Prominent examples of surveillance include surveillance cameras, wiretaps, GPS tracking, and internet surveillance.

One-way observation is in some ways an expression of control. Just as having a stranger stare at you for an extended period of time can feel uncomfortable and hostile, so too can being under constant surveillance; the difference is that surveillance is often done surreptitiously and at the behest of some authority.

Today's technological capabilities take surveillance to new levels; no longer are spyglasses and "dropping" from the eaves of a roof necessary to observe individuals: the government can and does utilize methods to observe all the behavior and actions of people without the need for a spy to be physically present. Clearly, these advances in technology have a profound impact with regard to the ethics of placing individuals under surveillance; in our modern society, where so many of our actions are observable, recorded, searchable, and traceable, close surveillance is much more intrusive than it has been in the past.

Surveillance and Physical Searches

Particularly interesting about government surveillance is that in the United States surveillance is not held to the same standards of accountability: as the Constitution protects American citizens from unreasonable searches and seizures, physical searches of individuals may not be conducted without a warrant issued by a judge. However, after the passage of FISA and subsequent laws, citizens have not been given the same protection with regard to electronic surveillance. As there have been massive changes in technology and lifestyle since the 1970s, electronic surveillance could be considered much more invasive than a physical search, yet as has been made clear in the legal section of this website, it is in fact much easier for government agents to perform surveillance. Why there is such disparity between these standards is to us a matter of serious concern.

"If you haven't done anything wrong, you have nothing to fear."

This is a typical argument used by governments and other groups to justify their spying activities. Upon cursory inspection, it seems to make sense: as most people are law-abiding citizens, most ostensibly will not be targeted for surveillance and it will not impact their lives, while making their lives more comfortable and safer through the elimination of criminals. Thus, the government's use of closed-circuit television cameras in public spaces, warrantless wiretapping, and library record checks has the potential to save lives from criminals and terrorists with only minimal invasion of its citizens' privacy.

First, as a mental exercise, we ask that the reader consider that these arguments could easily be applied to asking all citizens to carry location tracking devices: doing so would make tracing criminal acts much easier, and it could easily be argued that people refusing to carry these devices only do so because they have something to hide. It is a matter of course that most people in our society would object to this solution, not because they wish to commit any wrongdoings, but because it is invasive and prone to abuse. Now consider that, given current technology, the government already has the ability to track a known target's movements to a reasonable degree, and has easy access to information such as one's purchasing habits, online activities, phone conversations, and mail. Though implementing mandatory location tracking devices for the whole population is certainly more invasive than the above, we argue that current practices are analogous, nearly as extreme, and no more acceptable.

Next, this argument fails to take into consideration a number of important issues when collecting personally identifiable data or recordings: first, that such practices create an archive of information that is vulnerable to abuse by trusted insiders. One example emerged in September 2007, when Benjamin Robinson, a special agent of the Department of Commerce, was indicted for using a government database called the Treasury Enforcement Communications System (TECS) to track the travel patterns of an ex-girlfriend and her family. Records show that he used the system illegally at least 163 times before he was caught (Mark 2007). With the expansion of surveillance, such abuses could become more numerous and more egregious as the amount of personal data collected increases.

In addition, allowing surreptitious surveillance of one form, even limited in scope and for a particular contingency, encourages government to expand such surveillance programs in the future. It is our view that the danger of a "slippery slope" scenario cannot be dismissed as paranoia: as a prominent example, the collection of biometric data has expanded immensely in the past several years. Many schools in the UK collect fingerprints of children as young as six without parental consent (Doward 2006), and fingerprinting in American schools has been widespread since the mid-eighties (NYT National Desk 1983). Now, the discussion has shifted towards DNA collection: British police are now pushing for the DNA collection of children who "exhibit behavior indicating they may become criminals in later life" (Townsend and Asthana 2008), while former New York City mayor Rudy Giuliani has encouraged the collection of DNA data from newborns (Lambert 1998).

When data is collected, whether such data remains used for its stated purpose after its collection has been called into question, even by government officials: the European Data Protection Supervisor has acknowledged that even when two databases of information are created for specific, distinct purposes, in a phenomenon known as 'function creep' they could be combined with one another to form a third with a purpose for which the first two were not built (eGov Monitor Weekly 2006). This non-uniqueness and immutability of information provides great potential for abuse by individuals and institutions.

When is surveillance appropriate?

A. The Means

Harm: does the technique cause unwarranted physical or psychological harm?

Boundary: does the technique cross a personal boundary without permission (whether involving coercion or deception or a body, relational or spatial border)?

Trust: does the technique violate assumptions that are made about how personal information will be treated, such as an expectation that there will be no secret recordings?

Personal relationships: is the tactic applied in a personal or impersonal setting?

Invalidity: does the technique produce invalid results?

B. The Data Collection Context

Awareness: are individuals aware that personal information is being collected, who seeks it and why?

Consent: do individuals consent to the data collection?

Golden rule: would those responsible for the surveillance (both the decision to apply it and its actual application) agree to be its subjects under the conditions in which they apply it to others?

Minimization: does a principle of minimization apply?

Public decision-making: was the decision to use a tactic arrived at through some public discussion and decision making process?

Human review: is there human review of machine generated results?

Right of inspection: are people aware of the findings and how they were created?

Right to challenge and express a grievance: are there procedures for challenging the results, or for entering alternative data or interpretations into the record?

Redress and sanctions: if the individual has been treated unfairly and procedures violated, are there appropriate means of redress? Are there means for discovering violations and penalties to encourage responsible surveillant behavior?

Adequate data stewardship and protection: can the security of the data be adequately protected?

Equality-inequality regarding availability and application: a) is the means widely available, or restricted to only the most wealthy, powerful, or technologically sophisticated? b) within a setting, is the tactic broadly applied to all people, or only to those less powerful or unable to resist? c) if there are means of resisting the provision of personal information, are these equally available, or restricted to the most privileged?

The symbolic meaning of a method: what does the use of a method communicate more generally?

The creation of unwanted precedents: is it likely to create precedents that will lead to its application in undesirable ways?

Negative effects on surveillors and third parties: are there negative effects on those beyond the subject?

Beneficiary: does application of the tactic serve broad community goals, the goals of the object of surveillance or the personal goals of the data collector?

Proportionality: is there an appropriate balance between the importance of the goal and the cost of the means?

Alternative means: are other less costly means available?

Consequences of inaction: where the means are very costly, what are the consequences of taking no surveillance action?

Protections: are adequate steps taken to minimize costs and risk?

Appropriate vs. inappropriate goals: are the goals of the data collection legitimate?

The goodness of fit between the means and the goal: is there a clear link between the information collected and the goal sought?

Information used for original vs. other unrelated purposes: is the personal information used for the reasons offered for its collection and for which consent may have been given and does the data stay with the original collector, or does it migrate elsewhere?

Failure to share secondary gains from the information: is the personal data collected used for profit without permission from, or benefit to, the person who provided it?

Unfair disadvantage: is the information used in such a way as to cause unwarranted harm or disadvantage to its subject?

In general, we feel that surveillance can be ethical, but that there have to exist reasonable, publicly accessible records and accountability for those approving and performing the surveillance in question.


The effect of CCTV on public safety: Research roundup

Updated in 2014, this review of literature on the effectiveness of surveillance cameras against crime includes a 2009 meta-analysis by Northeastern and the University of Cambridge.


This work is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International License.

by Leighton Walter Kille and Martin Maximino, The Journalist's Resource February 11, 2014


Millions of closed-circuit television (CCTV) cameras are installed in streets and businesses throughout the world with the stated goal of reducing crime and increasing public safety. The United Kingdom is one of the most enthusiastic proponents, with an estimated 1.9 million cameras in 2011 — one for every 32 U.K. residents — and the number continues to rise. Chicago reportedly has at least 15,000 cameras installed in one of the largest U.S. networks — which has prompted civil liberties groups to express strong concerns — while in New York, cameras are increasingly found on public transit as well as in businesses and even high-end residences. The 9/11 attacks led many municipalities to start installing CCTV systems, but sometimes what’s put in place goes beyond the original mandate: For example, Oakland, Calif., took $7 million of federal money intended for safeguarding its port and is using it to create a citywide surveillance system instead.

According to industry estimates, the global video surveillance market is expected to grow from $11.5 billion in 2008 to $37.7 billion in 2015. A 2013 New York Times/CBS poll found that 78% of respondents supported the use of surveillance cameras in public places, and authorities tend to point to spectacular successes — for example, crucial images cameras provided of the Boston Marathon bombing suspects or the identification of those responsible for the 2005 London attacks. Still, concerns remain about systems’ potential to violate personal privacy as well as their overall cost-effectiveness. A 2013 Chicago Tribune opinion piece quoted a city spokesman as saying that surveillance cameras helped solve 4,500 crimes over four years, but the writer notes that more than a million crimes are estimated to have taken place over that time period — meaning that the cameras’ contribution was roughly 0.45% at best.

CCTV cameras also have the potential of creating unintended effects, good and bad. The “halo effect” refers to the potential for greater security in areas outside the view of cameras; this could be offset by the “displacement effect,” which pushes antisocial activity to other parts of the city. Cameras could also promote a false sense of security and lead citizens to take fewer precautions, or they could cause more crimes to be reported, and thus lead to a perceived increase in crime. And as with the 2013 revelations of widespread data collection by the U.S. National Security Agency, the indiscriminate gathering of information on law-abiding citizens, however well-intentioned, has the potential for misuse. The Washington Post reported in February 2014 that new aerial video surveillance technologies are being deployed that can monitor virtually everything in an area the size of a small city.

A 2010 document from the European Forum for Urban Security, “Charter for a Democratic Use of Video-Surveillance,” provides a useful overview of the issues at stake as well as a set of principles and tools to ensure that citizens’ rights are respected with CCTV systems. These include:

  • Necessity: The use of camera systems must be justified empirically, ideally by an independent authority. Objectives and intended outcomes must be defined.
  • Proportionality: CCTV equipment must be appropriate for the problem it is intended to address. Technology should “respond to the established objectives, without going further.” Data should be protected, and the length of time it is retained should be clearly defined.
  • Transparency: Citizens should know what the objectives of a CCTV system are, what its installation and operational costs are, the areas being surveyed, and what the results are. Reports should occur regularly so citizens can make informed decisions.
  • Accountability: Those in charge of public CCTV systems should be clearly identified and accountable to the public, whether the systems are run by the government or private firms.
  • Independent oversight: An external body should be charged with ensuring that systems respect the public’s rights and are achieving their stated objectives. Ideally citizens would have a voice in the oversight process.

Below is a selection of studies that shed light on the use of CCTV cameras, in particular their effects on crime. The term “viewshed” is used in many of the studies, and refers to the area visible to cameras from their fixed locations.

——————-

“Analyzing the Influence of Micro-Level Factors on CCTV Camera Effect” Piza, Eric L.; Caplan, Joel M.; Kennedy, Leslie W. Journal of Quantitative Criminology, June 2014, Volume 30, Issue 2, pp. 237-264.

Abstract: “Objectives: Despite the popularity of closed circuit television (CCTV), evidence of its crime prevention capabilities is inconclusive. Research has largely reported CCTV effect as “mixed” without explaining this variance. The current study contributes to the literature by testing the influence of several micro-level factors on changes in crime levels within CCTV areas of Newark, NJ. Methods: Viewsheds, denoting the line-of-sight of CCTV cameras, were units of analysis (N = 117). Location quotients, controlling for viewshed size and control-area crime incidence, measured changes in the levels of six crime categories, from the pre-installation period to the post-installation period. Ordinary least squares regression models tested the influence of specific micro-level factors—environmental features, camera line-of-sight, enforcement activity, and camera design—on each crime category. Results: First, the influence of environmental features differed across crime categories, with specific environs being related to the reduction of certain crimes and the increase of others. Second, CCTV-generated enforcement was related to the reduction of overall crime, violent crime and theft-from-auto. Third, obstructions to CCTV line-of-sight caused by immovable objects were related to increased levels of auto theft and decreased levels of violent crime, theft from auto and robbery. Conclusions: The findings suggest that CCTV operations should be designed in a manner that heightens their deterrent effect. Specifically, police should account for the presence of crime generators/attractors and ground-level obstructions when selecting camera sites, and design the operational strategy in a manner that generates maximum levels of enforcement.”
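The “location quotient” outcome used in this study can be illustrated with a minimal sketch. The function name and example counts below are hypothetical; Piza et al.’s actual measure additionally adjusts for control-area crime incidence:

```python
def location_quotient(viewshed_crimes, viewshed_area, reference_crimes, reference_area):
    """Ratio of crime density inside a camera viewshed to density in a reference area.
    Values above 1 mean crime is over-represented in the viewshed."""
    viewshed_density = viewshed_crimes / viewshed_area
    reference_density = reference_crimes / reference_area
    return viewshed_density / reference_density

# Hypothetical counts: 12 thefts in a 0.5 sq-km viewshed vs. 80 in a 4 sq-km reference area
lq = location_quotient(12, 0.5, 80, 4.0)  # (24 per sq km) / (20 per sq km) = 1.2
```

Comparing a viewshed’s quotient before and after camera installation yields a change measure that is not distorted by differences in viewshed size, which is why the study uses it rather than raw counts.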

“Public Area CCTV and Crime Prevention: An Updated Systematic Review and Meta-Analysis” Welsh, Brandon C.; Farrington, David P. Justice Quarterly, October 2009, Vol. 26, No. 4.

Summary: This meta-analysis examined 93 studies on surveillance systems to see how effective they are at reducing crime and deemed 44 sufficiently rigorous for inclusion. Many of the studies were based in the United Kingdom, while others examined U.S. cities such as Cincinnati and New York. The analysis found that surveillance systems were most effective in parking lots, where their use resulted in a 51% decrease in crime. Systems in other public settings had some effect on crime — a 7% decrease in city centers and in public housing communities, and a 23% drop in public transit systems — but the results weren’t statistically significant. When sorted by country, systems in the United Kingdom accounted for the majority of the decrease; the drop in other countries was insignificant. The study concludes that while surveillance cameras can be effective in specific contexts such as parking lots and public-transit systems, their financial and societal costs warrant further research.
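Meta-analyses of this kind typically pool effects as odds ratios comparing crime in treatment and control areas, and the percentage decreases quoted above correspond roughly to the complement of such a ratio. A minimal, illustrative conversion (not taken from the review itself):

```python
def percent_decrease(treatment_control_ratio):
    """Percent decrease in crime implied by a treatment/control ratio below 1."""
    return (1.0 - treatment_control_ratio) * 100.0

# A pooled ratio of 0.49 corresponds to roughly the 51% parking-lot decrease quoted above
drop = percent_decrease(0.49)
```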

“Here’s Looking at You: An Evaluation of Public CCTV Cameras and Their Effects on Crime and Disorder” McLean, Sarah J.; Worden, Robert E.; Kim, MoonSun. Criminal Justice Review, July 2013. doi: 10.1177/0734016813492415.

Abstract: “We examine the impacts of public surveillance cameras on crime and disorder in Schenectady, N.Y., a medium-sized city in the northeastern United States. We assessed camera impacts by analyzing monthly counts of crime and disorder-related calls for service that occurred within each camera’s 150-foot viewshed as an interrupted time series, with the interruption at the time that the camera in question was activated. We also analyzed counts of incidents between 150 and 350 feet of cameras to assess displacement effects and diffusion of benefits. We further estimated camera effects on counts of only incidents in public locations — street crimes. Our study suggests that cameras have had effects on crime, even more consistent effects on disorder, and that the visibility of cameras is associated with its impact on crime and disorder. We conclude by discussing the implications of the findings and discuss the questions to which future research should be directed.”
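The interrupted time-series logic can be sketched crudely as a before/after comparison of monthly counts. The counts below are hypothetical, and this is a bare-bones simplification; the actual study fits a proper interrupted time-series model and also examines trends, displacement bands, and diffusion of benefits:

```python
# Hypothetical monthly calls-for-service counts within one camera's 150-foot viewshed
pre_activation  = [20, 22, 19, 21, 23, 20, 22, 21, 20, 23, 22, 21]  # 12 months before
post_activation = [15, 16, 14, 17, 15, 16, 14, 15, 16, 15, 14, 16]  # 12 months after

def mean(xs):
    return sum(xs) / len(xs)

# A negative level change suggests fewer incidents after the camera went live
level_change = mean(post_activation) - mean(pre_activation)
```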

“Police-monitored CCTV Cameras in Newark, N.J.: A Quasi-experimental Test of Crime Deterrence” Caplan, Joel M.; Kennedy, Leslie W.; Petrossian, Gohar. Journal of Experimental Criminology, September 2011, Vol. 7, Issue 3, pp. 255-274. doi: 10.1007/s11292-011-9125-9.

Abstract: “Using camera installation sites and randomly selected control sites, [we] assessed the impact of CCTV on the crimes of shootings, auto thefts, and thefts from autos in Newark, N.J., for 13 months before and after camera installation dates. Strategically placed cameras were not any different from randomly placed cameras at deterring crime within their viewsheds; there were statistically significant reductions in auto thefts within viewsheds after camera installations; there were significant improvements to location quotient values for shootings and auto thefts after camera installations. There was no significant displacement and there was a small diffusion of benefits, which was greater for auto thefts than shootings. The system of cameras in Newark is not as efficient as it could be at deterring certain street crimes; some camera locations are significantly more effective than others.”

“CCTV and Crime Displacement: A Quasi-experimental Evaluation” Cerezo, Ana. European Journal of Criminology, March 2013, Vol. 10, No. 2, pp. 222-236. doi: 10.1177/1477370812468379.

Abstract: “The installation of CCTV cameras in the historic centre of Malaga [Spain] in March 2007 was the main crime prevention initiative implemented in the city during the past few years. Using a quasi-experimental design with a pre/post test, we collected data from interviews with CCTV operators, police officers and local authority officials and from surveys of pedestrians and shopkeepers. The team also examined police crime data and CCTV incident data. In this paper we will discuss the results in terms of the following three hypotheses relating to crime reduction, displacement and public security: (a) the use of cameras reduces the levels of crime, whether property crime (robberies and burglaries), crimes against people or both; (b) some of those crimes are displaced to nearby areas within or close to the city centre where there is no camera coverage but where there are similar opportunities to commit crimes; and (c) people claim to feel safer in the city centre after dark after the cameras were introduced.”

“Does CCTV Displace Crime?” Waples, Sam; Gill, Martin; Fisher, Peter. Criminology and Criminal Justice, May 2009, Vol. 9, No. 2, pp. 207-224. doi: 10.1177/1748895809102554.

Abstract: “Crime displacement is a concern often raised regarding situational crime prevention measures. A national evaluation of closed circuit television cameras (CCTV) has provided an interesting test-bed for displacement research. A number of methods have been used to investigate displacement, in particular visualization techniques making use of geographical information systems (GIS) have been introduced to the identification of spatial displacement. Results concur with current literature in that spatial displacement of crime does occur, but it was only detected infrequently. Spatial displacement is found not to occur uniformly across offence type or space, notably the most evident spatial displacement was actually found to be occurring within target areas themselves.”

“Measuring the Crime Displacement and Diffusion of Benefit Effects of Open-street CCTV in South Korea” Park, Hyeon Ho; Oh, Gyeong Seok; Paek, Seung Yeop. International Journal of Law, Crime and Justice, September 2012, Vol. 40, Issue 3, pp. 179-191. doi: 10.1016/j.ijlcj.2012.03.003.

Abstract: “Along with CCTV’s perceived high expectations as crime deterrent, there is also a growing controversy over CCTV’s potentially unexpected limitations. For example, the crime displacement (the presence of CCTV will change the locations of crime and its total number will not change) and the diffusion effects of crime control benefits (the crime prevention effect of CCTV may filter through to neighboring areas) are the representative controversial issues. In this study, we aimed to verify the crime displacement and the diffusion of benefit of open-street CCTV by analyzing the crime tendencies empirically…. The results [of this study] showed that the crime prevention effect of the CCTV was significant. The number of robberies and thefts in the areas with CCTV installed reduced by 47.4%, while the areas without CCTV showed practically no change in the number of crimes. The crime displacement caused by the CCTV was either not found or inconsequential, and the crime rates in the neighboring areas also decreased slightly.”
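Several of the studies above weigh displacement against diffusion of benefits. One common way to quantify that trade-off is Bowers and Johnson’s weighted displacement quotient (WDQ); none of these abstracts states that the WDQ was used, so the sketch below is purely illustrative:

```python
def weighted_displacement_quotient(target0, target1, buffer0, buffer1, control0, control1):
    """Bowers & Johnson's WDQ: change in a buffer zone around the target area,
    relative to change in the target area itself, each normalized by a control area.
    Negative values suggest displacement into the buffer; positive values suggest
    a diffusion of benefits."""
    buffer_change = buffer1 / control1 - buffer0 / control0
    target_change = target1 / control1 - target0 / control0
    return buffer_change / target_change

# Hypothetical counts: crime falls in both target and buffer while the control is flat
wdq = weighted_displacement_quotient(target0=100, target1=60,
                                     buffer0=50, buffer1=45,
                                     control0=200, control1=200)  # 0.125 > 0: diffusion
```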

“Suspiciousness Perception in Dynamic Scenes: A Comparison of CCTV Operators and Novices” Howard, Christina J.; et al. Frontiers in Human Neuroscience, August 2013. doi: 10.3389/fnhum.2013.00441.

Abstract: “How attention is used to perceive and evaluate dynamic, realistic scenes is more poorly understood… We investigated these issues when an observer has a specific, and naturalistic, task: closed-circuit television (CCTV) monitoring. We concurrently recorded eye movements and ratings of perceived suspiciousness as different observers watched the same set of clips from real CCTV footage. Trained CCTV operators showed greater consistency in fixation location and greater consistency in suspiciousness judgments than untrained observers. Training appears to increase between-operator consistency through learning ‘what to look for’ in these scenes.”

“A Prosperous ‘Business’: The Success of CCTV through the Eyes of International Literature” Germain, Séverine. Surveillance & Society, 2013, Vol. 11, Issue 1/2, p. 134.

Abstract: “This article deals with a paradox: Video surveillance becomes widespread, in more and more numerous social and national spaces, while its effects in terms of crime prevention and/or law enforcement and community reassurance are not demonstrated. Through a critical analysis of the international literature on CCTV, this article attempts to identify the reasons advanced to explain the ‘success’ of this technology. Three kinds of approaches, which embody three ways of defining the political and social impact of CCTV, can be distinguished: Surveillance studies, impact analyses and use studies. This paper discusses these works and the answers they bring to the understanding of CCTV development. It claims that micro-level case study analysis allows us to grasp subtly the locally observable mechanisms by which new actors can be enrolled in the device and new legitimizations are made possible.”

Keywords: crime, public safety, CCTV, surveillance, prevention, policing, research roundup

About the Authors


Leighton Walter Kille


Martin Maximino

