Protest is a fundamental feature of democracy, yet protesters have been continuously met with domestic surveillance mechanisms intended to chill public free expression and criminalize lawful behavior. The deployment of privacy-invasive measures against protesters in public spaces has an extensive and storied history—one often rooted in racism and discrimination. And while the technology may have changed, the purpose remains the same: to surveil, identify, and detain dissenters.
This Article discusses the disparity between the surveillance technologies available to protesters and the government, as well as the limits of proposed efforts intended to regulate government surveillance and hold law enforcement accountable. While surveillance mechanisms are often defended as a means of providing equal accountability for everyone, not everyone is truly surveilled equally. Automated law enforcement technologies like those we discuss below, including (I) body-worn cameras, (II) license plate readers, (III) cell-site location services, (IV) drones, and (V) facial recognition technologies, are commonplace means to surveil protesters; yet, often the only mechanisms available to protesters to protect their privacy or countersurveil law enforcement are (VI) a smartphone camera and the power of social media shaming. Although protesters have developed inventive ways to try to shield themselves from unwarranted government intrusion, these new methods do not compare to the sheer number of surveillance options available to law enforcement.
I. Body-Worn Cameras
Anonymous speech is an important feature of public protests; however, this anonymity is challenged when footage captured by body-worn cameras is used to identify and target certain protesters, particularly when such footage is coupled with facial recognition technology. As of 2018, 58 percent of all law enforcement departments in the United States had adopted body-worn cameras, a marked increase from the 25 percent of departments that utilized such cameras in 2013. Though initially intended as an accountability mechanism for law enforcement, the footage from body-worn cameras is largely controlled by government officials, including law enforcement departments and local prosecutors. Body camera footage is increasingly used to garner evidence against civilians, instead of serving its intended purpose of capturing law enforcement use of force. Furthermore, the presence of body-worn cameras does little to prevent law enforcement from using excessive force during civilian interactions, and such cameras rarely capture a direct view of the commission of a crime. A study of Washington, D.C.’s Metropolitan Police Department suggested that police officers equipped with body-worn cameras are as likely to use force as those without them.
There are widespread issues with body-worn cameras being deployed for discriminatory purposes in certain neighborhoods, for certain protests, or against certain defendants. Prosecutors have a long history of hiding or refusing to turn over exculpatory evidence, and there is no formal requirement forcing law enforcement to turn over footage during the plea bargaining process, prior to a criminal trial. For the 95 percent of criminal defendants who decide to accept a plea deal, footage captured by the arresting law enforcement officer may never be made available. Given historical distrust between police and criminal defendants, skepticism about the reliability of the footage, and a lack of prosecutorial oversight mandating the disclosure of exculpatory evidence, accused civilians may never be able to utilize body-worn camera footage to prove their own claims—particularly those related to a law enforcement officer’s excessive use of force.
This false promise of accountability from body-worn cameras is additionally exacerbated by law enforcement attempts to obscure and conceal footage. A union representing 24,000 New York City police officers sued the New York City Police Department (NYPD) in 2018 to prevent the public release of body camera footage without a court order, and victims of police abuse in New York were forced to countersue the NYPD for access. According to an internal memo, in May 2020, New York City’s Civilian Complaint Review Board requested body-worn camera footage for 212 cases involving possible misconduct from the NYPD but received only 33 responses. If body-worn cameras do not limit the use of force and fail to aid accountability efforts, it is understandable that protesters may find little comfort in spotting law enforcement officers equipped with the technology. Furthermore, pairing body-worn cameras with facial recognition technology raises serious First and Fourth Amendment concerns. Citing this reason, California recently took a monumental step forward by placing a moratorium on the combined use of the two technologies.
At the federal level, the Justice in Policing Act, which was introduced in June 2020 and supported by 200 members of Congress, promises to “improve accountability and transparency of use of force by law enforcement officers, assist in responding to complaints against law enforcement officers, and improve evidence collection.” While the bill has numerous benefits, including prohibiting the problematic practice of pairing body-worn camera footage with facial recognition software, it would also increase grant funding for body-worn camera programs. Increased funding for the use of body-worn cameras fails to account for the ways in which these cameras are implemented today: not as an accountability mechanism, but as one more way to garner evidence against civilians and chill free speech. Legislators must pay more attention to the issue of whether body-worn cameras truly serve as a means of holding law enforcement accountable and whether body-worn camera programs should continue to be funded through public grants. Merely relying on assumptions about the efficacy of these devices will not serve us well. If we truly want to hold law enforcement accountable, we should turn the cameras toward them—not away.
II. License Plate Readers
Law enforcement’s use of automated license plate readers can also do a great deal of harm to protester anonymity. License plate readers photograph vehicles and automatically match the plate numbers against license records and vehicle ownership information in law enforcement databases. The COVID-19 pandemic has heightened the privacy implications and concerns regarding use of these readers, as many protesters are electing to drive during protests rather than march. Furthermore, databases for license plate records are ripe for security breaches considering that database entries are rarely, if ever, deleted and these databases continue to lack proper cybersecurity measures.
Following the killing of George Floyd by law enforcement officers in Minneapolis, Minnesota, Black Lives Matter protests spread across the United States and the world. City council members in Minneapolis responded by committing to defund the city’s police, who currently use two automated technologies: ShotSpotter—an automated listening device that can identify gunshots—and automated license plate readers. Minneapolis’s use of automated license plate readers is regulated by Minnesota statute, and the city notes that the data collected are subject to a biennial audit to ensure compliance with retention, use, and sharing requirements. Audits, however, have proven of little use to other cities, including Los Angeles. A 2020 audit of the Los Angeles Police Department found that 99.9 percent of the 320 million images stored came from vehicles not associated with a criminal investigation.
Databases of license plate images are subjected to little oversight and the information they contain can reveal incredibly sensitive details about individuals, including where they work, where they live, and what they protest. According to a 2009 report issued by the International Association of Chiefs of Police, mobile license plate reader units can read and collect the license plate numbers of vehicles parked near the staging areas for political protests. The Association notes that one significant risk of using automated license plate readers is that individuals will become more cautious in the exercise of their protected rights of expression, protest, association, and political participation because they consider themselves under constant surveillance. Automated license plate readers—potentially unfamiliar to protesters—are underregulated and unaudited, as evidenced by the LA Police Department audit. And these systems have the capacity to do incredible harm, particularly for individuals participating in driving protests during the pandemic or driving to and from a protest location.
III. Cell-Site Location Information
A cell phone’s location—whether it is stationary or moving—can be tracked through cell-site location information or global positioning system data. Law enforcement officers can use this information to determine which individuals have frequented a protest and follow that individual’s exact movements. Officers can garner cell-site location information through a warrant, buy the data from an aggregator, or collect it via a cell-site simulator. Also known as stingrays or dirtboxes, cell-site simulators prompt phones to connect to a fake cell tower, revealing their location.
The collection and use of cell-site data has more legal precedent than some of the other technologies mentioned in this Article. In 2018, the Supreme Court held in Carpenter v. United States that law enforcement officers need to obtain warrants in order to gain access to more than seven days’ worth of cell-site location information from third parties, including mobile carriers. Generally, search warrants are executed pursuant to probable cause and tied to a specific suspect or address. Yet law enforcement officers have increasingly used geofence warrants, which do not garner information on specified individuals but information on numerous persons within a specific area.
Geofence warrants mean that every protester in possession of a cell phone with enabled geolocation capabilities can be included in a mass-scale dragnet of location data and other personal identifiers (IP addresses, cell numbers, names, etc.) that can be sent to law enforcement. Protesters might not ever know that their data was collected through such a search. The potential for identifying protesters and other individuals in the general vicinity through geofence warrants subjects a large number of civilians to governmental surveillance. New York State Senator Zellnor Myrie recently proposed Senate Bill S8183, which would prohibit law enforcement searches—with or without a warrant—of “geolocation data of a group of people who are under no individual suspicion of having committed a crime, but rather are defined by having been at a given location at a given time” within the state of New York. While proposals like these would go a long way toward ensuring protester privacy from warrantless geolocation searches, this state bill is not yet a law, and there is currently no federal law prohibiting the use of cell-site location information to trace a protester’s locations.
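To illustrate the mechanics, the minimal Python sketch below (all record fields, device identifiers, and coordinates are hypothetical, not drawn from any actual warrant return) shows how a geofence request selects devices purely by place and time rather than by individualized suspicion:

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical location "sighting" record, loosely modeled on the kind
# of data a provider might return in response to a geofence warrant.
@dataclass
class Sighting:
    device_id: str
    lat: float
    lon: float
    seen_at: datetime

def geofence_matches(records, lat_min, lat_max, lon_min, lon_max, start, end):
    """Return every device seen inside the bounding box during the window.

    Note that the selection criteria are purely geographic and temporal:
    no record is filtered by who the device belongs to."""
    return sorted({
        r.device_id
        for r in records
        if lat_min <= r.lat <= lat_max
        and lon_min <= r.lon <= lon_max
        and start <= r.seen_at <= end
    })

records = [
    Sighting("device-a", 40.7128, -74.0060, datetime(2020, 6, 1, 14, 0)),
    Sighting("device-b", 40.7130, -74.0055, datetime(2020, 6, 1, 14, 5)),
    Sighting("device-c", 41.0000, -74.0060, datetime(2020, 6, 1, 14, 0)),  # outside the box
]
hits = geofence_matches(records, 40.70, 40.72, -74.01, -74.00,
                        datetime(2020, 6, 1, 13, 0), datetime(2020, 6, 1, 15, 0))
print(hits)  # prints ['device-a', 'device-b']
```

Because the filter is defined only by a bounding box and a time window, any bystander whose phone reported a location inside that box during the window is swept in alongside any actual suspect.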
Researchers have recently demonstrated the potential to discern trends in protest size using aggregate cell-site location information datasets from third-party data brokers. These data brokers can also sell their data and insights to law enforcement, immigration enforcement, and whomever else will pay. One data broker report, released in May 2020, analyzed data from 16,902 devices at protests, revealing sensitive data like the age, race, and gender of protesters. While aggregate historical location data may prove beneficial for research regarding the popularity of certain protests and more, real-time individual-level tracking of protesters’ movements is highly sensitive. An individual’s cell-site location information can reveal incredibly intimate details about them, including where they live and where they work. Furthermore, a person’s political leaning can be inferred on the basis of the rallies they attend and the protests they frequent. Whether garnered through a geofence warrant or bought from data brokers, such intimate information on protesters should not be used by law enforcement to repress protesters’ rights.
IV. Drones
Law enforcement use of drones to surveil protesters comes with its own host of ethical and privacy concerns. While drones pose unique legal questions, there are numerous similarities between drone data collection and the collection and retention of cell-site location information. In a recent case concerning the use of drones as tools for government surveillance, a Maryland district court denied the plaintiff’s request to prevent the Baltimore Police Department from implementing an aerial surveillance operation called the Aerial Investigation Research (AIR) pilot program. According to the district court, the AIR program consists of flying three aircraft operated by a private contractor over Baltimore for a period of six months. Each plane would fly approximately 12 hours a day, collectively covering “about 90 percent of the [c]ity” and “capturing about 32 square miles of the city per image every second.”
The court distinguished the city’s drone program from the cell-site location information at issue in Carpenter and determined that the cell-site data in that case “offer[s] a far more intrusive, efficient, and reliable method of tracking a person’s whereabouts than the AIR pilot program.” The court noted that while a cell phone generally relays its location data “several times a minute,” the AIR program drones cannot fly at night or in bad weather, which creates “gaps in the data [that] will prohibit the tracking of individuals over the course of multiple days.” However, the American Civil Liberties Union (ACLU) argues that Carpenter upheld a Fourth Amendment claim inclusive of gaps in cell-site location data, such as when people “turn off their cell phones, leave them at home, or travel out of their service provider’s coverage area.” The ACLU’s argument is effectively that the aerial program and cell-site location information garner large amounts of data in a continual manner, qualifying both programs as warrantless surveillance. The ACLU has appealed the district court’s decision to the U.S. Court of Appeals for the Fourth Circuit, and the court of appeals has agreed to expedite the proceedings.
In addition to aerial surveillance, law enforcement use of drones could expand into actual intervention on the ground. For example, airborne technologies could potentially be developed with the intent to control or dispel protesters—perhaps by deploying tear gas or other means of crowd suppression. In May 2020, protesters in Minneapolis saw a Predator drone deployed by Customs and Border Protection (CBP) appear in the sky above them. According to CBP, the drone was deployed “at the request of our federal law enforcement partners in Minneapolis.” Though this incident is the subject of an ongoing investigation instigated by the U.S. House of Representatives Subcommittee on National Security, it is unlikely to be the last instance in which the government employs a military-grade drone for domestic law enforcement purposes. While law enforcement officers employ Predator drones intended for military operations, civilian drone use is heavily regulated, limiting the ability of protesters to utilize the technology themselves. Drones, whether weaponized or used as a means of surveillance, should not be used to catalogue the movements and identities of protesters.
V. Facial Recognition Technologies
Facial verification and recognition technologies have become commonplace in everyday life, from unlocking mobile devices to reverse image searches online. However, a more concerning application of these technologies is the implementation of facial recognition software by state actors and law enforcement agencies to identify and characterize potential or suspected offenders. During the recent Black Lives Matter protests, various city police departments and the Federal Bureau of Investigation (FBI) have openly requested that the public share images and videos of protesters, with the intention of applying facial recognition algorithms to compare those submissions against footage from body-worn cameras and against images in various databases.
Facial recognition technologies are merely the newest form of encoding visible markers as a mechanism for classification, but the algorithms at their core remain flawed. According to a 2019 National Institute of Standards and Technology study of 189 facial recognition algorithms from 99 developers, algorithms had higher rates of false positives for female faces relative to male faces, Asian and African American faces relative to those of Caucasians, and faces of African American women overall. The concept of a false positive may not seem harmful, but when considered in legal settings, a false positive carries the potential to implicate innocent individuals, undermining the foundations of democratic society and the promise of innocence until proven otherwise—as was the recent case of a man unjustly convicted on the basis of a flawed facial recognition system. Utilizing these algorithms is yet another instance in which agents of authority believe the ends justify the means. Such an imbalanced power structure is particularly worrisome, as half of American adults are currently in a law enforcement facial recognition network.
Law enforcement use of facial recognition technology against protesters also highlights the degree of subjectivity allowed in its application, permitting officers to decide which protesters are surveilled and which are not. During the protests for Freddie Gray in 2015, the Baltimore Police Department used real-time maps of protesters’ social media logins to specifically identify Black Lives Matter protesters—and in some cases used facial recognition to identify protesters with outstanding warrants. Law enforcement can access a number of databases with large collections of facial images, including databases of mugshots and driver’s license photos. However, none are as large or as invasive as those maintained by private technology companies, particularly Clearview AI.
Clearview AI’s facial images repository holds more than 3 billion images. At least 600 different law enforcement agencies use the company’s proprietary facial recognition algorithm, which the company provides to large institutions for profit. According to Clearview AI’s website, the technology is an “after-the-fact” research tool: similar to unregulated law enforcement searches of public genealogy databases with comparison samples, officers merely upload a crime scene image and allow the algorithm to detect matches among photos that were scraped from publicly available websites, often in violation of terms of service. While Clearview AI’s collection of photos is assembled from those that are publicly available, users have not consented to their inclusion in the company’s database, a problem that is especially acute for individuals whose photos were made publicly available without their knowledge or consent.
Private companies can hold, or at least attempt to hold, Clearview AI responsible for violating terms of service with their scraping methods of collection. However, Clearview AI has thus far ignored cease and desist letters issued by these companies, arguing that it has a First Amendment right to harvest public information. Despite Clearview AI’s stance, no court has held that the First Amendment guarantees a constitutional right to repurpose publicly available data for facial recognition. Members of Congress have directly inquired about law enforcement use of Clearview AI’s services within the scope of the ongoing protests against racial injustice, but concerns about the technology persist.
Given concerns around disparate impacts, some companies have already taken the initiative to withdraw or abstain altogether from providing facial recognition services to law enforcement agencies. Amazon, whose Rekognition software faced its own public backlash for discrimination issues, is implementing a one-year moratorium on law enforcement use of its technology to give Congress time to create regulatory legislation. Others went further: Microsoft, for instance, stated that it would not sell its facial recognition technology to police until a national law “grounded in human rights” was in place, and IBM abandoned development of the technology altogether due to the risks of abuse and misuse.
While private companies’ attempts to self-regulate are admirable, in order to achieve lasting protections, the legislative branch must codify how facial recognition technology is developed, distributed, and implemented. Already, a handful of states have initiated attempts to regulate the space. Illinois, for example, passed the Biometric Information Privacy Act of 2008, which set up a comprehensive framework restricting how private entities collect, use, and share biometric information and biometric identifiers under specific security requirements. Under the Act, a private entity must follow distinct obligations for biometric information based on an individual’s biometric identifier (retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry). Based on clear notice and with a focus on individual rights, this framework could provide a model for federal law.
At present, however, there is no federal law that addresses biometric technologies, or their impact on civil rights, despite vocal calls for a ban on law enforcement use of facial recognition and state legislation attempting to address the issues posed by the technology. The Justice in Policing Act would attempt to regulate law enforcement use of facial recognition technologies on a federal level by limiting use and requiring the completion of a study examining the impact of the technology on an individual’s constitutional rights. Sponsors of the bill note that it is the first-ever comprehensive approach to hold police accountable, address the ramifications of racial profiling, change the culture of law enforcement, empower communities, and build trust between law enforcement officers and our communities by addressing systemic racism and bias. However, what is actually most needed for automated technologies is regulation and development crafted with privacy and equal accountability as inherent aspects of the process. Furthermore, while training programs and prohibitions on discrimination are admirable goals, we question whether any one regulation can solve the systemic issues associated with race-based bias, algorithmic or otherwise, while systemic cultural and societal components remain unaddressed or even unacknowledged.
VI. Counter-Surveillance Technologies Available to Protesters
Current regulations and social norms favor the use of surveillance technologies by law enforcement, while disfavoring the ability of protesters to surveil and identify police officers. As a result of this power inequity, protesters have developed inventive ways to try to protect themselves from unwanted surveillance, thwart intrusions into their privacy, and preserve their rights to protest with anonymity.
The Supreme Court has ruled that the right to anonymous free speech is protected by the First Amendment, noting that “[a]nonymity is a shield from the tyranny of the majority.” Protesters have taken principles of anonymity to heart, employing privacy-protective measures like sunglasses, face masks, and privacy-preserving makeup to thwart facial recognition in surveillance footage or social media photos and videos. These techniques can reduce the likelihood of identification, though identification may still occur, albeit at a far lower rate.
In addition to protecting their anonymity, protesters have attempted to countersurveil law enforcement, including by monitoring police scanners and recording officers. Perhaps the most instrumental tool for a protester is the smartphone. The use of individualized camera footage as “sousveillance”—or discreet monitoring of the authorities—has become a popular method of accountability for police interactions. Recently, a viral video quickly contradicted a statement issued by the Buffalo Police Department that a 75-year-old protester “was injured when he tripped and fell”; instead, protester-taken video footage clearly showed two police officers pushing the man before he fell and hit his head.
Even though it is technically legal to record law enforcement officers and officials acting in a public capacity, individuals who do so may still face consequences. For instance, law enforcement may exploit their elevated power status by intimidating protesters into surrendering their mobile devices, and the photos and videos on these devices can unfortunately be used against protesters. Content uploaded to social media may contain metadata, such as time and location information, that can help law enforcement track specific individuals and crowd movements. Live streaming can also be perilous for those who may inadvertently appear in the background of such recordings. Projects like the ACLU’s #ProtectBlackDissent hope to remedy the disparate effects of surveillance by filing Freedom of Information Act lawsuits against the FBI and campaigning for greater transparency around law enforcement surveillance of protesters. Meanwhile, messaging apps like Signal have introduced consumer features like blur tools that allow individuals to blur faces before sharing images on social media. However, the goals of such projects and features are complicated when the most invasive surveillance mechanisms possible are deployed against protesters.
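To illustrate how precise this embedded metadata can be, the short Python sketch below (with hypothetical coordinate values) converts EXIF-style GPS tags, which store latitude and longitude as degree/minute/second rationals, into the decimal coordinates that could place a phone at a specific protest:

```python
from fractions import Fraction

def exif_gps_to_decimal(dms, ref):
    """Convert an EXIF-style GPS coordinate (degrees, minutes, seconds,
    each stored as a numerator/denominator rational) to signed decimal
    degrees."""
    degrees, minutes, seconds = (float(Fraction(*r)) for r in dms)
    decimal = degrees + minutes / 60 + seconds / 3600
    # Southern and western hemispheres are negative in decimal notation.
    return -decimal if ref in ("S", "W") else decimal

# Hypothetical GPS tags of the kind a smartphone embeds in each photo.
lat = exif_gps_to_decimal([(44, 1), (58, 1), (48, 1)], "N")
lon = exif_gps_to_decimal([(93, 1), (15, 1), (0, 1)], "W")
print(round(lat, 4), round(lon, 4))  # prints 44.98 -93.25
```

Stripping these tags before posting, or disabling geotagging in the camera app, removes this trail entirely; once a tagged photo is uploaded in full, the coordinates travel with it.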
Much of the political activism today is coordinated online using various communication and social media platforms, and those that support end-to-end encrypted messaging potentially provide a level of protection against interception from law enforcement. However, some of these platforms and messaging services are tied to or maintained by providers who willingly share metadata—as well as content, in some cases—with law enforcement without legal process. Social media monitoring, including law enforcement’s searching for hashtags and events, can lead to law enforcement labeling protesters as “threat actors,” as well as to the infiltration of private social media groups and a greater law enforcement presence at demonstrations. Protesters abroad are protecting themselves against law enforcement monitoring by using Bluetooth-based messaging apps that keep data on their devices, preventing law enforcement from intercepting peer-to-peer messages. To evade location tracking, protesters can manually disable geolocation capabilities; they can also disable biometric validators so that their mobile devices remain encrypted should the devices be confiscated.
In addition to these grassroots efforts, some activists are proactively trying to level the playing field. In one instance, a group in New York City is attempting to create an archive of traffic camera footage to hold law enforcement accountable (the camera footage is traditionally provided to the public only as a real-time feed). Hacktivists, those who commandeer computer networks to spread political or social messages, have also engaged in civil disobedience online, as recently exemplified by Anonymous’ hack of Chicago police radios to play NWA’s ‘Fuck Tha Police.’ However, benevolent hacktivism is often equated with malicious hacking, which carries harsher criminal penalties under the Computer Fraud and Abuse Act compared to charges of inciting others to riot or to violate stay-at-home orders in the public sphere. Thus, even if protesters adopted law enforcement’s tactics for countersurveillance and activism, they potentially face harsh punishments related to their use of technology.
While privacy-preserving means of protesting are shared within communities and across the media, law enforcement tactics are always going to outweigh protesters’ efforts due to a single tool that has too long shielded law enforcement misconduct from reform: qualified immunity. Qualified immunity, the doctrine protecting government officials from personal liability for monetary damages arising from constitutional violations, is overly broad as applied to law enforcement overreach and use of excessive force. Qualified immunity may allow for “ignorance” of the law or “reasonable” mistakes and often requires civilians to fight against insurmountable odds to prove violations beyond dispute. States and local governments may want to follow Colorado’s lead, where, pursuant to a recently passed bill, law enforcement officers can be held financially liable if they are found to have violated an individual’s civil rights. Videos of law enforcement use of force may have alerted the media to police brutality, but accountability efforts have done little to fundamentally change how law enforcement officers interact with the public.
As digital surveillance technologies advance, so too does the risk for targeted repression. Communities of color already face higher likelihoods of poorer health outcomes and financial hardships due to the COVID-19 pandemic. Now they must incur higher risks of victimization while advocating for an end to systemic racial injustice, advocacy that has been spurred by instances of fatal law enforcement use of force against minorities. Law enforcement has a responsibility to facilitate the rights of civilians, including the First Amendment right to peaceful protest—not to quash expression of those rights. Those who protest and advocate for equality and justice should not have to risk being the victim of the extensive reach of the state while merely exercising their constitutionally protected rights.
Katelyn Ringrose, JD: Christopher Wolf Diversity Law fellow at the Future of Privacy Forum. Her publications have focused on the Fourth Amendment, genetic privacy, body-worn cameras, and more. The views herein do not necessarily reflect those of Future of Privacy Forum supporters or board members.
Divya Ramjee: doctoral candidate at American University in the Department of Justice, Law & Criminology. Her research interests include cybercrime and intellectual property crime, biotechnology, and the application of machine learning methodology in the field of criminology. This author is currently employed by the U.S. Department of Justice, and the views expressed in this article are exclusively those of the author and do not necessarily represent the views of the U.S. Department of Justice, its components, or the United States.
The authors thank the following individuals for their advice and recommendations: Imran Malek, JD; Anisha Reddy, JD; Dr. Sara Jordan, PhD; Brenda Leong, JD; Dr. Eric Bechter, PhD; and Jessica Malaty Rivera, MS. Finally, thank you to Alissa Gutierrez, JD candidate, for editorial assistance.
 Katelyn Ringrose, Law Enforcement’s Pairing of Facial Recognition Technology with Body-Worn Cameras Escalates Privacy Concerns, U. Ill. L. Rev. 1 (2019), https://illinoislawreview.org/wp-content/uploads/2019/03/Ringrose-1.pdf [https://perma.cc/DG78-NV5E] (religious and racial minorities that have been affected by government monitoring include law enforcement officials in the deep south forcing Black slaves to carry lanterns at night to prevent their escape from White-owned plantations; as well as the Federal Bureau of Investigation’s monitoring and attempted delegitimization of Reverend Martin Luther King, Jr. and other members of the Black clergy).
 Jacob J. Hutt, Is the Government Planning to Surveil Keystone XL Protesters? ACLU (Sep. 4, 2018), https://www.aclu.org/blog/free-speech/rights-protesters/government-planning-surveil-keystone-xl-protesters [https://perma.cc/B3CX-PX8J].
 Molly K. McGinley et al., The Biometric Bandwagon Rolls on: Biometric Legislation Proposed Across the United States, Nat’l L. Rev. (Mar. 25, 2019), https://www.natlawreview.com/article/biometric-bandwagon-rolls-biometric-legislation-proposed-across-united-states#:~:text=Currently%2C%20three%20states%2C%20Illinois%20%5B,effect%20on%20January%201%2C%202020 [https://perma.cc/4EX3-HGZY].
 Alvaro M. Bedoya, The Color of Surveillance, SLATE (Jan. 18, 2016), https://slate.com/technology/2016/01/what-the-fbis-surveillance-of-martin-luther-king-says-about-modern-spying.html [https://perma.cc/6DDV-BD2F] (the modern spy era cannot be discussed without understanding surveillance as a pervasive means of monitoring Black individuals and enforcing racist notions of race-based criminality: “Across our history and to this day, people of color have been the disproportionate victims of unjust surveillance; Hoover was no aberration. And while racism has played its ugly part, the justification for this monitoring was the same we hear today: national security.”); see also Lindsay Beyerstein, Tracking Blackness: A Q&A with Dark Matters Author Simone Browne, Political Research Associates (June 7, 2016), http://www.politicalresearch.org/2016/06/07/tracking-blackness-a-qa-with-dark-matters-author-simone-browne#sthash.Q72kGxUr.f0xpQXso.dpbs [https://perma.cc/73C3-VJG3] (noting that slave branding served as a precursor to biometric identifiers used today).
 Emily Hong, Hands Up, Don’t Film: Body Worn Cameras and Protests, New America (Oct. 9, 2015), https://www.newamerica.org/oti/blog/hands-up-dont-film-body-worn-cameras-and-protests/ [https://perma.cc/L4ZN-PLEZ]; Jason Leopold, Emails Show Feds Have Monitored ‘Professional Protester’ DeRay Mckesson, Vice (Aug. 11, 2015), https://www.vice.com/en_us/article/qv58n3/emails-show-feds-have-monitored-professional-protester-deray-mckesson [https://perma.cc/BND9-PQY8] (emails leaked in 2015 from the Department of Homeland Security revealed that the federal government monitored the activities of Black Lives Matter organizers).
 Overview of Body-Worn Camera Use by Law Enforcement, National Institute of Justice (Dec. 4, 2017), https://nij.ojp.gov/topics/articles/research-body-worn-cameras-and-law-enforcement [https://perma.cc/X4AA-KLG6]; Ben Miller, Just How Common Are Body Cameras in Police Departments? Government Technology (June 28, 2019), https://www.govtech.com/data/Just-How-Common-Are-Body-Cameras-in-Police-Departments.html [https://perma.cc/K44P-5LWH] (no numbers have been published for 2019 or 2020).
 Floyd v. City of New York, 959 F. Supp. 2d 668, 685 (S.D.N.Y. 2013). In Floyd, the court identified body-worn cameras as an exceptional way to prevent constitutional harms, noting that body-worn cameras “will provide a contemporaneous, objective record of stops and frisks.”
 Letter from Civil Rights Groups to the Axon AI Ethics Board 1–2 (Apr. 26, 2018), http://civilrightsdocs.info/pdf/policy/letters/2018/Axon AI Ethics Board Letter FINAL.pdf [http://perma.cc/6YJF-36EC] (noting that Axon’s body-worn camera systems, which should serve as transparency tools, are now being reduced to powerful surveillance tools concentrated in heavily policed communities).
 Peter Hermann, Police Officers with Body Cameras are as Likely to Use Force as Those Who Don’t Have Them, Wash. Post (Oct. 20, 2017), https://www.washingtonpost.com/local/public-safety/police-body-camera-study-finds-complaints-against-officers-did-not-drop/2017/10/20/4ff35838-b42f-11e7-9e58-e6288544af98_story.html [https://perma.cc/77SQ-FPNZ]; see also Barak Ariel et al., Wearing Body Cameras Increases Assaults against Officers and Does Not Reduce Police Use of Force: Results from a Global Multi-Site Experiment, 13 Eur. J. Criminol. 744, 744–755 (2016).
 Lindsey Devers, Plea and Charge Bargaining, Bureau of Just. Assistance (Jan. 24, 2011), https://bja.ojp.gov/sites/g/files/xyckuh186/files/media/document/PleaBargainingResearchSummary.pdf [https://perma.cc/NAE6-K5EP].
 See Report to the United Nations on Racial Disparities in the U.S. Criminal Justice System, The Sentencing Project (Apr. 19, 2019), https://www.sentencingproject.org/publications/un-report-on-racial-disparities/ [https://perma.cc/A6FT-8MUL] (noting that racial disparities within the criminal justice system have led to distrust between law enforcement and the communities they police. “As of 2001, one of every three black boys born in that year could expect to go to prison in his lifetime, as could one of every six Latinos—compared to one of every seventeen white boys.”).
 Body-worn camera footage can be doctored and can also give the illusion of accuracy even as it shows a skewed perspective on events. See Lily Hay Newman, Police Bodycams Can Be Hacked to Doctor Footage, Wired (Aug. 11, 2018), https://www.wired.com/story/police-body-camera-vulnerabilities/ [https://perma.cc/S979-UGZ6]; see also The Illusion of Accuracy, Upturn (Nov. 2017), https://www.upturn.org/reports/2017/the-illusion-of-accuracy/ [https://perma.cc/C8ZX-HLPB].
 See State v. Durnwald, 837 N.E.2d 1234 (Ohio Ct. App. 2005) (the Court of Appeals of Ohio stated that “this court finds it incredible that such ‘accidental’ erasures continue to occur” after body-worn camera footage was lost).
 Colleen Long, Police Union Files Suit over Release of Body Camera Footage, Associated Press (Jan. 9, 2018), https://apnews.com/b097372f125d4ebd817e7e807e57dde9 [https://perma.cc/DYP6-E77C]; Alfred Ng, Police Body Cameras at Protests Raise Privacy Concerns, CNet (June 9, 2020), https://www.cnet.com/news/police-body-cameras-at-protests-raise-privacy-concerns/ [https://perma.cc/3M5W-EMRP].
 Max Parrot, Law Firm Sues NYPD to Obtain Body Cam Footage of Deadly Maspeth Police-involved Shooting, QNS (Aug. 20, 2019), https://qns.com/story/2019/08/20/law-firm-files-sues-nypd-to-obtain-body-cam-footage-of-deadly-maspeth-police-involved-shooting/ [https://perma.cc/8XGR-9TCQ].
 Eric Umansky, The NYPD Isn’t Giving Critical Bodycam Footage to Officials Investigating Alleged Abuse, ProPublica (July 3, 2020), https://www.propublica.org/article/the-nypd-isnt-giving-critical-bodycam-footage-to-officials-investigating-alleged-abuse [https://perma.cc/54JZ-V3CP].
 Monica Nickelsburg, Seattle Will Order Police to Turn on Body Cameras During Protests Despite Privacy Concerns, GeekWire (June 8, 2020), https://www.geekwire.com/2020/seattle-will-order-police-turn-body-cameras-protests-despite-privacy-concerns/ [https://perma.cc/PSG7-SXFG].
 Matthew Guariglia, Victory! California Governor Signs A.B. 1215, EFF (Oct. 9, 2019), https://www.eff.org/deeplinks/2019/10/victory-california-governor-signs-ab-1215 [https://perma.cc/A74C-MGWT].
 Benjamin Wofford, The Genius of Protesting in Car Caravans, Washingtonian (June 1, 2020), https://www.washingtonian.com/2020/06/01/the-genius-of-protesting-in-car-caravans/ [https://perma.cc/4BTP-S7EU].
 Zack Whittaker, Police License Plate Readers are Still Exposed on the Internet, TechCrunch (Jan. 22, 2019), https://techcrunch.com/2019/01/22/police-alpr-license-plate-readers-accessible-internet/ [https://perma.cc/5E2H-GWMA].
 Demonstrations Force America to Reckon With Contentious Past, N.Y. Times (June 16, 2020), https://www.nytimes.com/2020/06/16/us/george-floyd-rayshard-brooks-protests.html [https://perma.cc/M54X-7JKD].
 For one example of law enforcement use of license plate readers, see the Minneapolis Police Conduct Oversight Commission’s Surveillance Whitepaper (Mar. 2019), http://www2.minneapolismn.gov/www/groups/public/@civilrights/documents/webcontent/wcmsp-218180.pdf [https://perma.cc/A5ZQ-NERS].
 Automated License Plate Readers, Minn. Stat. § 13.824 (2019), https://www.revisor.mn.gov/statutes/cite/13.824 [https://perma.cc/CL4U-5C83].
 Patrick McGreevy, LAPD Automatic License Plate Readers Pose a Massive Privacy Risk, Audit Says, L.A. Times (Feb. 13, 2020), https://www.latimes.com/california/story/2020-02-13/privacy-risks-automatic-license-plate-readers-lapd [https://perma.cc/G9LW-P6XD].
 Privacy Impact Assessment Report for the Utilization of License Plate Readers, Int’l Ass’n of Chiefs of Police (Sep. 2009), https://www.theiacp.org/sites/default/files/all/k-m/LPR_Privacy_Impact_Assessment.pdf [https://perma.cc/L442-G3MN].
 Cell Site Location Information, Electronic Frontier Found. (Mar. 28, 2019), https://www.eff.org/files/2019/03/28/csli_one-pager.pdf [https://perma.cc/VG35-MRYJ].
 Matthew Guariglia, How to Identify Visible (and Invisible) Surveillance at Protests, Electronic Frontier Found. (June 4, 2020), https://www.eff.org/deeplinks/2020/06/how-identify-visible-and-invisible-surveillance-protests [https://perma.cc/R4ZW-JX9H] (identifying cell-site simulators).
 See, e.g., Denise Lavoie, Geofence Warrants to be Tested in Virginia Bank Robbery Case, Associated Press (July 3, 2020), https://apnews.com/ae0dbee812feefe4f54d3539885f9f54 [https://perma.cc/9P3T-R9YD] (recent case involving a geofence warrant).
 Emily Glazer & Patience Haggin, Political Groups Track Protesters’ Cell Phone Data, Wall St. J. (June 14, 2020), https://www.wsj.com/articles/how-political-groups-are-harvesting-data-from-protesters-11592156142 [https://perma.cc/D979-473S] (where a massive dragnet of protester data was harvested by political groups).
 S.B. S8183, 2020 Leg., Reg. Sess. (N.Y. 2020), https://www.nysenate.gov/legislation/bills/2019/s8183 [https://perma.cc/CR5J-822Z].
 Anton Sobolev et al., News and Geolocated Social Media Accurately Measure Protest Size Variation, 1 Am. Pol. Sci. Rev. 1-9 (June 30, 2020) (a recent research paper noting the use of CSLI to discern crowd size for various protests, including the 2017 Women’s Marches).
 Caroline Haskins, Protesters Marching Across the Country Had No Idea That a Tech Company Was Spying on Them, Buzzfeed (June 25, 2020), https://www.buzzfeednews.com/article/carolinehaskins1/protests-tech-company-spying [https://perma.cc/63V8-D4PQ].
 Sobolev et al., supra note 37; see also Assaf Rotman & Michael Shalev, Using Location Data From Mobile Phones to Study Participation in Mass Protests, 1 Sociological Methods & Research 1-56 (2020).
 Glazer & Haggin, supra note 35; see also Joseph Cox, I Gave a Bounty Hunter $300. Then He Located Our Phone, Motherboard (Jan. 8, 2019), https://www.vice.com/en_us/article/nepxbz/i-gave-a-bounty-hunter-300-dollars-located-phone-microbilt-zumigo-tmobile [https://perma.cc/5MQE-8JP3] (recent worrisome uses of CSLI include political groups tracking individuals for marketing purposes and telecom companies selling consumer CSLI to bounty hunters).
 Israel Deploys Drones to Drop Tear Gas on Gaza Protesters, The Times of Israel (Mar. 31, 2018), https://www.timesofisrael.com/israel-deploys-drones-to-drop-tear-gas-on-gaza-protesters/ [https://perma.cc/8M2X-TKN2].
 Protecting Privacy From Aerial Surveillance: Recommendations for Government Use of Drone Aircraft, ACLU (Dec. 2011), https://www.aclu.org/files/assets/protectingprivacyfromaerialsurveillance.pdf [https://perma.cc/VG4J-UBHL].
 John D. McKinnon & Michelle Hackman, Drone Surveillance of Protests Comes Under Fire, Wall St. J. (June 10, 2020), https://www.wsj.com/articles/drone-surveillance-of-protests-comes-under-fire-11591789477 [https://perma.cc/2ERU-K94C].
 Marie Szaniszlo, Lynch, Pressley Launch Investigation into Trump Administration’s Drone Surveillance of Protesters, Boston Herald (June 9, 2020), https://www.bostonherald.com/2020/06/09/lynch-pressley-launch-investigation-into-trump-administrations-drone-surveillance-of-protesters/ [https://perma.cc/MKZ8-5WXC].
 Karl Jacoby, Why the CBP’s Presence at the DC Protests Should Alarm All of Us, Politico (June 10, 2020), https://www.politico.com/news/magazine/2020/06/10/cbp-protests-border-zone-312151 [https://perma.cc/XVH9-V7QC]; see also Caren Kaplan & Andrea Miller, Drones as “Atmospheric Policing”: From US Border Enforcement to the LAPD, 31 Pub. Culture 419 (2019) (drones are mechanisms for atmospheric policing).
 Michael S. Schmidt, Secret Service Arrests Man After Drone Flies Near White House, N.Y. Times (May 14, 2015), https://www.nytimes.com/2015/05/15/us/white-house-drone-secret-service.html [https://perma.cc/EGX8-V4VQ] (while the military and law enforcement may be able to fly drones around or near the White House, for example, individual hobbyists, journalists, and civilians are barred from doing the same and face criminal penalties).
 Rebecca Heilweil, Members of Congress Want to Know More about Law Enforcement’s Surveillance of Protesters, Recode (June 10, 2020), https://www.vox.com/recode/2020/5/29/21274828/drone-minneapolis-protests-predator-surveillance-police [https://perma.cc/DYZ2-HKU2].
 Elizabeth McClellan, Facial Recognition Technology: Balancing the Benefits and Concerns, 15 J. BUS. & TECH. L. 363, 364 (2020), https://digitalcommons.law.umaryland.edu/jbtl/vol15/iss2/7/ [https://perma.cc/LP26-TVKV].
 Salvador Hernandez, A Database of Gang Members in California Included 42 Babies, BuzzFeed News (Aug. 11, 2016), https://www.buzzfeednews.com/article/salvadorhernandez/database-of-gang-members-included-42-babies [https://perma.cc/43BC-LSSK].
 Dave Gershgorn, Facial Recognition Is Law Enforcement’s Newest Weapon Against Protesters, OneZero (June 3, 2020), https://onezero.medium.com/facial-recognition-is-law-enforcements-newest-weapon-against-protestors-c7a9760e46eb [https://perma.cc/P4GU-VUHU]; see also Queenie Wong, Police Use of Social Media is Under a Microscope Amid Protests, CNET (June 11, 2020), https://www.cnet.com/news/police-use-of-social-media-is-under-a-microscope-amid-protests/ [https://perma.cc/3VH3-T9TT].
 See generally Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code (2019) (arguing that racism, as a means of classifying and characterizing people, is a form of technology—one that is both pervasive and incredibly normalized).
 Patrick Grother et al., Face Recognition Vendor Test (FRVT), National Institute of Standards and Technology (Dec. 2019), https://nvlpubs.nist.gov/nistpubs/ir/2019/NIST.IR.8280.pdf [https://perma.cc/6586-UMD6]. These findings were based on analysis in one-to-one matching; a notable exception to the findings of bias was that some algorithms that were developed in Asian countries did not demonstrate significant rates of false positives in one-to-one matching between Asian and Caucasian faces. One posited reason for this exception is that the foreign-developed algorithm utilized more diverse training data, suggesting that such data could produce more equitable outcomes in facial recognition algorithms.
 Sidney Fussell, A Flawed Facial-Recognition System Sent This Man to Jail, Wired (June 24, 2020), https://www.wired.com/story/flawed-facial-recognition-system-sent-man-jail/ [https://perma.cc/BAK8-AT9W].
 Clare Garvie et al., The Perpetual Line-Up: Unregulated Police Face Recognition in America, Geo. L. Ctr. on Privacy & Tech. (2016), https://www.perpetuallineup.org [https://perma.cc/NSD5-38VS].
 Kevin Rector and Alison Knezevich, Maryland’s Use of Facial Recognition Software Questioned by Researchers, Civil Liberties Advocates, The Baltimore Sun (Oct. 18, 2016), https://www.baltimoresun.com/news/crime/bs-md-facial-recognition-20161017-story.html [https://perma.cc/UAW3-543A]; see also Russell Brandon, Facebook, Twitter, and Instagram Surveillance Tool Was Used to Arrest Baltimore Protestors, The Verge (Oct. 11, 2016), https://www.theverge.com/2016/10/11/13243890/facebook-twitter-instagram-police-surveillance-geofeedia-api [https://perma.cc/Y5V4-BYEF].
 Elizabeth Lopatto, Clearview AI CEO Says, “Over 2,400 Police Agencies” Use Its Facial Recognition Software, The Verge (Aug. 26, 2020), https://www.theverge.com/2020/8/26/21402978/clearview-ai-ceo-interview-2400-police-agencies-facial-recognition [https://perma.cc/9X9T-VECD].
 Kashmir Hill, Unmasking a Company That Wants to Unmask Us All, N.Y. Times (Jan. 20, 2020), https://www.nytimes.com/2020/01/20/reader-center/insider-clearview-ai.html [https://perma.cc/NK5E-SAVX].
 Kashmir Hill, Twitter Tells Facial Recognition Trailblazer to Stop Using Site’s Photos, N.Y. Times (Jan. 22, 2020), https://www.nytimes.com/2020/01/22/technology/clearview-ai-twitter-letter.html [https://perma.cc/88RM-EJNR] (companies include Twitter Inc., Facebook Inc.—including Instagram—and Alphabet Inc.’s YouTube); see also Caroline Haskins et al., Clearview AI Wants to Sell Its Facial Recognition Software to Authoritarian Regimes Around the World, Buzzfeed News (Feb. 5, 2020), https://www.buzzfeednews.com/article/carolinehaskins1/clearview-ai-facial-recognition-authoritarian-regimes-22 [https://perma.cc/MSY8-5WAA].
 Google, YouTube, Venmo and LinkedIn Send Cease-and-desist Letters to Facial Recognition App That Helps Law Enforcement, CBS News (Feb. 5, 2020), https://www.cbsnews.com/news/clearview-ai-google-youtube-send-cease-and-desist-letter-to-facial-recognition-app/ [https://perma.cc/ZU7M-EZZ3].
 Natasha Singer, Amazon is Pushing Facial Technology That a Study Says Could Be Biased, N.Y. Times (Jan. 24, 2019), https://www.nytimes.com/2019/01/24/technology/amazon-facial-technology-study.html [https://perma.cc/L54V-H8F5]; see also Emily Birnbaum, Amazon’s Ring has 29 New Police Agreements Since the Killing of George Floyd, Protocol (June 12, 2020), https://www.protocol.com/amazons-ring-police-partnerships [https://perma.cc/5LSV-NPFM].
 We Are Implementing a One-year Moratorium on Police Use of Rekognition, Amazon (June 10, 2020), https://blog.aboutamazon.com/policy/we-are-implementing-a-one-year-moratorium-on-police-use-of-rekognition [https://perma.cc/LGL4-NVVQ].
 Rebecca Heilweil, Big Tech Companies Back Away From Selling Facial Recognition to Police. That’s Progress., Vox (June 11, 2020), https://www.vox.com/recode/2020/6/10/21287194/amazon-microsoft-ibm-facial-recognition-moratorium-police [https://perma.cc/T95A-GHK6].
 Hanna Ziady, IBM is Cancelling Its Facial Recognition Programs, CNN (June 9, 2020), https://www.cnn.com/2020/06/09/tech/ibm-facial-recognition-blm/index.html [https://perma.cc/BZB7-C3FR].
 Advocacy groups that have called for a complete ban on facial recognition technologies include the ACLU, Amnesty International, the Algorithmic Justice League, the Electronic Frontier Foundation, and Fight for the Future. Amnesty International Calls for the Ban on the Use of Facial Recognition Technology for Mass Surveillance, Amnesty International (June 11, 2020), https://www.amnestyusa.org/wp-content/uploads/2020/06/061120_Public-Statement-Amnesty-International-Calls-for-Ban-on-the-Use-of-Facial-Recognition-Technology-for-Mass-Surveillance.pdf [https://perma.cc/MX7G-FJ2P]; see also Heilweil, supra note 69.
 See, e.g., John G. Browning, The Battle over Biometrics, Texas Bar (Oct. 2018), https://www.texasbar.com/AM/Template.cfm?ection=Content_Folders&ContentID=42128&Template=/CM/ContentDisplay.cfm [https://perma.cc/WP2H-K49A].
 See Fact Sheet: Justice in Policing Act of 2020, CBC, https://judiciary.house.gov/uploadedfiles/fact_sheet_justice_in_policing_act_of_2020.pdf [https://perma.cc/LR88-7WUU].
 Melissa Hellman, Special Sunglasses, License-plate Dresses: How to be Anonymous in the Age of Surveillance, Seattle Times (Jan. 12, 2020), https://www.seattletimes.com/business/technology/special-sunglasses-license-plate-dresses-juggalo-face-paint-how-to-be-anonymous-in-the-age-of-surveillance/.
 Amarjot Singh et al., Disguised Face Identification (DFI) with Facial KeyPoints Using Spatial Fusion Convolutional Network, IEEE International Conference on Computer Vision Workshops (2017), https://arxiv.org/pdf/1708.09317.pdf [https://perma.cc/V5E6-VZQ2].
 Joseph Cox, Thousands of People are Monitoring Police Scanners During the George Floyd Protests, Vice (June 1, 2020), https://www.vice.com/en_us/article/pkybn8/police-radio-scanner-apps-george-floyd-protests [https://perma.cc/B5J3-YRDH].
 Sara Morrison & Adam Clark Estes, How Protesters are Turning the Tables on Police Surveillance, Vox (June 12, 2020), https://www.vox.com/recode/2020/6/12/21284113/police-protests-surveillance-instagram-washington-dc [https://perma.cc/K5RZ-RAPB].
 Filming and Photographing the Police, ACLU (last visited Aug. 20, 2020), https://www.aclu.org/issues/free-speech/photographers-rights/filming-and-photographing-police [https://perma.cc/BQ2X-AT5J].
 See #ProtectBlackDissent: Campaign to End Surveillance of Black Activists, ACLU (last visited Aug. 20, 2020), https://www.aclu.org/issues/racial-justice/protectblackdissent-campaign-end-surveillance-black-activists [https://perma.cc/E8N3-EVD2].
 Blur Tools for Signal, Signal (June 3, 2020), https://signal.org/blog/blur-tools/ [https://perma.cc/G9XQ-6FWJ].
 Peter Aldhous, The FBI Used Its Most Advanced Spy Plane To Watch Black Lives Matter Protests, BuzzFeed News (June 20, 2020), https://www.buzzfeednews.com/article/peteraldhous/fbi-surveillance-plane-black-lives-matter-dc [https://perma.cc/H98E-57CN]; see also Roland G. Fryer, An Empirical Analysis of Racial Differences in Police Use of Force, 127 J. POL. ECON. 1210-1261 (2019) (law enforcement interactions following an identification are more likely to result in charges if the protester is a person of color—with Black and Hispanic individuals more than 50 percent more likely to experience some use of force compared to Whites).
 Lily Hay Newman, The Same Old Encryption Debate Has a New Name, Wired (Oct. 3, 2019), https://www.wired.com/story/encryption-wars-facebook-messaging [https://perma.cc/6LVT-4S3R]; see also Andy Greenberg, Signal Is Finally Bringing Its Secure Messaging to the Masses, Wired (Feb. 14, 2020), https://www.wired.com/story/signal-encrypted-messaging-features-mainstream [https://perma.cc/ZM46-LY6H].
 See Social Media Surveillance, Electronic Frontier Found. (last visited Aug. 20, 2020), https://www.eff.org/issues/social-media-surveilance [https://perma.cc/V8BW-QE22].
 Brandon E. Patterson, Black Lives Matter Organizers Labeled as “Threat Actors” by Cybersecurity Firm, Mother Jones (Aug. 3, 2015), https://www.motherjones.com/politics/2015/08/zerofox-report-baltimore-black-lives-matter/ [https://perma.cc/BK5C-Y238].
 John Koetsier, Hong Kong Protestors Using Mesh Messaging App China Can’t Block: Usage Up 3685%, Forbes (Sep. 2, 2019), https://www.forbes.com/sites/johnkoetsier/2019/09/02/hong-kong-protestors-using-mesh-messaging-app-china-cant-block-usage-up-3685/#720342de135a [https://perma.cc/23JU-F7GR].
 Lorenzo Franceschi-Bicchierai, Activists are Using Traffic Cameras to Track Police Brutality, Vice (June 15, 2020), https://www.vice.com/en_us/article/y3zp55/activists-are-using-traffic-cameras-to-track-police-brutality [https://perma.cc/PP8D-VA94].
 Will Richards, Anonymous Hack Chicago Police Radios to Play NWA’s ‘Fuck Tha Police’, NME (June 1, 2020), https://www.nme.com/news/music/anonymous-hack-chicago-police-radios-to-play-nwas-fuck-tha-police-2680017 [https://perma.cc/HM7W-SURA].
 John Schwartz, Internet Activist, a Creator of RSS, Is Dead at 26, Apparently a Suicide, N.Y. Times (Jan. 13, 2013), https://www.nytimes.com/2013/01/13/technology/aaron-swartz-internet-activist-dies-at-26.html [https://perma.cc/YBW2-BEYF].
 A San Diego woman who organized a protest could face a misdemeanor charge for allegedly encouraging others to violate stay-home orders meant to slow the spread of COVID-19. David Hernandez, ‘I was born in America with constitutional rights:’ San Diego Protest Organizer Could Face Criminal charge, San Diego Union-Tribune (Apr. 22, 2020), https://www.sandiegouniontribune.com/news/public-safety/story/2020-04-22/san-diego-submits-case-against-weekend-protester-for-possible-criminal-charge [https://perma.cc/4HY2-EPU3].
 Li Cohen, Colorado Passes Sweeping Police Reform Bill, CBS News (June 19, 2020), https://www.cbsnews.com/news/colorado-passes-sweeping-police-reform-bill/ [https://perma.cc/4JG7-RQNU].
 Xu Xu, To Repress or to Co‐opt? Authoritarian Control in the Age of Digital Surveillance, Am. J. of Pol. Sci. (Apr. 7, 2020), https://doi.org/10.1111/ajps.12514 [https://perma.cc/3LUQ-QNZR].
Recommended Citation: Katelyn Ringrose & Divya Ramjee, Watch Where You Walk: Law Enforcement Surveillance and Protester Privacy, 11 Calif. L. Rev. Online 349 (Sept. 2020), http://www.californialawreview.org/law-enforcement-surveillance-protester-privacy.