Policing by Numbers

A Review of Andrew Guthrie Ferguson’s The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement

Mario Diaz-Perez

Mario Diaz-Perez is a PhD candidate in the History of Consciousness Department (UCSC) and teaches in the Prison University Program at San Quentin Prison. His dissertation is titled “W.E.B. Du Bois and the Black Marxist Critique of Liberalism.”

In the late 19th century, newly centralized police departments greatly improved their capacity to respond to criminal activity in urban spaces by adopting a signal system of “call boxes.” These allowed beat cops to request support or transmit information directly to police headquarters. Wagons at the ready would transport the required resources and manpower to flare-ups of public disorder, labor strikes and violent crimes. This system, in its quaint simplicity, was a far cry from the high-tech command centers now in place in many major cities. These centers, like the Los Angeles Police Department’s Real-Time Analysis and Critical Response Division, predict criminal activity with software capable of integrating and analyzing vast pools of data. That data draws from much more than the ordinary police records that once served as the basis for forming a picture of criminal behavior. Now, crime data, social media, personal data, associational data, environmental data and a growing surveillance infrastructure are all mined for patterns and signals of expected activity. For some, the statistical correlations involved in “predictive policing” inform an efficient crime prevention strategy that is free of personal bias and therefore less likely to generate the harsh tactics typically applied by police in poor neighborhoods and communities of color. For others, racism and prosecutorial prerogative dim any hope that such a tool can be used without criminalizing a larger proportion of America’s already heavily incarcerated populations. The sheer scope of personal information potentially in the hands of law enforcement at any particular time suggests that what we are witnessing is the development of a dragnet surveillance system of as yet unknown proportions.

In The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement, Andrew Guthrie Ferguson, a professor of law at the University of the District of Columbia, critically examines the effects big data algorithms are having on the operational strategies of police departments. Often responding to budgetary and manpower restrictions as well as recurring police brutality fiascos, police departments have sought to ‘do more with less,’ pooling large swathes of statistical and surveillance information. In Chicago, Los Angeles, New York and other cities, this approach affects how departments select where, when and amongst whom to deploy police resources. Ferguson brings a formidable legal expertise to bear on these emerging trends in policing, especially where surveillance technologies impinge on the Fourth Amendment rights of American citizens. Because many CCTV and digital surveillance technologies operate at such wide scope and for such extended durations, they risk infringing the right to be free from unreasonable searches and seizures conducted without probable cause. Over the years, Ferguson has commented on the use of ‘big data’ by police departments, following the rise of companies like Palantir and PredPol, whose ‘predictive policing’ software has transformed many of the ways police departments interact with the cities under their jurisdiction.

‘Predictive targeting’ might include person-specific tools indicating who is likely to be involved in gun violence (as victim or perpetrator) or which individuals are at high risk of repeat offenses. The data informing such assessments comes not only from police records but is also cross-referenced with information from social services, employment history, family and social connections and sometimes health records. Ferguson concedes these assessments might betoken greater objectivity and a fuller, composite picture of criminal activity. Embedded within them, however, are the means for punishing individuals on the basis of the social fabric within which they live. If someone accused of a crime is unemployed, lives in a crime-ridden neighborhood and has family ties to known criminals, we learn, a judge is more likely to constrict the terms of their bail. Such assessments might also provide ready-to-hand arguments for prosecutors pushing for the harsher sentences that drive criminal cases toward plea deals – the plea-bargaining regime largely responsible for the biggest trends in mass incarceration (Pfaff 2017) [1].

Despite their technocratic veneer and ‘race-blind’ assessments, such tools often yield discriminatory results from their demographic inputs. Relying on socioeconomic variables in jurisdictions where these correlate closely with race sharpens the blunt and punitive arm of the law with a marked efficiency. No algorithm that informs a judge, prosecutor or police officer can undo the programming syntax embedded in the social structures that make certain facts count more than others.
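To make the mechanism concrete, consider a minimal sketch – in Python, with an entirely synthetic population and invented feature names, rates and weights, not any vendor’s actual model – of how a score that never sees race can still sort people by it:

```python
# A minimal, illustrative sketch of proxy discrimination: a "race-blind"
# risk score whose inputs correlate with race produces racially disparate
# outputs. All names, weights, and rates below are hypothetical.
import random

random.seed(0)

def synthetic_person(group):
    # Assumption: historically over-policed neighborhoods (group "B") show
    # higher *recorded* arrest rates and unemployment, regardless of any
    # underlying difference in behavior.
    if group == "A":
        arrest_rate = random.gauss(0.05, 0.01)
        unemployed = random.random() < 0.05
    else:
        arrest_rate = random.gauss(0.15, 0.03)
        unemployed = random.random() < 0.15
    return {"arrest_rate": arrest_rate, "unemployed": unemployed}

def race_blind_risk_score(p):
    # The score never sees race -- only socioeconomic proxies.
    return 10 * p["arrest_rate"] + (1.0 if p["unemployed"] else 0.0)

population = [("A", synthetic_person("A")) for _ in range(5000)] + \
             [("B", synthetic_person("B")) for _ in range(5000)]

for group in ("A", "B"):
    scores = [race_blind_risk_score(p) for g, p in population if g == group]
    print(group, round(sum(scores) / len(scores), 3))
# Group B's average score is systematically higher because the inputs
# encode the legacy of differential policing.
```

The point of the toy example is not the particular numbers but the structure: removing the race variable does nothing when the remaining variables were generated by racially structured institutions.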

On this issue – where the legacies of discrimination and structural racism are the inputs of algorithmically mined data – Ferguson’s analysis falls short of the critical standards he brings to bear on surveillance and Fourth Amendment issues. Although ‘race’ is included in the title of the study, its role in the optic of crime statistics and in the psychological projection of criminality is not given the critical historical reflection it deserves. In the United States, crime statistics have been largely coeval with discourses about black criminality – from one based on ideas of biological inferiority to another, largely with us today, of cultural pathology (Muhammad 2010). Any cursory scan of urban criminology since the 1960s will yield abundant evidence of a largely liberal sociological discourse on the role of chronic unemployment, the disintegration of the family unit and the social incapacitation created by rising levels of incarceration. Each of these has reciprocal effects, but how they interface with the emerging police risk-assessment tools contains clues to their continuity with past policing strategies and to what we can reasonably expect in the near future.

Thus, after considering some of the more acute conflicts of surveillance and race in a section called “Black Data,” Ferguson meekly seeks resolution through the following Panglossian injunction: “Black data policing must absorb the complexity of new technology and old practices. Police must respond by acknowledging challenges and addressing the complexities of racial bias, a lack of transparency, and legal distortions – good and bad. The black data problem can be overcome but only by confronting the complexities rather than blindly pretending to turn the page to something new” (Ferguson 2017, 142). Such is the plodding prose of most policy wonks whenever an issue of directly political and racial valence is brought to the fore. Ferguson is certainly well-intentioned and operates with a trenchant legal analytic, but legal concepts without historical content are empty. His study would have been greatly strengthened by a consideration of the historical evolution of legal regimes of incarceration and of the statistical and criminological theories that supported them.

Much of the anxiety induced by the rise of big data policing comes not so much from the analytical tools applied to police records as from the still opaque nexus through which “data brokers” and police departments consolidate information. Companies like Acxiom, Equifax, CoreLogic and many others that specialize in the aggregation and analytical targeting of data will repackage personal and consumer information for police departments and federal agencies. How this information is collected, by what algorithmic rules, and to whom it is sold remains opaque because algorithmic data collection is the proprietary domain of each company’s services and products. In The Black Box Society: The Secret Algorithms that Control Money and Information, Frank Pasquale has drawn a disconcerting picture of these “black boxes” of information and the conduits of their transmission. The ethos of data-sharing seems to apply only at the apex of corporations and state institutions, where the two appear increasingly joined at the hip. More broadly, this dynamic heralds an emergent and paradoxical neoliberal fusion of market and state. At the local and operational level of law enforcement, it is likely to blur the lines between police investigation and police intelligence – signaling a potentially larger erosion of civil liberties as they apply to due process rights and the freedom from unlawful searches and seizures (Pasquale 2015, 47).

Despite this pessimistic picture of the future, Ferguson offers a few silver linings that might point the way forward, should police departments adopt some of the demands being articulated by community oversight boards and liberal reformers. Many of these come down to steering police departments toward handling criminal problems with the resources normally associated with public health policy. For example, the Chicago Police Department has maintained a data-driven “heat list,” which generates a rank order of individuals at risk of involvement in gun violence, either as victim or perpetrator, according to established associational data and trends of vendetta violence in South Side neighborhoods. Police send letters and, with social workers and community members, make visits to the homes of these individuals to deter them from violence. The program, largely unsuccessful in lowering the murder rate, nevertheless points to other possible trends in dispatching social services alongside police for crimes short of capital offenses. Ultimately, big data can point to where social services are needed (mental health, health care, and social provision) and could help prevent suicides, drug overdoses and violence.
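The actual algorithm behind Chicago’s list is proprietary, so the following Python sketch is only a guess at the *kind* of associational ranking involved – every name, edge and weight here is hypothetical:

```python
# A hypothetical sketch of an associational "heat list": risk is
# approximated by how connected a person is, in a co-arrest network,
# to prior shooting incidents. Not the CPD's actual (proprietary) model.
from collections import defaultdict

# Hypothetical co-arrest edges and prior shooting involvement.
co_arrests = [("ana", "ben"), ("ben", "carl"), ("carl", "dee"), ("ben", "dee")]
shooting_involved = {"carl"}

neighbors = defaultdict(set)
for a, b in co_arrests:
    neighbors[a].add(b)
    neighbors[b].add(a)

def heat_score(person):
    # One point per co-arrest tie, three more per tie to a shooting case.
    ties = neighbors[person]
    return len(ties) + 3 * sum(1 for n in ties if n in shooting_involved)

for person in sorted(neighbors, key=heat_score, reverse=True):
    print(person, heat_score(person))
# "ben" and "dee" outrank "carl" -- the only one actually involved in a
# shooting -- purely through association, which is exactly the
# civil-liberties worry the review raises.
```

Notice that such a ranking can serve either a knock on the door from a social worker or a pretext for surveillance; the data structure itself is indifferent to which.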

Another area to which Ferguson hopes to draw attention is what he terms “blue data,” or statistics about police operations. Around the time of the first Black Lives Matter protests, then Attorney General Eric Holder complained: “The troubling reality is that we lack the ability right now to comprehensively track the number of incidents of either uses of force directed at police officers or uses of force by police…This strikes many – including me – as unacceptable.” Ferguson shows that big data has revealed patterns of police brutality that are quite obvious yet remain largely unaddressed. For one, a relatively small number of police officers account for a disproportionate share of ‘official misconduct’ in the use of force. Additionally, the number of stressful cases and incidents an officer is involved in is directly related to the likelihood that they will use force against civilians. Big data policing thus enhances the possibilities for monitoring police and pressuring departments to adopt new standards regarding the use of force. Fraternal orders of police officers, organizations whose very names betray their founding anti-union sentiments, largely oppose the use of such metrics to ‘police the police.’
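One simple form such a ‘blue data’ analysis might take – sketched here in Python with synthetic complaint counts and an assumed heavy-tailed distribution, not real departmental records – is to measure how concentrated use-of-force complaints are among officers:

```python
# A minimal sketch of one "blue data" question: do use-of-force complaints
# concentrate among a small share of officers? The data is synthetic and
# the distribution is an assumption made for illustration only.
import random

random.seed(1)

# Assumption: most officers generate few complaints; a small tail generates many.
complaints = [int(random.paretovariate(1.5)) - 1 for _ in range(1000)]

complaints.sort(reverse=True)
top_decile = complaints[: len(complaints) // 10]
share = sum(top_decile) / max(sum(complaints), 1)
print(f"Top 10% of officers account for {share:.0%} of complaints")
# A large share would mean oversight can focus on a small, identifiable
# group rather than treating misconduct as uniformly distributed.
```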

Whether this emergent big data intelligence is ever used for purposes other than incarcerating people depends on movements pushing for the democratization of police departments. Whether community-based advisory boards working with police departments can operate with the savviness and acumen to avoid the pitfalls of ‘big data’ policing remains to be seen – though many seem to have learned the hard lessons of the eighties and nineties regarding their own involvement in the harsher sentencing of the ‘war on drugs.’ Absent anything else, such community involvement remains a better option than anything being offered from on high. Quantification and big data can be a powerful tool for law enforcement to create distance from the communities they police – a means of rendering lives and social worlds into a pseudo-objectivity of flows, spatial distributions and networks. Yesterday, the numbers that legitimated the police were arrests and convictions. Today, in an era of comparatively low levels of violent crime, the public clamors for fewer murders by police officers and relief from the social incapacitation brought on by the stigma of incarceration. Which numbers count will point the way forward for any hope of reform.

Notes

[1] Pfaff’s book can be seen as an extended corrective to what he calls ‘the Standard Story’ – that the major driver of mass incarceration has been the ‘war on drugs.’ This narrative, in its partiality, draws well-deserved ire upon the most overtly racist prosecutorial regimes in drug cases without tarrying with the less palatable reality that a larger driver of prison population dynamics involves crimes with victims. His book is a reckoning with the role that state prosecutors have played in consistently pushing for the harshest sentencing in property and violent crimes. How we punish the perpetrators of crimes with victims in an age of relatively low crime rates is one of the most important questions for the prison-abolition movement.

References

Ferguson, Andrew G. 2017. The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement. New York: New York University Press.

Muhammad, Khalil Gibran. 2010. The Condemnation of Blackness: Race, Crime, and the Making of Modern Urban America. Cambridge, MA: Harvard University Press.

Pasquale, Frank. 2015. The Black Box Society: The Secret Algorithms that Control Money and Information. Cambridge, MA: Harvard University Press.

Pfaff, John F. 2017. Locked In: The True Causes of Mass Incarceration and How to Achieve Real Reform. New York: Basic Books.
