Man or Machine? Algorithms in the American Criminal Justice System

By: Shalaka Joshi

As technology has become a more integral part of our world, it has been incorporated into how government interacts with the public. Across the country, technology is being used to streamline government programs in diverse areas, including efforts to reform problematic criminal justice practices like cash bail. Cash bail is the bond system in which people who are arrested, even for low-level offenses, are jailed before trial unless they pay a fee. These fees can be essentially arbitrary and are often beyond the means of those arrested, leaving the poorest arrestees, often minorities, locked up without ever having been convicted of a crime.

As this practice has come under increasing fire, many jurisdictions across the country are moving away from cash bail and toward risk-assessment algorithms that attempt to calculate the likelihood that a given individual will commit a crime. However, it is becoming apparent that while these algorithms are intended to remove human biases, they do not actually do so. If these technologies are to be implemented fairly, the government must therefore set regulatory standards for their fairness and transparency.
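To make concrete what such a risk-assessment tool does, here is a minimal sketch in Python, assuming a logistic-regression-style model trained on historical records and bucketed into risk tiers. The features, labels, and synthetic data are invented for illustration and do not describe any specific tool actually in use.

```python
# Hypothetical sketch: a model trained on historical records outputs a
# probability of pretrial re-arrest, which is then bucketed into a risk tier.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Invented features: age at arrest, number of prior arrests, and whether the
# current charge is a felony (1) or misdemeanor (0).
n = 1000
age = rng.integers(18, 70, size=n)
priors = rng.poisson(1.5, size=n)
felony = rng.integers(0, 2, size=n)
X = np.column_stack([age, priors, felony])

# Invented label: 1 if the person was re-arrested before trial. Real systems
# learn this from historical records, which is exactly where biased policing
# data enters the model.
y = (rng.random(n) < 0.1 + 0.05 * priors + 0.1 * felony).astype(int)

model = LogisticRegression().fit(X, y)

def risk_tier(person):
    """Map the model's predicted probability onto a coarse tier."""
    p = model.predict_proba([person])[0, 1]
    if p < 0.2:
        return "low", p
    if p < 0.5:
        return "moderate", p
    return "high", p

# Score a hypothetical defendant: 25 years old, 3 prior arrests, felony charge.
print(risk_tier([25, 3, 1]))
```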

While these algorithms aim to correct the flaws of the old cash bail system, it has become clear that they carry biases of their own. In the United States, African Americans and Hispanics make up only about 32% of the population but almost 56% of all incarcerated people. Because these statistics reflect decades of police practices that included racial profiling and surveillance of minority and lower-income communities, the algorithms have been trained on data that is fundamentally flawed. In some cases, police departments have also been found to have a culture of manipulating or falsifying data under intense political pressure to bring down crime rates. The results of these algorithms are therefore shaped by the structural racism that distorts their underlying data.

Algorithms can be difficult to understand, and how the data fed into them produces the results we see often appears inexplicable. There are, however, principles that can be followed to address these biases, starting with ensuring that the data the algorithms are trained on is representative of the entire population. Transparency can also be achieved by regularly releasing information about the inputs and outputs of these algorithms so that trusted third-party organizations, such as civil rights groups, can audit them.
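As an illustration of what such a third-party audit could involve, the sketch below assumes a jurisdiction releases its inputs and outputs: it compares how often each group is flagged as high risk and checks the training data's composition against population shares. The group names, counts, and data are hypothetical.

```python
# Hypothetical audit sketch: compare flag rates across groups and check the
# training data against population shares.
from collections import Counter

def selection_rates(records):
    """records: list of (group, flagged_high_risk) pairs released by the jurisdiction."""
    flagged, totals = Counter(), Counter()
    for group, high_risk in records:
        totals[group] += 1
        flagged[group] += int(high_risk)
    return {g: flagged[g] / totals[g] for g in totals}

def representativeness(training_counts, population_shares):
    """Compare each group's share of the training data to its share of the population."""
    total = sum(training_counts.values())
    return {g: (training_counts[g] / total, population_shares.get(g, 0.0))
            for g in training_counts}

# Invented audit data for two groups.
outputs = [("group_a", True), ("group_a", False), ("group_b", True), ("group_b", True)]
print(selection_rates(outputs))
print(representativeness({"group_a": 700, "group_b": 300},
                         {"group_a": 0.6, "group_b": 0.4}))
```

A large gap in flag rates between groups, or a large mismatch between a group's share of the training data and its share of the population, would be the kind of signal an auditor or regulator could then investigate.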

Finally, a series of checks should be put in place to ensure that these life-altering decisions, especially within the criminal justice system, are made neither by human nor by formula alone, but by a combination of both. For these principles to be implemented, they must be written into regulation, as government is the only institution with the power to require adherence to these standards.
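One way to picture this combination is a workflow in which the algorithm only issues a recommendation and a human must record the final decision, with any disagreement logged for later review. The sketch below assumes such a workflow; the field names and case data are hypothetical.

```python
# Hypothetical human-in-the-loop sketch: the algorithm recommends, a judge
# decides, and overrides are preserved as an audit trail.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Decision:
    case_id: str
    algorithm_recommendation: str  # e.g. "release" or "detain"
    judge_decision: str
    judge_reason: str

@dataclass
class DecisionLog:
    entries: List[Decision] = field(default_factory=list)

    def record(self, decision: Decision) -> None:
        self.entries.append(decision)

    def overrides(self) -> List[Decision]:
        """Cases where the judge departed from the algorithm."""
        return [d for d in self.entries
                if d.judge_decision != d.algorithm_recommendation]

log = DecisionLog()
log.record(Decision("case-001", "detain", "release",
                    "stable employment, no prior failures to appear"))
print(len(log.overrides()))
```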

There is always a concern that this would simply make the government more intrusive and put more sensitive information into its hands, but giving the government the power to regulate this use of algorithms in public life will ensure some accountability. Leaving it to the private sector will hurt those directly affected by these decisions, as they will be left without any means of understanding why those decisions were made.

Of course, fixing these algorithms alone will not solve the broader societal flaws that have produced the racial and class disparities in incarceration. But ensuring that these algorithms are fair is a step in the right direction toward using government data to achieve a positive goal.

While the discussion of criminal justice reform, such as the recently passed First Step Act, has focused on the federal government, the vast majority of incarcerated people are in fact held in facilities controlled by state and local governments. It is therefore essential to advocate for the regulation of algorithms at all levels of government, and ultimately to ensure that the algorithms used in every court in the country are held to the same standards of representativeness, fairness, and transparency.