Stanford, California – If future scholars of American history remember 2015 for one defining issue, it may well be the rising public uproar over ugly and often fatal encounters between police and black citizens.
The police shooting of Michael Brown in Ferguson, Missouri, along with videos of police killings in New York City, Cleveland and Chicago, ignited the Black Lives Matter movement. Equally graphic videos from Texas – of a police officer roughing up teenage girls at a pool party or of the officer who threatened to use a Taser on Sandra Bland after pulling her over for failing to signal a lane change – intensified charges that police unfairly target African Americans and other minorities.
As gripping as such incidents are, they still amount to individual anecdotes that can steer a narrative. To provide an unbiased, data-driven analysis of such issues, researchers at Stanford University's School of Engineering have launched what they call the Project on Law, Order & Algorithms.
The project is led by computational social scientist Sharad Goel, an assistant professor of management science and engineering. He also teaches a course at Stanford Engineering that explores the intersection of data science and public policy issues revolving around policing.
Among other activities, Goel's team is building a vast open database of 100 million traffic stops from cities and towns around the nation. The researchers have already gathered data on about 50 million stops from 11 states, recording basic facts about each stop – time, date and location – plus any available demographic data that do not reveal an individual's identity, such as the driver's race, sex and age.
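As a rough illustration of what working with such records might involve, the minimal Python sketch below loads and summarizes per-stop data from a CSV file. The file name and column names are assumptions made for the example, not the project's actual schema.

```python
# Minimal sketch: load per-stop records and tally stops per year by driver race.
# "traffic_stops.csv" and the column names are illustrative assumptions.
import pandas as pd

stops = pd.read_csv(
    "traffic_stops.csv",
    parse_dates=["stop_date"],
    usecols=["stop_date", "stop_time", "location",
             "driver_race", "driver_sex", "driver_age"],
)

# Aggregate: number of stops per year, broken down by driver race.
stops_per_year = (
    stops.assign(year=stops["stop_date"].dt.year)
         .groupby(["year", "driver_race"])
         .size()
         .unstack(fill_value=0)
)
print(stops_per_year)
```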
On the strength of that work, the Knight Foundation recently awarded the team a $310,000 grant to at least double the size of the database, compiling data from as many as 40 states and reaching back five to 10 years.
The ongoing project has several purposes. The first and most topical goal is to produce a statistical method to assess whether police discriminate against people on the basis of race, ethnicity, age or gender, and, if so, how frequently and under what circumstances. A second but equally important purpose is to help law enforcement agencies design practices that are more equitable and effective at reducing crime.
Ultimately, Goel and his colleagues plan to turn the know-how gained from their analysis of traffic stops into a software toolkit that others could use to acquire data from municipal or county governments and perform similar analyses. The idea is to enable other academic researchers, journalists, community groups and police departments to do the sort of data mining that today requires the expertise of experienced researchers like the members of Goel's team.
Precinct or prejudice?
The public appetite for accurate and comprehensive data has increased sharply. In the aftermath of Michael Brown's death in Ferguson, the U.S. Justice Department concluded that Ferguson's police had routinely targeted black residents and frequently violated their civil rights. African Americans made up two-thirds of Ferguson's population but accounted for 85 percent of all traffic stops, 90 percent of all tickets and 93 percent of all arrests. Statewide, a separate report by Missouri's attorney general, as described in the New York Times, found that police were 75 percent more likely to stop black drivers than white ones.
"Technically, much of this is already public data, but it's often not easily accessible, and even when the data are available, there hasn't been much analysis," Goel said.
When researchers do take a deep dive into the data, the results can be as eye-opening for police departments as they are for community groups.
In "Precinct or Prejudice," a new study of New York City's stop-and-frisk policies, Goel and two colleagues found that police were indeed stopping and searching blacks and Hispanics at disproportionate rates. Focusing on about 760,000 stops in which police officers stopped and frisked people on the suspicion of holding an illegal weapon, the researchers found that African Americans who had been stopped were significantly less likely to have a weapon than whites who had been stopped.
When the researchers analyzed the data to discover why, they found that the biggest reason for the racial disparity was that police concentrated their stop-and-frisk efforts in high-crime precincts heavily populated by minorities. Yet even after adjusting for the effects of location, they found that blacks and Hispanics were still stopped disproportionately often.
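One standard way to probe this kind of disparity is an "outcome test": compare how often frisks of each group actually turn up a weapon, then check whether the gap survives a control for where the stop took place. The sketch below, with assumed file and column names, fits a generic logistic regression with precinct fixed effects; it illustrates the technique rather than reproducing the study's exact model.

```python
# Hedged sketch of an outcome-style test: does the chance that a frisk recovers
# a weapon differ by race, before and after controlling for precinct?
# The file and column names ("found_weapon", "race", "precinct") are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

frisks = pd.read_csv("stop_and_frisk.csv")

# Unadjusted hit rates: share of frisks that recover a weapon, by race.
print(frisks.groupby("race")["found_weapon"].mean())

# Logistic regression with precinct fixed effects. A persistently lower
# recovery rate for one group, holding location constant, suggests a lower
# evidentiary bar for stopping members of that group.
model = smf.logit("found_weapon ~ C(race) + C(precinct)", data=frisks).fit()
print(model.summary())
```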
Perhaps the most important finding in "Precinct or Prejudice," however, was that New York City police could have recovered the majority of the weapons by carrying out only a tiny fraction of stop-and-frisk operations. Analyzing a very long list of factors that police officers cited as reasons for stopping and frisking people, the researchers found that only a handful had any predictive value. Seizing on hints of "furtive movement," for example, was almost useless.
In fact, the researchers concluded, if the police had conducted stop-and-frisk operations based on just three factors – a suspicious bulge, a suspicious object, and the sight or sound of criminal activity – they could have found more than half of all the weapons they did find with only 6 percent as many stops.
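To see how a handful of indicators can produce that kind of trade-off, consider the sketch below: it scores each stop with a simple logistic model built on three assumed binary features, keeps only the highest-scoring 6 percent of stops, and reports what share of the recovered weapons those stops would still have captured. The feature and column names are illustrative, not the study's.

```python
# Hedged sketch: rank stops by a simple model's predicted chance of finding a
# weapon, keep the top 6 percent, and measure the share of weapons recovered.
# Feature and column names are illustrative assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression

stops = pd.read_csv("stop_and_frisk.csv")
features = ["suspicious_bulge", "suspicious_object", "witnessed_crime"]

model = LogisticRegression().fit(stops[features], stops["found_weapon"])
stops["score"] = model.predict_proba(stops[features])[:, 1]

# Keep only the highest-scoring 6 percent of stops.
top = stops.nlargest(int(0.06 * len(stops)), "score")

share_of_weapons = top["found_weapon"].sum() / stops["found_weapon"].sum()
print(f"Top 6% of stops recover {share_of_weapons:.0%} of the weapons found")
```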
Predicting crime
Goel is keenly aware that technologies for "predictive modeling," such as using data to predict whether a person is likely to commit another violent crime, can have a chilling side. But he notes that a rigorous randomized controlled trial of a predictive tool used by Philadelphia parole authorities found that it appeared to make life easier for parolees without increasing the risk that they would violate parole again.
"There are all kinds of ways this can go wrong," Goel cautioned. "On the other hand, this can be a win-win situation. Everybody wants to reduce crime in a way that is supportive of the community. We'd like to help law enforcement agencies make better decisions – decisions that are more equitable, efficient and transparent."
Beyond building the database of traffic stops, Goel and his colleagues are using statistical tools to improve other aspects of the judicial system. In one effort, the researchers are working with the district attorney of a large city to improve pre-trial detention practices. In many cases, people arrested for minor offenses cannot afford to make bail and remain stuck in jail for weeks while they await trial.
"I've been amazed by all the interest on campus in this computational approach to criminal justice," Goel said. "In my Law, Order & Algorithms class, students from departments across the university are working together on projects that address some of the most pressing issues in the criminal justice system, from detecting discrimination to improving judicial decisions."