Chances Are
by Michael Kaplan
From right to left, the hypotheses—the “to be proved's”—range from those favoring the prosecutor to those favoring the defendant; each stands at the head of a chain of evidence, direct or circumstantial. The symbols at the nodes in the chain plot the 14 different types of fact—including facts as alleged, as seen by the court, and as generally accepted. Each fact can be marked with its inherent credibility, from “provisional credit” through belief and doubt to strong disbelief. Explanatory or corroborative evidence stands next to its relevant fact, drawing off or adding on credibility. Hearsay, contradiction, and confused testimony are neither resolved nor excluded but given their weight and allowed to make their contribution to the “net persuasive effect of a mixed mass of data.”
The ingenuity of Wigmore's chart is twofold: first, it keeps the whole case under the eye at once. We do not have to run back in search of discarded evidence. Second, it preserves what has been well described as the “granularity” of fact. Our duty, as judge or juror, is to postpone judgment until we hear all—but that is almost impossible; once we hear what we consider the clinching fact, we tend to measure all further facts against it, rather than weighing all together. The chart dissects and pins out evidence so that we can judge the local relevance and credibility of each fact—including the apparent clincher—before we move on to the case as a whole. As Wigmore said, it doesn't tell us what our belief *ought* to be; it tells us what our belief *is* and how we reached it.
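A chart of this kind can be sketched as a small data structure. What follows is a minimal, invented illustration, not Wigmore's own notation (his fourteen symbol types are not reproduced): each fact carries a local credibility, attached evidence nudges that credibility up or down, and the chain beneath a hypothesis is combined into a single net persuasive effect.

```python
# Illustrative sketch (not Wigmore's notation): a chart as a tree of
# facts, each with a local credibility that corroborating or
# explanatory evidence can raise or lower before the chain is combined.

class Fact:
    def __init__(self, claim, credibility):
        self.claim = claim              # the fact as alleged
        self.credibility = credibility  # 0.0 (disbelief) .. 1.0 (belief)
        self.supports = []              # evidence bearing on this fact

    def attach(self, fact, weight):
        """Attach corroborative (weight > 0) or explanatory-away
        (weight < 0) evidence, nudging this fact's local credibility."""
        self.supports.append(fact)
        self.credibility = min(1.0, max(0.0, self.credibility + weight))

    def net_effect(self):
        """'Net persuasive effect': this fact's credibility scaled by
        the average effect of the chain of evidence beneath it."""
        if not self.supports:
            return self.credibility
        chain = sum(f.net_effect() for f in self.supports) / len(self.supports)
        return self.credibility * chain

# A toy chain: an alleged fact given provisional credit, plus one
# corroborating witness.
alibi = Fact("defendant was elsewhere", 0.5)
witness = Fact("neighbour saw him at home", 0.8)
alibi.attach(witness, +0.2)
print(round(alibi.net_effect(), 2))  # 0.56
```

The point mirrors the text: each fact is judged locally first, and only then contributes to the whole, so no single "clinching fact" silently reweights everything else.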
No one, though, seemed willing to learn the complicated symbols, and Wigmore's *novum organum* fell lifeless from the press. Insomniac lawyers read it, legal scholars admired it, but it never revolutionized the analysis of evidence. Northwestern's course in Wigmore charting declined, once the Dean had retired, from required to elective—and then to the hazy limbo of summer school. The parallel with Bacon was closer than Wigmore had thought: it would take sixty years for his ideas, too, to be generally accepted.
Once to every man and nation comes the moment to decide, In the strife of truth with falsehood, for the good or evil side.
 
Well, actually, it is somewhat *more* than once: in even the simplest Wigmore chart, we can expect something like 2^n moments to decide for *n* pieces of evidence. It doesn't scan as well, though; nor does it make it easy for us to maintain an open mind through the journey across that jungle of choices. Computers, however, have no such difficulty: they happily gorge on data; they willingly cross-reference facts; they fastidiously maintain the essential distinction between what is known and the weight attached to it; and they can perform Bayesian calculations perfectly. These abilities have brought computers to the doors of the courtroom and created a new form of investigative agent: the forensic programmer.
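The Bayesian calculation the passage credits computers with is, at its core, bookkeeping: multiply the prior odds on a hypothesis by one likelihood ratio per piece of evidence. A minimal sketch, with all numbers invented for illustration:

```python
# Minimal sketch of sequential Bayesian updating: posterior odds =
# prior odds x product of likelihood ratios, with the evidence assumed
# conditionally independent given the hypothesis. Numbers are invented.

def update_odds(prior_odds, likelihood_ratios):
    """Multiply prior odds by each piece of evidence's likelihood ratio."""
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds

def odds_to_prob(odds):
    """Convert odds back to a probability."""
    return odds / (1.0 + odds)

# Three pieces of evidence: one strongly incriminating (LR 20), one
# weakly so (LR 3), one mildly exculpatory (LR 0.5).
posterior = update_odds(prior_odds=0.1, likelihood_ratios=[20.0, 3.0, 0.5])
print(round(odds_to_prob(posterior), 2))  # 0.75
```

This is exactly the distinction the text draws: the facts (the likelihood ratios) are kept separate from the weight attached to them (the running odds), and the machine combines them without fatigue or anchoring.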
Patrick Ball, late of the American Association for the Advancement of Science, is essentially a geek for justice. He and his colleagues across the world have collected, organized, and statistically interpreted the evidence of some of the gravest human rights violations of the past twenty years, from South Africa to Guatemala, from Indonesia to Chad. Their results have supported the work of truth commissions and prosecutions alike: Ball himself appeared as a witness in the trial of Slobodan Milosevic.
Their task is to distill fact from anecdote, building from many sources a reliable record of exactly who did what, when, to whom. In El Salvador, for instance, careful cross-referencing of a mass of data was able to pinpoint the individual army officers most responsible for atrocities. Forced into retirement after the end of the civil war, some of the officers brought suit to overturn the statistical conclusions. As Ball told the story to a convention of hacker-activists: “So we went into court with what lawyers go into court with. That is, dozens of cases on paper. But we also went in with diskettes with my code, and we gave it to the judge and said, ‘Here's how it was done.' . . . They backed off and withdrew their suits. . . . The reason it worked? Big data.”
Big data and Bayesian correlations are also helping the police with their inquiries. Kim Rossmo was a beat constable in the rougher parts of Vancouver. He could feel the subtle sensory shifts that mark one block as different from another: the unseen borders between this community, this power center, and the next. He drew the obvious, literary parallel between the city and a predator's environment, but took it further: the repeat criminal is not just a raptor or stalker, moving at will through a mass of docile herbivores. He (for it is mostly “he”) also has the *constraints* of the predator: the desire to hunt with least effort and most certainty, the habitual association with a few places, the established cat-paths between them. Studying for his Ph.D. at Simon Fraser University, Rossmo took these ideas and began developing a set of algorithms to deduce the criminal's home territory from the geographical distribution of crime scenes:
Criminals will tend to commit their crimes fairly close to where they live. Now, there are variations; older offenders will travel further than younger offenders, bank robbers will travel further than burglars, whites will travel further than blacks . . . but the point is that the same patterns of behavior that McDonald's will study when they're trying to determine where to place a new restaurant, or that a government may look at in terms of the optimal location for a new hospital or fire station, also apply to criminals.
 
There is one important difference, though, and that's what we call a “buffer zone”; if you get too close to the offender's home the probability of criminal activity goes down. And so at some point where the desire for anonymity and the desire to operate in one's comfort zone balance, that's your area of peak probability.
 
Rossmo went on to be head of research for the Police Foundation in Washington, D.C. His geographical profiling software is being used by police forces in several countries to deduce patterns from the scatter of facts across a map, correlating geographical data with the results from other investigative techniques. Its success depends on the knowledge that behavior is never truly random. Crime may be unpredictable, but criminals are not.
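The balance Rossmo describes, distance decay pulled against a buffer zone, can be sketched as a scoring function over a map. This is a simplified stand-in for his criminal geographic targeting approach, not his calibrated algorithm; the exponents, buffer radius, and crime coordinates below are all invented.

```python
# Simplified sketch of distance-decay scoring with a buffer zone, in
# the spirit of Rossmo's criminal geographic targeting. Each candidate
# anchor point is scored against every crime site: probability falls
# off with distance, but is also suppressed close to home. The
# exponents and buffer radius here are invented, not calibrated values.

def manhattan(a, b):
    """Street-grid distance between two points."""
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def score(point, crime_sites, buffer=2.0, f=1.2, g=1.2):
    total = 0.0
    for site in crime_sites:
        d = manhattan(point, site)
        if d > buffer:
            total += 1.0 / d ** f                      # ordinary decay
        else:
            # Inside the buffer zone, probability is suppressed and
            # rises back toward the buffer's edge.
            total += buffer ** (g - f) / (2 * buffer - d) ** g
    return total

crimes = [(1, 4), (5, 2), (6, 6), (2, 7)]
# Score a small grid of candidate points; the peak suggests the area
# where the offender's anchor point is most probable.
grid = {(x, y): score((x, y), crimes) for x in range(8) for y in range(8)}
peak = max(grid, key=grid.get)
print(peak)
```

Note how the two desires in the quotation appear directly in the code: the decay term is the "comfort zone," and the buffer term is the "desire for anonymity"; their balance produces the area of peak probability.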
 
“You come into a room, it's full of blood; there's someone there with a knife sticking out of him. What you should *not* do is make a hypothesis. That's, I think, the greatest source of miscarriages of justice.” Jeroen Keppens unconsciously echoes the Talmud. “Instead, you look, say, at this pattern of blood on the wall; is it a drip or a spray? If you think it's a spray, what direction did it come from? You make micro-hypotheses, plausible explanations for each piece of evidence as you see it.”
Keppens is not someone with whom you would expect to discuss blood spatter: he is a soft-spoken, tentative, courteous young Dutchman, with nothing of the mean streets about him. And yet, as an artificial-intelligence expert, he is building computerized decision-support systems to help the police reason their way through the goriest cases:
If I attack you, there will be some transfer, maybe of fibers from my jumper, and there will be existing data to say how rare this fiber is, how much we would expect to be transferred, what rate it would fall off afterwards, and so on. All these generate probabilities we can assign to the fact the fibers are found on you. For other things like mixtures of body fluids the chain of inference is more complex, but they still let you build a Bayesian net, connecting possible scenarios with the evidence.
 
What we are doing is supporting *abductive* reasoning: deductive says, “I think I know what happened; is this evidence consistent?” and inductive asks, “Based on this evidence, do I think this scenario fits?”—but abductive goes further: “I have this evidence; does it suggest any plausible explanations? What other evidence would these explanations generate? What should I look for that would distinguish between these explanations?” You build out from what you see—and as forensic science becomes more complex, it's harder to be sure what the conclusion is. So putting in real numbers and doing formal inference calculations can be useful.
 
But how can a system where the probabilities are based on opinions—even those of forensic experts—be reduced to numbers? Isn't it guesswork dressed up as science? “I think scientists do understand the degree of uncertainty in any system; there might be an argument for using ‘fuzzy' sets representing words—‘very likely,' ‘quite likely,' ‘very unlikely'—rather than precise numbers. But the point is that right now, experts appear in court and use these same words *without* the calculations—it's off the top of their head—whereas Bayesian probability, even with the range of uncertainty in the data, can produce very strong likelihood ratios.”
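Keppens's point about ranges rather than precise numbers can be made concrete: if the expert's probabilities arrive as intervals (“very likely,” “very unlikely”), the likelihood ratio for the fiber evidence comes out as an interval too, and may still be strongly diagnostic. A sketch with invented figures:

```python
# Sketch of an interval-valued likelihood ratio for fiber-transfer
# evidence. The expert supplies ranges, not point values; the LR is
# then itself a range. All probabilities here are invented.

def lr_interval(p_given_attack, p_given_not):
    """Likelihood ratio P(fibers found | attack) / P(fibers | no attack),
    with both inputs given as (low, high) ranges."""
    lo = p_given_attack[0] / p_given_not[1]   # most conservative ratio
    hi = p_given_attack[1] / p_given_not[0]   # most favourable ratio
    return lo, hi

# Expert opinion: transfer "very likely" if there was an attack;
# a chance match with the background population "very unlikely".
lo, hi = lr_interval(p_given_attack=(0.6, 0.9), p_given_not=(0.01, 0.05))
print(round(lo, 1), round(hi, 1))  # 12.0 90.0
```

Even at its most conservative end, the ratio multiplies the odds on the attack hypothesis twelvefold, which is the force of the closing remark: fuzzy inputs can still yield very strong likelihood ratios.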
This is not the beginning of push-button justice, but computers can spread the instincts of the expert more widely, giving every police force the same informed sense of likelihood. Keppens says: “In a small town or remote region, the first person on the scene of a major crime, the uniformed officer, has probably never seen anything like this before. The decisions made in the first five minutes are crucial—sometimes they make the difference between it being recognized as a crime or being ignored. Those are the kinds of decisions we want to support.”
Richard Leary holds the bridge between the theoretical, academic side of decision-support systems and the real world of the investigation room, walls covered with Post-it notes and desks with cups of cold coffee. Until recently he was a senior detective in the West Midlands police force, Britain's largest outside London; and he preserves the policeman's combination of closeness and distance, speaking openly but with deliberation, taking care to enforce his credibility.
He is describing FLINTS, the computerized system he created for helping detect high-volume crimes: “It's a systematized art form, sort of a bastardization of Wigmore, DNA profiling, and a little chaos theory.” The system works by generating queries, prompting the investigator to seek out patterns or connections among the evidence held in separate databases: DNA, fingerprints, footwear, tool marks. By supporting several hypotheses and determining the evidence necessary to confirm or disprove each, it helps point out the obvious-in-retrospect but obscure links between people and crimes that may lie hidden in a mass of data. Leary says:
Abductive reasoning requires the investigator to think carefully about what he's doing with the facts. Usually, there's plenty of information—that's not what we're short of. The real questions are, “Where does this information come from, how can we use it to formulate evidence, and then how do you use evidence to formulate arguments, then take those arguments from the context of investigation to the context of the court, all while still preserving the original context?” If you're going to do that, you can't just rely on gut instinct or on flashy technology—an investigator has to think in a methodological fashion, to have a conscious, logical system of assembling information into arguments.
 
The advantage of the Wigmore approach to systematizing investigation is simple: it points out gaps in available evidence. “All too often, an investigator only seeks new evidence to try and firm up a currently favored theory, rather than to discriminate between credible alternatives. Missing evidence and intelligence are often not considered in themselves. But you're actually trying to eradicate doubt—that's the purpose of investigation—and one of the best ways to do that is to look at the data you do have and then see what's missing in the light of each hypothesis; *then* engage in searching for that new data. It's common sense, but it's not common practice.”
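Leary's prescription, to look at what is missing in the light of each hypothesis, can be sketched as a ranking: score each uncollected item of evidence by how differently the rival hypotheses predict it, and seek the most discriminating item first. The hypotheses, items, and probabilities below are all invented for illustration.

```python
# Sketch of choosing which *missing* evidence to collect: an item that
# every hypothesis predicts equally is worthless for discrimination,
# however easy it is to obtain. All figures are invented.

def discrimination(predictions):
    """Spread between the highest and lowest P(evidence | hypothesis):
    a crude measure of how well this item separates the hypotheses."""
    return max(predictions.values()) - min(predictions.values())

# P(evidence would be found | hypothesis), for evidence not yet sought.
missing = {
    "CCTV at rear entrance": {"burglar": 0.7, "insider": 0.7},  # same either way
    "forced lock marks":     {"burglar": 0.9, "insider": 0.1},  # discriminates
    "staff keycard log":     {"burglar": 0.2, "insider": 0.8},
}

best = max(missing, key=lambda item: discrimination(missing[item]))
print(best)  # forced lock marks
```

The CCTV footage, which both hypotheses predict equally, is exactly the kind of evidence that merely "firms up a currently favored theory"; the lock marks are what actually eradicate doubt.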
The reason, according to Leary, is police investigation training, which is usually based on the law rather than logic. “For example, the national senior investigating officers' course, right now, concentrates on murder. Well, I don't understand what's the special *logic* of investigating murder as opposed to the logic of burglary or the logic of illegal drug trafficking. They are assuming that investigation is driven by law rather than by science and logic; their training material is subject-specific, not about developing methods of thinking.”
Do computerized reasoning systems provide the answer? Only up to a point: “Data is collected according to a fixed objective and it's arranged and presented systematically, so there's a golden thread of logic running through it. But data is never perfect; for high-volume crimes, for example, a lot is simply not reported. And when you have these very visual ways of presenting data—crime hot spots on a map—they are very persuasive. The visual pattern can skew the analyst's judgment. You can never just abdicate human responsibility to a machine.”
