In her article published on Phys.org, Mary Cirincione raises timely questions about the intersection of increasingly complex technology and the practice of justice.
Now that DNA tests have become a staple of popular TV shows like Law and Order and CSI, Cirincione asks whether forensic science is really as clear-cut and infallible as viewers might come away thinking it is.
“These samples, known as low template or low copy number DNA, often degrade in quality once they’re replicated for testing. Mixed sample DNA presents similar problems because it contains genetic material from two or more people, and each must be isolated before being matched.”
But what about touch DNA, or tiny amounts of genetic material left on an object such as a glass or ballpoint pen? According to Cirincione, while this type of evidence, which sometimes consists of just a few cells, can be replicated and tested, finding a conclusive match is another story altogether.
A number of trials in which DNA was used as primary evidence – such as the case of Amanda Knox, who was twice tried and subsequently exonerated amid disagreements over the quality of the evidence – highlight the complex nature of using human genetic residue in court.
“This is something we are trying to figure out ourselves,” said Paul Cates, Communications Director for the Innocence Project, a non-profit legal group that relies on DNA evidence to exonerate wrongfully convicted people. “We realize that there’s a lot of discussion about (low template number DNA) in the scientific community and we are doing our own research to figure out where we are on this.”
By analysing high template DNA – a single strand of hair, a mouth swab or a small amount of blood – a qualified lab technician can produce results that are definitive and very difficult to challenge. Relying on the less robust types of genetic material described above, on the other hand, can lead to false convictions and wrongful acquittals.
Working with low template DNA, analysts are forced to make educated guesses that are far from conclusive. “How can we be assured that we’re making the right call? Procedures for analysis and interpretation are not black and white,” commented Ohio Northern University forensic expert Brian Meehan.
A traditional test produces a chart of high peaks and low valleys that reflect how closely a sample matches an individual’s genetic markers, and it is up to the analyst to set an “analytical threshold” – drawing a line across those peaks and valleys to separate “good data” from “bad data”.
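The thresholding step described above can be sketched in a few lines of code. This is a minimal illustration under simplifying assumptions, not forensic software: the locus names are real STR markers, but the peak heights and the threshold value are hypothetical, and real interpretation involves far more than a single cut-off.

```python
# Illustrative sketch: applying an "analytical threshold" to peak data.
# Heights are in relative fluorescence units (RFU); the 50 RFU cut-off
# is an assumed example value, not a recommended standard.

ANALYTICAL_THRESHOLD = 50

def filter_peaks(peaks, threshold=ANALYTICAL_THRESHOLD):
    """Keep only peaks whose height clears the analytical threshold
    ("good data"); everything below the line is discarded ("bad data")."""
    return {locus: height for locus, height in peaks.items()
            if height >= threshold}

# Hypothetical electropherogram readings for a handful of loci
sample = {"D3S1358": 310, "vWA": 42, "FGA": 180, "TH01": 15}
usable = filter_peaks(sample)
```

Here `usable` would retain only D3S1358 and FGA, discarding the two low peaks – which shows why the choice of threshold, a human judgment call, directly decides what the analyst ever gets to interpret.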
Cirincione claims that while new technology may help standardise these testing procedures, eliminating much of the guesswork through the use of computers, the results are unlikely to be any less open to controversy: how, for instance, can a defence attorney cross-examine a software system?
Hard questions like this notwithstanding, computerised DNA interpretation systems are already gaining acceptance. As science reaches for higher ground, old problems are likely to fade in relevance while new ones creep out of the woodwork – the expectation of a linear progression towards an objectively “better” future has been shown time and again to lack any convincing basis in reality.
Given that problem solving has been a core feature of humanity from its inception, however, grappling with new issues is something we cannot avoid. Only time will tell what all of this will mean for the practice of justice in the near future.