How experts are helping break the expert evidence logjam

The logjam is between the legal community's view of experts and their evidence and the view held by experts themselves – the former negative and not well founded, the latter positive and more evidence-based.  Experts are breaking the logjam by speaking out, telling it like it is, not how the court perceives it to be.

We should all read Ruth M. Corbin's excellent paper, Breaking the Expert Evidence Logjam: Experts Weigh In, and some of its 65 citations, on the disconnect between the court's perceptions of experts and the views of the experts themselves. (Ref. 1)  It's interesting, and surely embarrassing, that the court's views are not so evidence-based.

Read the themes and gaps in perception that Dr. Corbin found in a pilot study:

  • one that included 152 experts who have testified in Canada – that's a lot
  • then reflect on the cause of this disconnect – I've got my views
  • see ideas for fixing the problem that resonate with courts and experts alike, and
  • read Ruth's call for additional, more quantitative studies to firm up her findings.

I heard Ruth present her paper at the recent Expert Witness Forum East in Toronto on February 27 and have read and studied it three times since.  It's well written – no technical and legal jargon here – and a good and informative read.  You can Google the paper's title and her name.

The Paper’s Abstract

The paper’s Abstract is a good introduction (I took some liberties with Ruth’s abstract and broke it up into more paragraphs, and commented here and there.  I also tabulated the future steps):

    “Expert evidence is perceived by many as inherently suspect.  Effort worldwide is being directed to improving the process by which valid and reliable expert evidence is delivered to triers of fact.  Curiously, experts’ input on process improvements has not been solicited. (That’s quite a revelation in my view!)  One is struck by the paradox that experts continue to publicly acknowledge an expert’s duty to the court and continue to swear oaths to that effect, and courts continue to disbelieve them.  Even with the wave of new rules about expert testimony in dozens of jurisdictions, the perception of a problem has not gone away.

“A research project was carried out among a broad range of Canadian experts (and a good sample size at 152) to identify gaps, if any, between the perceptions of experts and courts….  Five compelling themes emerged from the research (and six gaps between the views of experts and legal people), highlighting ambiguities and inconsistencies in interpretation of the expert’s duty.

“The paper concludes with opportunities for next steps in three domains:

  1. Empirical research to strengthen the evidence-based foundation of future policy
  2. Economic modelling to complement the Supreme Court’s call for a “cost-benefit” analysis of expert testimony (I believe, based on my experience in Atlantic Canada, that this modelling and analysis must include an identification of principles governing the cost control of civil litigation involving experts), and
  3. Practical steps toward creating a forum for direct communication between experts and courts (The duties of the middleman in the process, the advocate, will have to be modified a little)”

Compelling Themes From Experts’ Comments

The five themes presented below are those topics most frequently identified by content analysis of the written and interview-recorded input from the 152 experts.  Content analysis is the objective categorization of descriptive text into common themes (a rough sketch of the idea follows the list below).  There’s in-depth comment on each of the themes in the paper:

  1. Duty to the court is universally acknowledged.  The concept, or even the explicit phrase, “duty to the court”, was universally acknowledged by the experts.  No one thought otherwise.
  2. Mis-perception of motives.  “It’s not about the money,” volunteered many experts.  They rejected the view that experts are motivated by money – that they’ll say whatever is needed in court to maintain a revenue stream.  The most frequent motive expressed was the interest and challenge of solving difficult problems for which their expertise was needed and valued.
  3. Mixed signals from the courts: Independence, neutrality and opining on the ultimate issue.  Duty to the court was understood to entail principles of independence, objectivity and refraining from opinion on the ultimate issue.  However, experts who looked to court decisions found these principles to be ambiguously interpreted.
  4. Risky surrogates of credibility and common sense.  Experts acknowledged that they had seen opposing experts take what they considered biased positions.  “Rogue” experts may have the charisma and comportment to have their opinions preferred by the courts, causing judges to make errors in evaluating scientific evidence based on “common sense”.
  5. Appealing alternatives to adverse testimony with cautionary words.  It was widely observed that malfeasance should not be automatically presumed when experts disagree on interpretation of the same facts:  Collegial debates are endemic to academic life and professional forums.  Consistent with that view, hot-tubbing was met with widespread support among those whose views were canvassed.
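
Content analysis of this kind is done by trained human coders, but a rough sketch of the idea – assigning free-text comments to themes based on the language they use – might look like the following.  This is only an illustrative sketch with made-up theme keywords that echo the five themes above; it is not Dr. Corbin's method:

    # Illustrative sketch only: crude keyword-based coding of experts' comments
    # into themes.  The theme names and keywords are hypothetical.
    from collections import Counter

    THEME_KEYWORDS = {
        "duty to the court": ["duty", "oath"],
        "motives": ["money", "fee", "revenue", "challenge"],
        "independence and neutrality": ["independent", "neutral", "objective"],
        "credibility and common sense": ["credibility", "charisma", "common sense"],
        "alternatives to adversarial evidence": ["hot-tub", "concurrent evidence"],
    }

    def code_comment(comment):
        """Return the themes whose keywords appear in a single comment."""
        text = comment.lower()
        return [theme for theme, words in THEME_KEYWORDS.items()
                if any(word in text for word in words)]

    def tally_themes(comments):
        """Count how often each theme is raised across all the comments."""
        counts = Counter()
        for comment in comments:
            counts.update(code_comment(comment))
        return counts

    # Example with two made-up comments:
    comments = ["It's not about the money for me",
                "I swore an oath; my duty is to the court, no one else"]
    print(tally_themes(comments).most_common())
    # [('motives', 1), ('duty to the court', 1)]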

Gaps in Perspective Between Law Professionals and Experts

Ruth’s paper tabulates the gaps between published decisions and legal commentaries on the one hand, and experts’ own views gathered in the course of her research on the other.  The gaps are identified through qualitative content analysis.  For each of the following issues, the difference between the court’s published principle or presumption and the compelling themes from the experts highlights the gap:

  1. Objective value of the expert’s evidence.  Gap: Experts are conscious of their duty of objectivity, contrary to how the courts perceive them.
  2. Independence and objectivity.  Gap: Similar to the previous issue.  Part of the problem is the court’s own confusion in interpreting these terms.  Experts know what they mean; it’s interesting that courts don’t.
  3. Assessment of an expert’s credibility.  Gap: Charisma and comportment in court are trumping scientific evidence.
  4. Common sense standard.  Gap: It plays a part in court, but what it means varies.  It should not override science, however it is defined.
  5. Motives of experts.  Gap: Experts are driven more by curiosity and problem solving than by money.
  6. Alternatives to adversarial evidence.  No gap here; a meeting of minds on reducing adversarial testimony with techniques like hot-tubbing. (Ref. 2)  This is the consensus-building north, not the adversarial south.

Summary

There is a logjam, but the experts are helping to break it with objective comment and Ruth’s evidence-based help.  The jam doesn’t reflect well on the courts’ sometimes overly subjective, confused assessment of issues, or on their susceptibility to undue influence from understandably biased players in the judicial process.

The experts are unjustifiably getting the short end of the stick in the process – perceived badly – but 152 experts can’t be wrong.  They have a near universal understanding of objectivity, independence, what it means to swear an oath, and the fact that they serve the court – no one else.  Fewer than 10% of experts have been found in rulings and commentary to be biased.  What part of this understanding doesn’t the court understand?

The court’s perception of the expert is filtered through the advocate who presents everything from the expert to reflect best on their client, as s/he must.  The opposing advocate does the same.  The big picture is confusing and messy to the judge, particularly if it’s a scientific issue, and the expert is at the centre of it.  What’s a poor judge to do?  No wonder they have a jaundiced view of the expert.

But Ruth’s research is setting the record straight with evidence-based data from experts, and hopefully more is to come from bigger, more quantitative studies.  A judge need only read, listen and learn from the objective experts because we tell it like it is.

***

(A lot of the above has been taken from Dr. Corbin’s paper as I understood it.  Her paper on the hot-tub alternative to adversarial expert evidence is also very informative.  See Ref. 2 below)

References

  1. Corbin, Ruth M., Chair, Corbin Partners Inc. and Adjunct Professor, Osgoode Hall Law School, Toronto, Breaking the Expert Evidence Logjam: Experts Weigh In, presented at Expert Witness Forum East, Toronto, February 2018 (Google it)
  2. Corbin, Ruth M., The Hot-Tub Alternative to Adversarial Expert Evidence, The Advocates’ Journal, Spring 2014. (You can Google it too)

Are experts being broadsided by bias, unbeknownst to them?

You might wonder how it is possible for experts to avoid bias when there are so many types – dozens of them.  Google “bias” and see what I mean.  I saw one link where the dozens were categorized under the letters of the alphabet.  Fortunately, it’s possible to do something about bias by learning about the key ones that can trip us up.

Speakers at the recent Expert Witness Forum East in Toronto identified categories of bias that experts must be alert to.  I was surprised by the number and the fact that some bordered on deliberate. (Ref. 1)

Experts must get familiar with the ones that can show up in our investigations and evidence.  This is a first step in rooting them out, and in not being broadsided by a peer review, a rebuttal report or a cross-examination.  It won’t matter to those few given to deliberate bias.  But it will matter to the rest of us, the great majority.

Two presentations on the first day of the Forum introduced us to implicit bias, first with a talk on Addressing Implicit Bias On and Off the Stand (Ref. 2) and then with an Interactive Session. (Ref. 3)

Dealing with implicit bias

The interactive session made it clear that in dealing with implicit bias we must:

  1. Understand implicit bias
  2. Identify implicit biases
  3. Reduce the influence of bias
  4. Mitigate for the bias of the audience

Categories of bias in expert evidence

A presentation on the second day of the Expert Forum identified eight (8) categories of bias in expert evidence: (Ref. 4)

  1. Selection bias (Hired guns)
  2. Association bias (Advocacy)
  3. Professional bias (Self-interest)
  4. Data bias (Collection/Analysis/Availability)
  5. Hindsight bias (Preventable outcome)
  6. Noble cause distortion bias (Societal good)
  7. Expectation bias (Anchoring)
  8. Confirmation bias (Tunnel vision)

The speaker then went on to focus on the last two in the list and elaborated as follows:

Expectation Bias (Anchoring)

  • The focus is on a particular observation/theory/information provided during early stages of investigation that prematurely predicts outcome and thus influences methodology and future decisions
  • Behavioral sciences show that human judgement is powerfully affected by how problems are initially framed since humans are known by nature to unconsciously anchor on details they are initially given
  • Requires additional experience beyond the first engagement with a lawyer, which can frame one’s thinking and becomes their frame of reference
  • Inexperienced experts may not recognize when “relevant facts” are in the eyes of the client or litigator.  The expert should request all facts be made available, particularly submissions from opponents

Confirmation Bias (“Tunnel Vision”)

  • A most insidious, subconscious tendency of those desiring a particular outcome to search for supporting evidence and/or ignoring or reinterpreting contradictory information
  • Often develops from Expectation Bias (Anchoring)
  • Scientists and engineers favour report findings consistent with their prior beliefs and expertise
  • Confirmation bias requires a theory, goal or outcome to generate an attraction for bias

Examples of bias in Nova Scotia

Public examples

Expectation bias and confirmation bias figured in the forensic investigation of the fatal accident of Janice Johnson in Nova Scotia that resulted in her husband, Clayton Johnson, going to jail for five years, falsely accused of murdering her. (Ref. 4)  The case was described in the presentation along with two others from elsewhere in Canada – one a murder that was disguised as a suicide and the other a car accident.  I gathered from the presentation that expectation bias and confirmation bias figured in the faulty investigation of the latter two as well.

Personal examples

I have my own examples of bias in civil litigation.  One was a visual assessment of slope stability from the comfort of the investigator’s car – disparagingly called a “drive-by” evaluation in the real estate business and against some of that industry’s regulations.  Another was a critical assessment of the soil conditions at a site without a site visit.  I peer reviewed the technical reports in both cases.

It was easy to conclude that the assessment at the first site was of the “drive-by” type because the poor construction and unstable slope were there to be seen by walking across and down the slope, particularly at the toe.  The slope was actually dangerous to walk across at the bottom.  The wording of the report exhibited Professional bias (Self-interest), with an eye to the next forensic commission.

My second site was easy too because even non-technical people know you can’t define a surface properly with two points – simple high school geometry, as noted below – and that would have been obvious with a site visit.  How do I know there wasn’t a site visit?  Because it wasn’t mentioned in the report I reviewed.  A site visit is an important task that would have been reported in detail in an expert’s report.
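
For the record, here is the geometry written out as a note of my own (it is not from the report I reviewed): two boreholes give two points, and two points only determine a line; even the simplest planar surface needs at least three non-collinear points, and a real soil or rock surface needs more.

    % Three non-collinear points P1, P2, P3 determine a unique plane:
    % the normal comes from the cross product, then the plane equation follows.
    \vec{n} = (P_2 - P_1) \times (P_3 - P_1)
    \vec{n} \cdot (X - P_1) = 0
    % With only two points, (P_2 - P_1) fixes a line; any vector perpendicular
    % to it could serve as the normal, so infinitely many planes fit the two
    % points and the "surface" is undetermined.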

***

Getting familiar with bias will increase the chances that your case – expert and civil litigation lawyer alike – is not broadsided by peer review, rebuttal or cross-examination.  You can’t guard against all the dozens of different types of bias, but you can learn about the few that creep into forensic work.

(Note: The numbered and bulleted lists in my blog were taken directly from the references)

References

  1. Jorden, Eric E., Expert witness forum looks at bias and other touchy subjects in forensic work.  Posted March 6, 2018
  2. Virji, Aly, Staff Sergeant and Moosi, S. Ali, Constable, Toronto Police Service, Addressing Implicit Bias On and Off the Stand, 3rd Annual Expert Witness Forum East, Toronto, February 27, 2018
  3. Duncan, Peter, Instructor, Toronto Police Service, Addressing Implicit Bias: Interactive Session, 3rd Annual Expert Witness Forum East, Toronto, February 27, 2018
  4. Perovic, Doug, Professor, Materials Science and Engineering, University of Toronto, Raftery, Barry, Forensic Engineer, Raftery Engineering Investigations and Lockyer, James, Lawyer, Lockyer Campbell Posner, Mock Trial, 3rd Annual Expert Witness Forum East, Toronto, February 28, 2018