One night in early September 2009, two fuel trucks stolen from NATO forces by the Taliban got stuck on a river bank in northern Afghanistan.
Scores of locals began arriving at the scene, siphoning off fuel. But soon after, an American fighter jet bombed the tankers, responding to a call from German coalition forces.
Up to 142 people died in the attack, most of them civilians.
A subsequent investigation ultimately cleared the German military officers who had called in the air strike of any wrongdoing. The German Federal Prosecutor found that neither of them was aware of the presence of the civilians.
Under international military law, judgments about whether military commanders have acted proportionately can only consider what information was available to them at the time the decision was made. The actual outcome of their decisions cannot be brought into consideration.
It’s an important distinction, because outcomes can be influenced by random events and other factors outside a military decision-maker’s control.
Making that distinction is also, according to Melbourne Law School academic Dr Inbar Levy, easier in theory than in practice.
“When outcome bias occurs, people tend to say that a behaviour was wrong or unlawful if the outcome was negative, and vice versa,” Dr Levy explains.
“This is a bias, because a behaviour in this context should be judged according to an objective standard and not according to the result, because the result might be random, or due to luck.”
Dr Levy teamed up with Professor Tomer Broude of the Hebrew University of Jerusalem to research outcome or hindsight bias as it relates to judging whether military decision-making is proportionate or legal.
The case of the Kunduz airstrike on the stolen trucks is the inspiration for one of three hypothetical situations the researchers gave survey participants.
In each case, they told some participants that the outcome was positive — that no civilians died — while others were told that the outcome was negative — that civilians did die.
In other words, participants were given the same set of facts, but two drastically different outcomes. If participants responded the way international military law wants them to, then the outcome would theoretically not influence their judgement.
It didn’t quite work out like that.
“We found bias across all groups,” says Dr Levy. “But that’s not surprising, right? Because everyone is human.
“Most of our participants were affected by the results to some degree, meaning that if the result was negative, they said the decision to attack was unlawful, and if the result was positive, they said it was lawful.”
Where things got interesting was when the researchers were figuring out who was least susceptible to bias.
They divided their participants into four groups: students in Israel with field combat experience, students in Israel without field experience, students in Melbourne and Australian Defence Force (ADF) officers.
They found that the ADF officers did better at not judging the situation according to its outcome, but that students in Israel with field combat experience were the least affected by outcome bias.
“What it shows, basically, is that experience matters, and that we can perhaps develop expertise and help decision-makers make better decisions. It’s not random. People with relevant experience are able to handle their biases better, sometimes.”
The result chimes with a general concept in psychology that if someone is an expert in a certain field, they’re less likely to be affected by biases, simply because they’ve seen an event or scenario many times before.
“The most obvious example that is always given for expertise is that of chess players,” Dr Levy explains.
“They can memorise 10,000 to 20,000 positions on a board. They don’t use slow calculations for each move but, instead, they just use their intuition to make quick decisions. It’s a bit surprising because one might think that intuition leads to a bad result because it’s not deliberate or logical. But actually, if you’re an expert, your intuition is informative and relevant.”
This study is unusual in that active military officers took part.
“It’s because, in Israel, everybody must go through the army (with some exceptions), and here the ADF was very open to collaboration. They were quite transparent about the decision-making process.”
So, where to from here?
Dr Levy says it would be wrong to simply suggest experienced officers are better adjudicators of decisions and leave it at that — after all, overconfidence bias can be just as strong in decision-makers as outcome bias.
“I think there’s an opportunity here for adjudicators to develop, and be better, which might lead to more just results. But it’s complicated. Another study shows that if people know they’ll be judged in hindsight, it will cause them to behave in safer ways.
“You’re less likely to take risks if you know you’re going to be judged after the fact. That might be problematic if it means a military commander is afraid to make decisions that are riskier.”
One solution might be to give decision-makers, such as those within the Inspector-General of the ADF who review these sorts of cases, greater exposure to both positive and negative outcomes.
“Being in the field tended to be the variable that had the most effect. Maybe we can sit down with officers and think about ways to make them feel what it’s like in the field. There are all kinds of suggestions around psychology and virtual reality, for example.”
It’s a complicated field of law, but Dr Levy says that’s what makes it interesting.
“It’s a big question in public international law currently: when is a decision to attack lawful or not? It’s only becoming more relevant as warfare becomes asymmetrical.
“It’s not an army versus an army, and the combatants are sometimes unidentified. That raises more difficult questions.”
It also raises questions for everybody dealing with a world where chance plays more of a role in determining events than we are perhaps comfortable with.
After all, it can be “very tempting” to judge decisions based on their outcomes, Dr Levy says, and come up with comforting reasons as to why bad things sometimes happen.
“It’s very hard for us to conceive of a world where events are unpredictable.”