What Do Airplane Pilots and Trustless Virtual Teams Have in Common?
Trustless virtual teams can be a hard pill to swallow for dispersed organizations.
Can we really find a system that cements trust and moves it from the "disadvantages" to the "advantages" column of virtual teams in the modern workplace? Transparency and trust are critical when working in a global digital market, and, unless supported by complex legal agreements, trust is assumed and often taken for granted.
But the ties built over years of joint projects, and the comfort of the familiar that comes with them, can be a challenge for virtual teams.
When we approach a new project or person, our senses are heightened and we give each task double the attention it deserves. In companies that use virtual teams, as work becomes routine, staff vigilance drops and people become more prone to human error.
The comfort of solid, long-lasting, and successful virtual teams can increase the tendency toward slips and biases. And while an error can cost a digital project tens of thousands of dollars, a pilot's error can cost hundreds of lives.
Why Do Pilots Never Fly Together More Than Twice?
There are plenty of myths about pilots. One is that they lead a glamorous life full of exciting travel to exotic destinations with generous perks and rewards. Another is that they let the autopilot fly the plane while they sit back and do nothing. The truth is somewhat different. Pilots follow rigorous training and working schedules that often affect their circadian rhythms and personal lives.
They need to take and pass strict regular medical exams and be prepared to make decisions and give commands to a complex computer system. So, in one sense, pilots are truly charismatic superheroes. On the other hand, they are also human and make mistakes.
To reduce the odds of things going wrong, airlines have adopted the practice of not letting the same pilots fly together more than twice. It might therefore be high time to borrow the wisdom of airline professionals, whom we trust with our lives more than 100,000 times a day. How is this airline solution relevant to trustless virtual teams?
Typical Hindrances to Trust in Airline Teams
Chesley Sullenberger, the famous flying veteran who safely landed US Airways Flight 1549 in the Hudson River in January 2009, was indeed a superhero. However, he owes the happy ending of the miracle flight, crippled by a strike from a flock of Canada geese, to his team. He made wise decisions and put them into practice with the help of his copilot, flight attendants, passengers, and the New York Waterway crews. In his own words, “the successful landing was a result of good judgment, experience, skill — and the efforts of many.”
Taught by practical experience, airlines invest heavily in flight safety, and keeping two, sometimes even three, pilots in the cockpit is standard practice. The usual crew consists of a captain, who is in charge of flying the plane, which means managing the controls, steering, and programming the autopilot; and a copilot, or first officer, who acts as the monitoring pilot, double-checking everything the captain does, communicating with air traffic control, and making sure all engine parameters are fine.
Both officers know their duties well, and thanks to their mutual effort, hundreds of thousands of flights per day reach their destinations.
The hierarchy in the decision-making process is well-defined, giving precedence to the senior pilot but also leaving space for the copilot to challenge the captain’s decisions. Unfortunately, aviation history records a few occasions on which copilots failed to contradict the captain’s course of action, resulting in fatal accidents. A rigid hierarchy can often be a problem, causing inexperienced copilots to doubt their own reasoning even when the facts in front of them present solid evidence that they are right.
How Cognitive Biases Affect Pilots’ Decisions
Human cognitive biases play a major role in how people perceive a situation. There are over a hundred of them, and pilots are vulnerable to many. Common cockpit biases include the ambiguity effect, a preference for the familiar option over a new course of action whose outcome is less certain. Anchoring bias can make pilots rely heavily on the first piece of information they receive, discounting later input. Attentional tunneling can make them blind to what is happening outside a narrow focus of attention.
Confirmation bias is the tendency to believe information that supports one’s current mental model rather than information that contradicts it. Similar is selective perception bias, which makes it difficult for a person to accept and remember difficult or stressful information. Pilots can also be prone to automaticity and optimism biases, as well as to the halo effect and the false-consensus bias, which matter even more in a group setting.
Language can be an extra hindrance. Although aviation rules require full professional proficiency from pilots, not everyone speaks perfect English, and occasional misunderstandings are difficult to prevent.
Considering all these “faulty” ways of mental processing, it is clear that having two pilots in the cockpit improves passengers’ safety in a big way. Moreover, two minds that have not yet settled into a routine think more clearly than one and remain more alert and dedicated to the task at hand. That is why airline companies avoid putting the same two pilots in the cockpit more than twice.
Transparency in Trustless Virtual Teams
Can two or more people working together in virtual teams in the workplace become too comfortable and adopt mental patterns that affect their decision making as individuals, as well as in a group? Simply put, yes. Agile teams suffer from their own set of cognitive biases, such as information bias, in-group bias, and the detrimental bias blind spot, which is the inability to spot one’s own biases.
In contrast, trustless virtual teams are assembled to work on isolated, exclusive projects. Flash teams come together in companies that use virtual teams through a rapid, ad-hoc system that intelligently mixes and matches experts. The novelty, objectivity, and thoroughness such a system brings free members from the risk of making mistakes and biased decisions.
Thanks to the concept of the beginner’s mind, known in Zen Buddhism as “Shoshin”, trustless virtual teams are less likely to fall into familiarity patterns and are more likely to be open, eager and non-judgemental.
Although flash-team technology enables setting up projects with clear hierarchies, these are far from the rigid hierarchies typical of corporations and large organizations. Hierarchy in virtual teams is project- and task-based rather than role-based. Consequently, although a particular expert may hold a leader’s position in one team, they can be just a collaborator in another. Such atypical, sporadic seniority eliminates the possibility of settling into a role and becoming an easy target for one’s own human frailties.
Trust, as one of the most important challenges of virtual teams, matters, but it shouldn’t be idealized. People are inclined to err and to be biased, regardless of their professional acumen and task readiness.
In a group setting, biases and mistakes can amount to a whole new risk level. Consequently, even if people come to work with best intentions at heart, imperfection takes its toll and accidents happen.
If you need to assemble a remote virtual team and constantly worry about how trust works on each concrete project, you’ll drain your resources quickly.
One way of solving the trust and transparency problem in virtual teams is to let technology lend you a hand. With their rapid assembly, flash teams are a practical way to reduce many of these concerns and free your energy to increase the functional value of your products and services.