Where Leaders Fail, Part Three: Hubris
I just finished watching the excellent NOVA special “Volcano’s Deadly Warning” on PBS. It’s the story of two scientists and the consequences of their differing opinions on how best to predict when an active volcano will blow. That topic may not sound like fertile ground for leadership lessons, but the way NOVA tells the story brings out the ways that human nature can compromise any collective endeavor.
The problem with volcanoes, you see, is that we never know when they are going to erupt. You might think that it would be fairly obvious — the earth would shake, rocks would start rolling down the slope, and so forth — but that’s more Hollywood than reality. In reality, all the action that matters in a volcano happens deep underground, far from where human eyes can see — and once the earth starts shaking and the rocks start rolling, it’s too late to evacuate. This makes volcano prediction a very high-stakes game indeed. If, say, 28,000 people are sitting in the shadow of a volcano, you can’t just shove them off their land every time there’s a remote possibility of eruption; the economic dislocation would be too great. No, you have to wait until you’re sure that the volcano is about to erupt — until you’re sure it’s not a false alarm. But wait too long, and those 28,000 people are going to be buried under six feet of mud.
That’s exactly what happened on November 13, 1985, when a Colombian volcano named Nevado del Ruiz erupted. Volcanologists had known that Ruiz was an eruption risk since the year before, but nobody could say when it would erupt, so the government refused to evacuate. When Ruiz blew, the 28,000 people in the nearby town of Armero were caught in the path of the mud and magma it spewed forth. 25,000 of them died.
The tragedy of Armero led volcanologists around the world to redouble their efforts toward finding a way to predict eruptions. Two scientists led the way, each developing a methodology he believed would reach that goal. One, Dr. Stanley Williams, said that to know what was going on within the volcano, you had to go to the volcano; his method was to take measurements of gas emissions rising out of the volcanic rock, on the theory that higher emissions meant more activity and therefore a higher probability of eruption. The other, Dr. Bernard Chouet, worked a different angle: he said that you could detect an imminent eruption not by trooping around the volcano taking readings, but instead by watching the seismic activity generated by the volcano. Chouet said that, if an eruption was imminent, you’d see steadily growing numbers of “long-period events”, special seismic events indicating that channels within the volcano were throbbing harmonically as magma pulsed through them.
The next few years seemed to bear both men out: their methods appeared to work equally well at predicting eruptions. But then, in 1993, tragedy struck. Williams organized a volcanology conference on the slopes of another active volcano, also in Colombia: Galeras. Galeras had recently had a small eruption, which made it interesting from a scientific perspective. More interesting still, Galeras was where the Williams and Chouet methods diverged: Williams said that the mini-eruption had vented enough gas to calm Galeras, while Chouet warned that the long-period events were continuing to occur faster and faster, indicating another, much more significant eruption on the horizon.
The highlight of Williams’ conference was to be a trip down into the crater of Galeras, into the heart of the volcano itself. For the assembled scientists, this brought the divergence in the predictions into stark relief — who would take such a trip if the volcano were due to erupt again at any minute? Williams, however, argued that he had been into the crater himself, just before the conference started, to be sure that his gas measurements were as up-to-date as possible, and all the indications he had pointed to the volcano being quiet. (Chouet was not present to make a counter-argument; as an employee of the U.S. government, he was barred from visiting Colombia due to that country’s civil war.) So, when the time came, twelve scientists and three Colombian tourists trooped up Galeras as planned.
The result was catastrophe. Four hours after the Williams team descended into the crater, Galeras erupted spectacularly. Scientists and tourists alike ran for their lives as refrigerator-size chunks of rock were blown into the sky. Some made it back; Williams, for example, was retrieved by rescue crews, though his legs were so damaged it would be years before he would walk again. Most, however, were not so lucky; of the fifteen who climbed the mountain that day, nine never returned.
The leadership lesson here is a simple one. One of the primary dangers leaders face is hubris — an arrogant sense that only they have the answers, and that naysayers to their plans speak from ignorance. Hubris causes leaders to close themselves off from questions or criticism, which is terribly dangerous because questions and criticism can help flag incorrect assumptions early, before they cause too much damage. When hubris strikes, those assumptions go unchallenged, causing people to expend great sums of blood and treasure defending untenable positions. Hubristic leaders have caused their followers to march into disaster time and time again throughout history, each time assuring everyone in sight that they will be different, they know what they’re doing, even when all the evidence points in the other direction. Evidence? Who needs evidence when you have certainty?
Stanley Williams’ example illustrates starkly the dangers of hubris. He and his fellow scientists knew of Chouet’s predictions, and they knew that the Chouet method had a good level of accuracy. However, Williams’ data seemed to him to contradict Chouet’s conclusions. Now, given differing conclusions, each with a reasonable chance of being the right one — call it 50/50 — when lives are at stake, the logical decision would be to play it safe and stay out of the volcano. However, Williams wasn’t thinking logically; he was thinking hubristically, and to the hubristic leader the answer would be obvious: if Chouet’s method disagreed with Williams’, Chouet must be wrong. Chouet’s predictive record to that date held no weight next to Williams’ observations. In other words, Williams was willing to be convinced if the data he gathered, processed through his method, indicated danger; but what he wasn’t willing to consider was that his entire method might be wrong, that it didn’t matter what his data indicated because he just didn’t know how to read them.
Stanley Williams paid for his hubris with smashed legs and the blood of nine colleagues on his hands. Regardless of the field, this is the usual way stories of hubris end. That’s something to keep in mind next time you’re leading people, or the next time a leader looks you in the eye and asks you to trust him or her with your money, your property, or your life.