The earthquake that hit Haiti on 12 January 2010 unleashed almost unimaginable misery and chaos in the span of minutes. In the days and months that followed, many more suffered its grim repercussions. Along with the rest of the world, the United States was quick to respond, and soon the US military was leading the way. Within 48 hours of the earthquake, the US Southern Command had established Joint Task Force-Haiti (JTF-H), whose leadership team coordinated a vast and complex effort. For the next nineteen weeks, JTF-H led Operation Unified Response, our military’s longest and largest ever disaster relief effort on foreign shores. At its peak, 22,000 service members, 58 aircraft, and 23 ships were involved, along with a vast amount of supplies and equipment, not to mention support from many quarters.
Many were moved to contribute. Shortly after the earthquake, an MIT colleague emailed to ask if anyone would step in to help his defense-contractor contacts working with the response efforts. Hoping to contribute in some small way to the response, I volunteered—my desire to help certainly outweighed my expertise. Before explaining how much I learned about working efficiently amid uncertainty and urgency, not to mention about the almost impossibly hard work of post-disaster efforts, I’ll set the stage by describing the collaboration, its goals, and its operational methods.
In early February 2010 my small MIT “away” team started its collaboration with military and humanitarian experts working with the Joint Task Force. We aimed to contribute to a larger effort from MIT Lincoln Laboratory, and soon were working directly with them, MIT humanitarian logistics experts, military personnel, and the Haitian arm of Boston-based non-profit Partners In Health.
Our team aimed to bring varied methods, ideas, connections, and research efforts to the practical challenges facing the humanitarian response in Haiti. In February, shortfalls in communications, electricity, fuel, food, water, and other inputs persisted across the quake zone and beyond. Responders were stymied by these very shortages, yet without informed action the underlying problems would go unrelieved. A key question was: how could real-time information on current needs be collected and shared, so that the JTF-H leaders and their colleagues could supply what was most needed? Over two million Haitians had been left without shelter. Many had no phones, and there were few ways of knowing who was where. People moved from place to place as conditions changed.
Our collaboration lasted just three intense months, from February to April. It provided me with one of the most vivid learning experiences of my life. Imagining the potential cost of a misstep drove home the need for every action to be effective: what we did in a conference room in Cambridge, Massachusetts mattered, because there was no time to waste. Responding well to a large-scale disaster, I learned, poses the starkest of challenges—the situation changes constantly, stakes are high, and information is hard to come by. It’s obvious that the most crucial thing is to prioritize actions, yet the past and future must factor in at every step. To do right by the Haitian people, we needed to appreciate the pre-disaster situation and its historical context. We also needed to think ahead to what could follow the emergency response, knowing that every disaster relief operation has the potential to set the stage for subsequent recovery or instead to create new problems that would become evident only later. Combining it all in the right mix seemed, to me as a neophyte, a near-impossible task, and I quickly grew to appreciate that the people who can carry out this work warrant our gratitude and admiration. In those early weeks thousands worked heroically on the ground in Haiti.
Our MIT-based team gleaned all we could from phone calls (often rushed, interrupted, or garbled), video conferences, our own site visits to Port-au-Prince and environs, interviews with knowledgeable informants, and of course much research. We consulted a vast range of existing sources, talked to experts in disciplines from child nutrition to data mining, downloaded datasets, analyzed spreadsheets, and shared updates and work products on a private website. Our aim was to contribute to the evolving plans for collecting and making sense of the data most needed to serve the millions in need.
The specific results of the overall effort and the data collection project are documented in the US Army’s Center for Army Lessons Learned archives and in other reports, including a paper published later that year in Military Review, which present the team’s innovations in humanitarian assessment along with recommendations for future disaster response. We’ll focus here on a few things I learned as a team member: new disciplines and practices that could help all kinds of project teams.
In those early post-earthquake weeks, with our loose team of on-the-ground military personnel, defense-contractor experts, leaders of Haiti-based organizations, and personnel from multiple universities interacting around the clock, email traffic was heavy. At the suggestion of the team’s mentor, an experienced military leader, there was one main daily email. Each email included a paragraph that reminded the team of two crucial elements: the goal of the overall mission, which had been dubbed Operation Unified Response, and the specific aims of our team. These were listed crisply using simple formatting: each key idea got its own brief line and was indented with tabs to indicate where it fit in the plan. It was a quick verbal and visual guide to what we were all focusing on, and it even managed to telegraph the hierarchy of steps and results. The formatting was simple enough to work in any email reader, and the entire paragraph was short enough to be read quickly. The indentation drew attention to the causal logic behind the entire project: steps were shown clearly, then the phrase “so that…” flagged specific objectives. The language was brief, direct, and jargon-free.
This simple technique kept our mission salient. In the early weeks, the part of the paragraph presenting the team’s goals and means-ends hypotheses was refined a few times as our focus developed. Because it was easy to find at any moment, and easy to read and remember, we could actually use it to check our work; sometimes we would invoke it several times in a single day when discussing the task at hand. Were we focusing on something that would contribute to the goals? Were the results we were seeing in line with the cause-and-effect linkages laid out in the email footer? For us, one remote component of a large, fast-moving team, this method prevented wasted effort and helped us identify the most important findings to share with others.
A second technique facilitated this sharing. Every Monday evening, the Commanding General of Operation Unified Response was briefed. A short spotlight briefing would follow the main presentation, and our broader team put these briefings together every week. The aim was to use the 20 minutes to maximize their utility for the Commanding General, or CG, who made daily and weekly decisions. So we followed a standard format that I imagine is common across military settings.
The cover slide listed the date, the status (“unclassified” in our case), the title, and the names (usually at least a dozen) of the key team members and their affiliations; this made follow-up easy. Page two was the BLUF slide: Bottom Line Up Front. It listed, in a few bullet points, the conclusions that would provide the basis for the CG’s decisions. The rest of the presentation explained and provided specific evidence for the BLUF points.
Rigor and logic were the watchwords in preparing the briefing deck. If the goal was to inform the key decision-maker, every point needed to be supported with the strongest possible analysis: one that made the supporting data vivid, offered some basis for comparison or assessment (for instance, by mapping trends over time or comparing camps), and cogently accounted for limitations and open issues. Graphs, schematics, photographs, and quotes all backed up the specific points, making for as well-rounded a presentation as possible.
Knowing that we needed to create actionable points for the BLUF slide gave the team focus for the entire week. For one thing, we were all motivated to show week-by-week progress. For another, discovering something that would not help the CG make decisions in the coming week was of no value. Whether we were reading books on the history of Haiti, digging into our datasets, or working out how to plan for latrines, this requirement gave us a sharp focus: no small feat for academics!
A second aspect of the BLUF page fascinated me. The rule was that the CG could call a stop to the presentation once that second page was shown. He could do so for any of three reasons: something more urgent demanded attention; the key points were already accepted (perhaps because they were obvious?) and there was no need to delve into the background then and there; or the points were sufficiently irrelevant or off base that it would be a waste of time to go further. Understanding that this was the norm gave our work further focus: we didn’t want anything we did to be too obvious or irrelevant. We also appreciated that there could be times when the audience in the room had more important things to do, so the presentation could be cut short without sacrificing the punchline. And knowing that when the entire deck was shown, it was because the CG had chosen to spend the remaining 19 minutes considering our work, was also motivating. We all knew that time and attention were at a premium in all activities, and that the same principles should guide even the most formal and routine events: cut anything that is not a good use of time, that does not contribute to the mission and the objectives. Applying these principles consistently helped the entire team stay motivated and focused throughout the loosely organized and often chaotic effort. It also supported our humility as part of something much larger.
Taken together, the briefings told the story of the project. After the operation’s stand-down on 1 June, the team mined their experiences to draw lessons learned. One way we did this was an all-day, multi-stakeholder after action review that took a no-holds-barred approach to identifying what we had learned. Key insights were distilled in written reports and are informing ongoing efforts to better prepare for the next disaster. And now I use the BLUF approach, the focused mission statement, and after action reviews whenever it makes sense.
Photo source: http://humanitarian.mit.edu/projects/haiti-needs