I wasn't in the room where it happened, but I experienced the consequence.
On Saturday, Jan. 13, 2018, somebody pushed a button in Hawaii that sent an "Emergency Alert" text message to cell phones statewide:
"BALLISTIC MISSILE THREAT INBOUND TO HAWAII. SEEK IMMEDIATE SHELTER. THIS IS NOT A DRILL."
The warning was also automatically broadcast on television and radio. It caused widespread panic. Residents and visitors huddled in closets and basements. Tourists vainly sought clarification from their hotels about what to do. People hugged their children and called their loved ones. Thirty minutes later, another alert declared a false alarm.
The visibly shaken governor appeared on television with the equally discouraged Hawaii Emergency Management Agency administrator and declared, "This should not have happened."
He said, "An employee pushed the wrong button."
There will be an investigation and officials will be updating procedures, but let me suggest that some guidelines and procedures might have helped avoid the whole thing in the first place.
Here's what I know from reading press reports:
- The accident occurred at a shift change.
- During the shift change, there is some kind of routine run-through of a procedure very similar to the actual alert procedure.
- In an actual alert, there is a button to click on the screen that will cause the warning to be generated. One person is responsible for pressing this button.
- There is a confirmation dialog box that follows the button press.
- There is no standardized false alarm alert mechanism.
We can talk all day about hindsight, but the human-centered design process in software development is there to provide foresight, and every one of the five points above is a red flag for practitioners experienced in its application.
Prioritizing People
First, exactly what is human-centered design?
It is an approach to system development that prioritizes the experiences of the people who will be using the system. It takes into account how people perceive information in all senses; what people are capable of doing physically with their hands, fingers, eyes and whatever else they are using to interact with a computer system; how people process information, what they can remember and what taxes their information processing capabilities or confuses them; how human feelings and emotions affect performance and attention; how the context of people's activity influences what they think and do; and so on.
A human-centered designer is therefore someone with knowledge about the behavioral, cognitive, and physiological sciences, who also knows about the design of interactive computing systems. They should be part of any development team, and play just as important a role as the best software engineer or programmer.
The Human(s) In The Loop
Now, to the points above.
- The shift change. Think to yourself what is happening at a shift change. The activity is simple to describe: one person is leaving and another is taking over their job. From a technical perspective, there will be a bunch of steps for transferring the activities. But think about it more deeply from the human perspective. How do you feel at the end of a shift? Anxious to get going? In a hurry? What's on your mind? The drive? Picking up the kids? Stopping at the store? This is a classic situation where people are distracted, inattentive, and cognitively overloaded. It is a common place where errors happen.
- The routine run-through. A typical shift change involves following a checklist of actions. This sounds like a great idea for making sure that people complete every step, but again think deeply about how humans master routines. The human brain is an automated routine learner, and the point of learning a routine is to free up attention. When you were first learning to drive, every movement was an agonizing effort and all attention was on the placement of your hands, the pressure of your foot on a pedal, where you should be looking, and what the next step was. Once you became an expert, however, you could drive "without thinking." Now, when you drive your attention can be elsewhere and you can sometimes complete a journey without even remembering what you are doing. But this routinization comes with a cost. Have you ever driven to the wrong place because you weren't paying attention? Your driving was excellent, but the whole process was on automatic and the destination was wrong. And, by the way, the wrong destination was a well-practiced one like home. If the run-through at the shift change was almost exactly the same as the actual process for generating an alert, the chances of triggering an alert by mistake because of routinization were increased.
- A button click sends an emergency alert to the entire state. There is a mismatch here between the simplicity of the action and the magnitude of the consequence. While speed is of the essence, there is a tradeoff between simplicity and outcome. There are many ways around this problem; for example, just labeling the button with appropriate verbiage, including a warning icon, and using a danger color (i.e., red) might be enough. Requiring a confirmation as a second step is another approach, which was used here but was ineffective (we will see why in No. 4 below). Perhaps the most obvious fix here is to require a second person to verify the action.
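To make those safeguards concrete for the software folks, here is a minimal sketch of what "type the consequence, then get a second person to verify" might look like. This is purely illustrative; the function name, phrase, and messages are hypothetical and are not drawn from the actual Hawaii alert system:

```python
# Hypothetical sketch: gating a high-consequence action behind a typed
# confirmation phrase and a second person's approval, rather than a click.
# All names and strings here are invented for illustration.

def request_alert(typed_phrase: str, second_officer_confirms: bool) -> str:
    """Return 'SENT' only when both human safeguards are satisfied."""
    # Safeguard 1: the operator must type out the consequence,
    # which defeats the single-click, on-automatic button press.
    if typed_phrase != "SEND LIVE ALERT TO ENTIRE STATE":
        return "BLOCKED: confirmation phrase does not match"
    # Safeguard 2: a different person must independently approve.
    if not second_officer_confirms:
        return "BLOCKED: second-person verification missing"
    return "SENT"
```

The point of the typed phrase is that typing a full sentence cannot be routinized into a single motor action the way two button clicks can; the point of the second person is that two people are unlikely to be on automatic in exactly the same way at the same moment.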
- A confirmation dialog didn't work. How many times have you seen "Are you sure? Yes, No"? Plenty, no doubt. How many times have you found yourself swearing after pressing "Yes" too hastily? This is one of the most common questions that an IT consultant asks: "Why did you press to confirm?" And the most common and frustrating answer is, "I don't know." Well, the reason is the same as No. 2 above: routinization. When you are on automatic, you don't see and you don't think, at least not consciously. Routinization will combine any commonly occurring sequence of actions into a single action without your awareness. That means that pressing the "Send Alert" button (or whatever it is) and pressing the "Yes" to confirm button are actually not two actions in the human mind, but really just one action. After happening together a few times, the two button presses are programmed into a single movement by the brain, and this movement cannot even be stopped once it starts. Hence the befuddlement afterwards: "I don't know why I did that."
- There is no standardized false alarm mechanism. Everyone makes mistakes. Error recovery is just as important in system design as any other function. This means that the designers must anticipate errors and provide ways to recover. The human-centered design literature is full of advice on how to do this. First, anticipate the errors by using what is known about human behavior. The four issues just discussed give you signposts about where errors are likely. Second, design to impede or block errors. If the alert button were a physical button, there might be a flip-open door blocking it. What might the screen version of this blocker be? Third, design to recover from errors. This means two things: tell the person what happened and explain exactly how to recover. In this event, the person who made the error apparently didn't know what they did until they actually received the emergency alert on their own phone. There needs to be immediate feedback that says something like "An emergency alert has been sent statewide," and this message should not appear in any other circumstance, including simulations or training. After that, there needs to be a very fast recovery mechanism, with instructions, like "Press this button to send a retraction." In the actual event, the lack of a standard recovery procedure meant that it took almost 30 minutes to send a text explaining that there was a false alarm.
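The feedback-plus-retraction pattern just described can also be sketched in code. Again, this is a hypothetical illustration with invented names, not a description of any real alert console:

```python
# Hypothetical sketch of the two recovery properties described above:
# (1) unambiguous feedback the moment a live alert goes out, feedback
#     that never appears in drills, and
# (2) a pre-built, one-step retraction path.

class AlertConsole:
    def __init__(self) -> None:
        self.messages = []      # feedback shown to the operator
        self.live_alert = False

    def send_alert(self, drill: bool) -> None:
        if drill:
            # A drill must look unmistakably different from the real thing.
            self.messages.append("DRILL ONLY: no message sent to the public")
            return
        self.live_alert = True
        # Immediate feedback reserved exclusively for live alerts:
        self.messages.append("An emergency alert has been sent statewide")
        self.messages.append("If this was an error, press RETRACT now")

    def retract(self) -> str:
        # A standardized recovery path: seconds instead of 30 minutes.
        if not self.live_alert:
            return "Nothing to retract"
        self.live_alert = False
        return "FALSE ALARM retraction sent statewide"
```

The design choice worth noting is that the retraction message is authored and wired up before any error ever happens, so recovery does not depend on someone improvising under extreme stress.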
These are all design guidelines that the HCI expert should have on hand to help during development. However, not everything can be foreseen, and so there are other important HCI processes that need to be carried out when developing systems such as these.
Scenarios
If we focus on the button, the screen, and the moment, then we miss the larger picture. This event happened in the course of an activity with many steps, and at each step there were branches for doing something else. In scenario-based design, all of the possible things that a person might do while interacting with a computer system to accomplish a range of tasks are explored. This includes all of the possible things that a person might do wrong, and all of the paths of recovery. Note that this is not computer programming. This is not coding. This is thinking about what people do, which is the first principle of human-centered design.
Rapid Prototyping
Scenarios can be tested with very basic versions of software, called prototypes. Prototypes can be very simple, even based on paper or cards, and are designed to be modified easily and thrown away when they don't work. Prototype testing should be done during the design process, using real people who know the tasks (not the programmers or designers), and it should include all activities, including errors. Although this is also not programming per se, programmers and software engineers should be involved in prototyping, since it will ultimately guide their implementation.
HCI To The Rescue
Again, I do understand that it is always easy to criticize in hindsight. But, when I hear that someone "pushed the wrong button," I cannot let that stand as an acceptable reason to scare the wits out of a million and a half residents and a quarter of a million visitors to our state. I don't blame the person who pushed the button, but I do question the designers and developers of a system in which such a thing could happen.
Human-centered design and HCI are often not taken seriously because they add time and cost, they have roots in the social and behavioral sciences which can be antithetical to STEM practitioners, and they require interactions between cultures that often find it hard to understand each other. It is not an uncommon practice to release apps and even larger and more complex programs in beta versions with the idea that problems will be discovered in the field and fixed in upgrades. But upgrade cycles based on widespread failures in the field are not good practice, and iterative fixes are very different from good design. In application contexts as critical as statewide emergency alerts, poor design is completely unacceptable.
So, my message to the poor governor and his staff is: By all means proceed with fixes and procedural work-arounds to keep this from happening again. But I would also let loose on the emergency alert system some good interaction designers who are knowledgeable about humans and familiar with human-centered principles. They will come to your rescue before you need it.