Published January 18, 2018
The false missile alert that sent Hawaii running for cover last Saturday is the result of a poorly designed alert system, not employee negligence.
That’s according to Ann Bisantz, professor and chair of the Department of Industrial and Systems Engineering. Bisantz studies human factors engineering, which focuses on how people interact with machines, systems and work environments.
“In the end, it is not the employee’s fault. It should never have been that easy to make the wrong choice,” says Bisantz. “This is interface design 101 — you do not put the test option next to the live option, nor make the activation methods so similar. Instead, there should be interlocks within the system to prevent a false alert from being sent.”
Since the scare, Hawaiian officials have made a number of changes to the system and their procedures, including a requirement that two people (instead of one) send out test and real alerts, and the installation of a cancellation command that can be triggered within seconds of the initial alert.
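The safeguards Bisantz describes can be sketched in code. The following is a minimal illustration of a two-person interlock and a cancellation window, not the actual Hawaii emergency-management software; the class, method, and parameter names are all hypothetical.

```python
class AlertConsole:
    """Hypothetical alert console illustrating two safeguards:
    an interlock requiring two distinct operators to confirm
    before a live alert goes out, and a short cancellation
    window after the alert is sent."""

    def __init__(self, cancel_window_s: float = 60.0):
        self.cancel_window_s = cancel_window_s  # seconds allowed for cancellation
        self._confirmations = set()             # operator IDs that have confirmed
        self._sent_at = None                    # timestamp when the alert was sent

    def confirm(self, operator_id: str) -> None:
        # Each operator confirms independently; the same operator
        # confirming twice does not count as two people.
        self._confirmations.add(operator_id)

    def send_live_alert(self, now: float) -> bool:
        # Interlock: refuse to send unless two distinct operators confirmed.
        if len(self._confirmations) < 2:
            return False
        self._sent_at = now
        return True

    def cancel(self, now: float) -> bool:
        # Cancellation succeeds only within the window after sending.
        if self._sent_at is None or now - self._sent_at > self.cancel_window_s:
            return False
        return True
```

In this sketch, a single operator (or one operator clicking twice) cannot trigger a live alert, and a mistaken alert can still be retracted within the cancellation window, mirroring the procedural changes described above.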
Bisantz says these are common-sense improvements that should help prevent a similar occurrence. However, she says Saturday’s false alert illustrates the need to further integrate human factors engineering into the workplace.
“As we become more and more reliant on systems, procedures and machines, we must take the time to consider how humans interact with them. Otherwise, we risk more events like what happened in Hawaii,” she says.