This is not a drill: The (very) real dangers of bad UX

The recent Hawaii missile alert incident highlights how much UX design matters – both to the people who operate a system and to the people who depend on it.

On 13th January 2018, mobile phone users in Hawaii received the following emergency alert message: ‘Ballistic missile threat inbound to Hawaii. Seek immediate shelter. This is not a drill.’

Fortunately, there was no missile inbound. It was inadvertently issued by a staff member of the Hawaii Emergency Management Agency’s State Warning Point office for a very simple reason – faced with an on-screen menu of choices, they clicked the wrong button, choosing to send a real alert instead of a drill.

A classic case of human error – but the system could have been designed to reduce the chances of this happening. Although a password had to be entered to confirm that the message should be sent, this was required for both the drill and the not-a-drill versions. Unhelpfully, both also used the same password.

As a consequence, nothing informed the user that they had clicked the wrong link and were about to send a live, state-wide alert.

What’s more, the links for the ‘DRILL’ version of the alert and the real alert were very, very similar.

Why this happened

The system’s functionality served its core purpose of distributing messages quickly, but far less thought appears to have gone into how operators would experience the interface.

Judging by the lack of error prevention and error correction measures in the system, UX doesn’t seem to have been a priority during development.

Had it been, a key first step would have been to consider all potential use cases before development began – chief among them, sending a ‘drill’ (test) message and sending a real one.

Taking an end-to-end view of how the software would be used, before implementing it, would have created the opportunity to build error correction in from the start.

How this could have been prevented

The system’s development process appears to have missed one of the bedrocks of user experience – error prevention. Considering and incorporating error prevention throughout a system’s design significantly reduces the risk of a user making a mistake.

A system that does this effectively leads the user through a clear, step-by-step process. It should also make clear visual distinctions – variations in colour, or separate lists – so that drills and real alerts are easy to tell apart.

We’ve outlined an example of how it might have worked better below, with a sequence of separate screens – not just one list – standing between the user and a real alert (a rough code sketch of the flow follows the screens):

Screen 1: ‘Is this a real alert or a drill test?’

The user selects whether they are sending a real alert or running a test.

Screen 2: ‘Is this an amber alert, tsunami alert, or missile alert?’

They then need to specify the type of alert. They would also have the option to provide extra information for recipients (e.g. a URL to a live update feed).

Screen 3: If the user has selected a ‘real’ alert, a visual indicator (a stop sign or warning emoji) appears to warn them. The user clicks ‘Confirm’.

The user has to select a confirmation button to proceed with a real alert.

Screen 4: ‘I want to tell everyone on the island about an impending nuclear strike’. The user clicks ‘Confirm’.

They need to select another confirmation button to make sure they really do wish to proceed. The drill path has fewer steps because the consequences are less severe: after Screen 2, the user can send the test alert straight away.

Screen 5: Click ‘Send’.
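
To make the flow concrete, here is a minimal sketch of that guided process as a small TypeScript module. Everything in it is hypothetical and purely illustrative – the AlertDraft type, the sendAlert function and the confirmation phrase are our own inventions, not part of any real emergency-management system – but it shows the key idea: a live alert cannot reach ‘send’ without passing distinct, deliberate confirmation steps that a drill never shows.

```typescript
// Hypothetical sketch of a guided alert flow. All names and types here are
// illustrative inventions, not taken from any real emergency-management system.

type AlertKind = "amber" | "tsunami" | "missile";
type Mode = "drill" | "live";

interface AlertDraft {
  mode: Mode;       // Screen 1: drill or real alert
  kind: AlertKind;  // Screen 2: type of alert
  infoUrl?: string; // Screen 2: optional link to a live update feed
}

// Screens 3 and 4 exist only for live alerts: a visual warning plus a typed
// confirmation phrase, so the operator cannot reach 'send' by habit alone.
const LIVE_CONFIRMATION_PHRASE =
  "I want to send a real alert to everyone in the state";

function confirmLiveAlert(typedPhrase: string): boolean {
  // Requiring the phrase to be typed (not just clicked) is the error-prevention
  // step the real system lacked: a drill and a live alert can no longer be
  // confirmed in exactly the same way.
  return typedPhrase.trim() === LIVE_CONFIRMATION_PHRASE;
}

function sendAlert(draft: AlertDraft, typedPhrase?: string): string {
  if (draft.mode === "drill") {
    // Drill path: fewer steps, because the consequences of a mistake are minor.
    return `DRILL sent: ${draft.kind} (test only)`;
  }
  // Live path: refuse to send unless the distinct confirmation has been given.
  if (typedPhrase === undefined || !confirmLiveAlert(typedPhrase)) {
    return "Blocked: a live alert requires the typed confirmation phrase.";
  }
  const extra = draft.infoUrl ? ` – more info: ${draft.infoUrl}` : "";
  return `LIVE ALERT sent: ${draft.kind}${extra}`;
}

// The wrong-button scenario now fails safely instead of alerting 1.4 million people.
console.log(sendAlert({ mode: "live", kind: "missile" }));  // Blocked
console.log(sendAlert({ mode: "drill", kind: "missile" })); // DRILL sent
```

The exact wording and technology matter less than the shape of the flow: the drill path and the live path diverge early, and the riskier path demands more deliberate effort from the operator.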

The system’s operators weren’t the only ones failed by its design – the recipients of the message (all 1.4 million of them) were also let down. From a service design perspective, they are just as important as the people operating the system. The usefulness of the alert message itself is questionable, particularly for a ballistic missile strike – providing a number to text or a link for more information would help give recipients a clearer understanding of what they should do next.

For example, San Diego’s tourism board uses an emergency app to warn about incoming tsunamis. But it’s paired with other forms of communication – printed maps are posted to homes and businesses in the inundation zones, and are also available online. Road signs highlight hazardous zones and evacuation routes, alerting and guiding residents to safe areas.

It’s vital that the designer works directly with and for the user from an early stage to ensure a system is effective. If the system doesn’t work well, the person operating it is liable to make mistakes – and those mistakes mean it isn’t just failing the operator, but the recipients too.
