By Jonathan Brown, Digital Architect, SDLC Partners

Every time we adopt a new technology, we also adopt all of the work to keep that technology running.

I am a perpetual optimist when it comes to new tech. I always forget to factor the “keeping it running” work into my vague mental calculations of how new tech will improve my life. When I decided to buy smart home lightbulbs, for example, I didn’t consider how frequently I would walk into a dark room that I could not light, because the Internet was down.

Even the cloud applications we adopt cause us to do work to support them. The promise was that by moving to the cloud, we would need to do less application management. Alas, it is not so. My Mint account, for example, continuously loses connectivity to one of my banks, and I need to log in and reset passwords to reconnect it.

We call this extra work excise because it puts a tax on our goal-directed work. The problem with excise is that the effort we expend doesn’t go directly towards accomplishing our goals. It’s taxing.

Reducing Excise and Ensuring Technology Serves Us

Where we can eliminate excise, we make people more effective, more productive, and happier.

It starts with understanding users by observing and engaging them. End users know better than anyone what annoys them about the software they use every day, so it seems obvious to start by talking to them. But truly seeing through the users’ lens means more than simply asking people what their problems are.

Henry Ford famously said that if he’d asked consumers what they wanted, they would have told him a faster horse, not a Model T. We must take time to understand what they are really asking for and what goals underlie those requests.

Drawing on years of design research in practice, I believe the combination of observation and one-on-one interviews is our most effective tool for gathering qualitative data about users and their goals. We call this method ethnographic interviewing, borrowing from anthropology, where ethnography means the systematic and immersive study of human cultures. Ethnographic interviews take the spirit of that research and apply it at a micro level.

What’s critical to understand is that what people say, what people do, and what they say they do are entirely different things.

The only way to determine how “what people say” is different from “what people do” is to observe them. Our technique builds upon the master-apprentice model of learning: observing and asking questions of the user as if they are the master craftsman, and the interviewer the new apprentice. And it doesn’t require observing and interviewing dozens of users to yield invaluable input. By observing five of each type of user, we expose about 80 percent of usability issues.
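The diminishing returns behind that “five users” figure can be sketched with the problem-discovery model published by Nielsen and Landauer, P(n) = 1 − (1 − L)^n, where L is the share of usability problems a single user session reveals (commonly estimated at around 0.31). This is an illustrative model, not a figure from the article itself:

```python
def problems_found(n_users: int, l_per_user: float = 0.31) -> float:
    """Estimated fraction of usability problems exposed after n user sessions,
    using the Nielsen-Landauer discovery model P(n) = 1 - (1 - L)^n."""
    return 1 - (1 - l_per_user) ** n_users

# Each additional user re-discovers problems earlier users already hit,
# so coverage climbs quickly and then flattens out.
for n in (1, 3, 5, 10):
    print(f"{n:2d} users -> {problems_found(n):.0%} of problems")
```

With L ≈ 0.31, five users surface roughly 84 percent of problems, consistent with the article’s “about 80 percent”; doubling the effort to ten users buys comparatively little.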

Donald Norman, the author of Living with Complexity, said, “We must design for the way people behave, not for how we would wish them to behave.” When we understand and translate that into our technology, we can build the best software that serves us.

When We Do It Right

When technology does an excellent job of serving our human needs, the computer does the work, and the person does the thinking. Take, for instance, PathAI’s computer vision model that has improved breast cancer biopsy accuracy from 85% to 99.5%. Their technology assists physicians in their work to review an average of 50 slides per patient, each containing hundreds of thousands of cells.

PathAI can detect abnormalities in just minutes and present them for the pathologist to review. Pathologists then apply their experience, and holistic knowledge of the patient, to assess each case.

When We Do It Wrong

Recently, I talked to my doctor about his electronic health record (EHR). He said, “When they installed my EHR, I thought it was going to be a great tool for me to use…now I realize I am a tool for it to use.”

According to a 2017 Annals of Family Medicine study, doctors spend about five hours a day with their patients but almost six hours a day on their EHRs. While sitting with the patient, doctors have to cognitively switch between focusing on the record and focusing on the patient. The process is not unlike texting and driving, and it invites errors. An interview with one emergency room (ER) physician noted that the average ER doc will make 4,000 mouse clicks over the course of a shift, saying, “…the odds of doing anything 4,000 times without an error are small.”
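That physician’s intuition can be made concrete with a little probability. Assuming some hypothetical per-click slip rate p (the 0.1% used below is an illustration of mine, not a figure from the article or the interview), the chance of at least one error in n clicks is 1 − (1 − p)^n:

```python
def p_at_least_one_error(p_per_click: float, clicks: int) -> float:
    """Probability of at least one slip across `clicks` independent actions,
    i.e. the complement of an entirely error-free run."""
    return 1 - (1 - p_per_click) ** clicks

# Even a one-in-a-thousand slip rate per click makes an error-free
# 4,000-click shift very unlikely:
print(f"{p_at_least_one_error(0.001, 4000):.1%}")  # ~98.2%
```

Tiny per-action error rates compound relentlessly over thousands of repetitions, which is exactly why every click of excise we remove matters.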

Humans err continually; it is an intrinsic part of our nature. System design should take this into account.

When It Goes Horribly Wrong

When we don’t account for user errors, those errors can be deadly.

In 2015, a young lawyer had been suffering from severe headaches for two days, and a disorienting fever left him struggling to tell the 911 operator his address.

Suspecting meningitis, a physician at the hospital performed a spinal tap, and the next day an infectious disease specialist typed in an order for a critical lab test — a check of the spinal fluid for viruses, including herpes simplex — into the hospital’s EHR.

The hospital had a multimillion-dollar system, considered by some to be the Cadillac of medical software. The order created by the physician appeared on the EHR screen, but it was not sent to the lab. It turned out the software didn’t fully “interface” with the lab’s software. The patient’s results and diagnosis were delayed — by days — during which time he suffered irreversible brain damage from herpes encephalitis.

The software company denied any liability or defects in its software; the company said the doctor failed to push the right button to send the order.

Blaming the Humans

“The idea that a person is at fault when something goes wrong is deeply entrenched in society. That’s why we blame others and even ourselves. Unfortunately, the idea that a person is at fault is embedded in the legal system.” (Donald A. Norman, The Design of Everyday Things)

In the previous example, the software company quietly paid $1 million to settle the suit in July 2018, while the hospital and two doctors paid a total of $7.5 million, and a case against a third doctor is pending trial.

Our technology is costing our users dearly. Worse, blaming the person does not fix the problem: the same error is likely to be repeated by someone else. As technologists, our goal must be software that avoids grave consequences by making user error impossible, or at the very least, reversible. Every additional click we demand of our users makes those errors more likely, so we must work diligently to remove UI excise.