UX Innovation > In-Vehicle UX Blog

Unlearned Lessons from the Uber Crash Report

by Derek Viita | Jun 25, 2018

Late last week, local law enforcement released its report on the fatal crash involving a vehicle from Uber's Advanced Technologies Group (ATG).  The results are sad but predictable:  The operator is being blamed, and charges of vehicular manslaughter are being considered.

Uber test car with bicyclist

Source:  FutureCar.com

Unsurprisingly, as autonomous fleet testing remains a consequence-free business, Uber is soldiering on.  A media report indicates that Uber intends to resume experiments in at least one market as early as this week.  And why wouldn’t they?  Uber needs a commercial fleet of robo-taxis to achieve profitability.  Several of Uber’s competitors are working toward automated transport systems (with Waymo arguably being furthest along), but Uber has the most to gain from such an advancement.

In its test markets, Uber is essentially conducting public-road experiments.  The vehicles operate autonomously in most situations and require operator input only in specific circumstances.  In an open ecosystem with live and unpredictable obstacles, automated systems must bring the test operator back into the loop and restore full situational awareness quickly and intuitively.

The benefit of these highly advanced autonomous systems (whether in on-market vehicles or test cars) is that they allow the operator to engage in non-driving-related behavior.  The drawback of these systems is that they implicitly encourage the operator to engage in non-driving-related behavior.

When presented with the opportunity, operators of systems like these will almost certainly engage in non-driving-related behavior.  At the moment a takeover is required, the onus is on the system (not the operator) to bring the operator back into the loop.  And reaction times tend to be long.

We already know this fact from six decades of human factors research:  Automation must be designed around the user.  No amount of operator training can fix these issues.  Only design can.  In this regard, Uber has failed.  And unless the test operator interface is completely redesigned, they are destined to fail again.

Local, state, and federal governments hold partial blame here as well.  Standards for on-road testing and practices for when things go awry are still works-in-progress.  And assumptions about cause default to human error, discounting design and system error entirely.  Compare Uber’s crash in Tempe to the most recent fatal incident involving a passenger airline in the US.  Despite the core cause of that incident being clearly known, multiple national agencies are still investigating, the airline is still investigating, multiple passengers have filed a lawsuit, and the ultimate outcome will most assuredly involve some mandated change in process and/or design.

Uber says they plan to share more information about program changes “soon.”  But the information shared with local governments and other entities so far has been irregular and unprofessional, and federal agencies are largely taking a hands-off approach.  This suggests that Uber and the public sector have learned nothing of substance from this fiasco.

Any movement from both Uber and all governments must be carefully considered, as these changes will have consequences for rollouts of future systems. In year-over-year trends, Strategy Analytics finds that consumer interest in self-driving features tends to decrease after crashes involving autonomous systems.  Most notable was the first fatal crash involving a Tesla Model S with Autopilot, which led to a dip in consumer interest and trust.  If the industry remains focused on mileage collection (rather than design and use cases), and the government remains oblivious to safety issues, will consumers even want these systems when they are supposedly ready for market?

This is where Uber is in a unique position to contribute.  If Uber would truly like to be viewed as a leader in this space, there are two things they must do:

One:  Announce their intention to suspend Uber ATG on-road tests indefinitely, end all partnerships, and return to the drawing board.  Until it can be proven that Uber’s test cockpit has been rigorously designed to keep test operators and the public safe, they must promise to stay off the road.

So far the status quo for tests from Uber and its competition has involved retrofitting existing HMI, and deactivating certain ADAS features.  Uber can create massive goodwill by admitting that this is terrible practice (perhaps even name-checking certain competitors who are doing the same thing), and insisting on designing a new system with HMI built around the operator.

Two:  Start a coalition on HMI guidelines for autonomous test vehicles with competitors and government agencies.

The government’s hands-off approach will certainly be a good thing for self-driving technology… until it is not.  The only thing separating the industry from bureaucratic red tape, Congressional hearings, and an effective shutdown is one high-profile incident.  Imagine the public backlash after a test vehicle crashes into a group of young children or senior citizens.  Government at all levels would be compelled to respond, and the resulting heavy-handed processes would likely impede development for decades.

Uber can head this off by offering to form a public-private coalition on cockpit design for autonomous transport.  There is some precedent here:  The US government’s effort to establish guidelines for driver distraction was largely born of an early industry-driven effort from the Alliance of Automobile Manufacturers.  Uber could set the tone for future development, be viewed as worthy of consumer trust, and generally be seen as a “good neighbor.”

There is a story to be told which could have a happy ending for Uber’s autonomous ambitions.  But rushing back into its experiments without changes to its design strategy is not that story.  Instead, it is a story that could be pulled straight from Steven Casey’s classic collection of design mishaps, “Set Phasers on Stun.”  One life has been lost, and another life ruined, due to poor design attributed to human error.  All while Uber, the entire industry, and all levels of government are poised to learn nothing, ensuring that this story will be repeated many, many more times.
