Anyone who’s been a software developer for a while should be aware of the 80/20 rule (also called the Pareto principle).
With respect to software development, it generally means you can deliver 80% of the desired features with 20% of the total effort, while the last 20% of features take 80% of the total effort. It’s particularly relevant for software that is all-or-nothing.
With app development, we tend to think of the minimum viable product (or MVP) – the smallest set of features that make a usable product. But for more complex solutions, the MVP is really the whole product. Thus all-or-nothing.
Are We There Yet?
And that’s the case with self-driving cars. We can’t have autonomous vehicles that don’t have a full set of features. It’s not okay to have a car AI that knows what to do 80% of the time, but 20% of the time gets confused and occasionally runs over children. Imagine the horror of sitting in your AI car and watching helplessly while it mows down some pedestrians.
I was inspired to think about the 80/20 rule with respect to self-driving technology the other day while I was sitting at a congested intersection and a fire truck came roaring through, going into the oncoming lane to bypass all the waiting cars. It’s hard enough for actual humans to figure out a course of action when emergency vehicles are near. I’m sure my brother, a career police officer, could tell some tales about that. But how much harder will it be for someone to make an AI that can take the correct action?
Even a situation where an emergency vehicle is approaching from behind may be tricky. Will the car know that a fire truck needs to pass it and that it should pull to the side? Sometimes I have even proceeded through a red light with a fire truck behind me, to let it get past. Frequently, drivers pull partly off the road. There are so many unique circumstances involving first responders.
Meanwhile, some cars with limited self-driving capability are plowing into lane dividers on the much simpler use case of normal highway driving.
Every bone in my body, informed by decades of software development experience, tells me we are much farther away from fully autonomous vehicles than the proponents and (most of) the pundits would have us believe.
Why? (The Cynical Answer)
So, why are we being led to believe that self-driving cars are just around the corner (no pun intended)?
The cynics among us must at least consider that there is a motive to make us (or at least those of us with money to invest) believe the technology is further along in development than it is.
For example, Uber needs the technology for its long-term survival. They have been fighting court battles to keep treating their drivers badly – trying not to provide benefits to them, winning some battles and losing others. And their business model relies on the drivers’ poor math skills. If the drivers figure out that most of them are actually losing money by driving (when you factor in all the costs, like wear and tear), Uber will have even more trouble recruiting them.
Uber lost almost one billion dollars in 2018. Their only hope of survival is to convince investors that profits are coming soon, and the only way to get profits is to get rid of the drivers. So Uber has a huge incentive to push the idea that the technology is almost where it needs to be. But it clearly is not.
Why? (The Not So Cynical Answer)
If you’re not a cynic, then you could place the blame on programmers. As a group, we tend to be overly optimistic about how good our software is, or how much we can get done in a given timeframe. I like to think I’m among the group who have enough experience to temper our expectations and come up with more realistic appraisals. But that wisdom has come from a long career. Sexy startups tend to hire the hot youngsters out of college and turn them loose. Not that there’s anything wrong with that – as long as you have some wiser employees who can provide a more reasonable assessment of where the project actually stands.
Which Is It?
I actually think that both of the above scenarios play a role in the unrealistic expectations around self-driving technology. Perhaps the programmers at Uber are telling management exactly what it wants to hear, and the managers therefore don’t question it. And perhaps the same thing is happening at the other companies as well.