Monday, 11 July 2016

MercatorNet: Self-Driving Car Fatality No. 1: Joshua Brown Makes History

Self-Driving Car Fatality No. 1: Joshua Brown Makes History

Tesla has some work to do.
Karl D. Stephan | Jul 11 2016

On May 7 of this year, Joshua Brown, owner of a wireless-network technology company and Tesla car enthusiast, was riding in his Tesla Model S on a divided highway in Florida. Mr Brown loved his car and posted numerous YouTube videos that showed him using the autopilot function in the "look, Ma, no hands!" mode.

By all accounts, Brown was a generous, enthusiastic risk-taker (his specialty when he was in the military was disarming weapons, according to a New York Times report), and hands-free driving went against the explicit instructions Tesla provides for the autopilot feature.  But Tesla owners do it all the time, apparently, and until May 7, Mr Brown had gotten away with it.

Then a tractor-trailer rig made a left turn in front of Mr Brown's Tesla.  According to a statement by Tesla, the trailer's high ground clearance and light colour, which gave it little visual contrast against the bright sky, kept the car's sensors from recognising it, so the automatic braking never engaged.  The Tesla ran underneath the trailer, fatally injuring Mr Brown.

A neighbour quoted Mr Brown afterwards as saying in another context a few weeks before the accident, "For something to catch Elon Musk’s eye, I can die and go to heaven now."  No one knows how serious Mr Brown was when he said that.  But he will go down in history as the first person in the U. S., and perhaps in the world, to die in a car that was operating in its self-driving mode.

Will this tragedy spell doom for self-driving cars?  Almost certainly not.  The first recorded steam-locomotive railway fatality was that of the English politician William Huskisson, who attended the opening ceremonies of the Liverpool and Manchester Railway on Sept. 15, 1830, which featured inventor George Stephenson's locomotive the Rocket.

Wanting to shake the hand of his former political enemy the Duke of Wellington, Huskisson walked over to the Duke's railway carriage, then saw that the Rocket was bearing down on him on a parallel track. He panicked, tried to climb onto the carriage, and fell back onto the track, where the locomotive ran over his leg and caused injuries that were ultimately fatal.  Passengers had been warned to stay inside the train, but many paid no attention.

If Huskisson's death had been mysterious and incomprehensible, it might have led to a wider fear of railways in general.  But everyone who learned of it took away the useful lesson that hanging around in front of oncoming steam locomotives wasn't a good idea, and railways became an essential feature of modern life.  Nevertheless, every accident can teach engineers and the rest of us useful lessons in how to prevent the next one, and the same is true in Mr Brown's sad case.

It's not clear how long Version 7.0 of the Model S software featuring the autopilot function has been available, but it has probably been out for at least a year.  Multiply that time by the number of Model S owners and how far they drive, and you have a track record showing that if anything much is wrong with the software, it's not very wrong.  Model S owners aren't dying like flies in autopilot accidents.  Still, telling drivers how great a self-driving feature is, and then expecting them to pay constant attention as though the car were a driver's ed student and you were the instructor, is sending a mixed message.

Tesla's own posting about the accident cites statistics showing that, if anything, Model S cars have a lower accident rate than average, and that may be true.  But as Tesla's public profile rises, the firm has some delicate manoeuvring ahead of it to avoid becoming a target for lawyers who will want to portray Tesla in court as heedless of driver safety.
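To make the back-of-the-envelope reasoning in the last two paragraphs concrete, here is a minimal sketch, in Python, of the kind of comparison Tesla's posting relies on. The figures below are illustrative placeholders, not numbers taken from the article; the only point is that a fatality rate is cumulative fleet miles divided by fatalities, and that a single event leaves a very wide margin of uncertainty.

# Back-of-the-envelope fatality-rate comparison.
# All figures are illustrative placeholders, not data from the article.

autopilot_fleet_miles = 130e6   # hypothetical cumulative miles driven with the autopilot engaged
autopilot_fatalities = 1        # the single fatality discussed in the article

us_miles_per_fatality = 94e6    # hypothetical US average: miles driven per traffic death

autopilot_miles_per_fatality = autopilot_fleet_miles / autopilot_fatalities

print(f"Autopilot: one fatality per {autopilot_miles_per_fatality / 1e6:.0f} million miles")
print(f"US average: one fatality per {us_miles_per_fatality / 1e6:.0f} million miles")

# With only one event the estimate is statistically fragile: a second fatality
# tomorrow would halve the figure, which is why a single data point proves
# little either way.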

We've known since the earliest days of automobiles that they are dangerous in careless hands and require constant vigilance on the part of the operator.  Plenty of people ignore that fact and pay for it with injuries or their lives, and take the lives of others as well.  But everybody, whether safe or careless, still admits it's a good idea to pay attention while you're driving.

Now, however, something fundamentally new has been added. When a car has a self-driving feature that nevertheless requires you to be ready to take command at a moment's notice, the driver is torn between letting the machine take over and keeping a constant lookout for trouble.  You can't both be constantly vigilant and also watch a Harry Potter movie, as Mr Brown may have been doing at the time of the accident.

In most of us, especially guys, attention is a focused thing that has to be directed at one primary target at a time.   Even if I had a self-driving car (which I don't), and after driving it for a while and learning what it typically can and can't do, I wouldn't feel very comfortable just sitting there and waiting for something awful to happen, and then having to spring into action once I decided that the car wasn't doing the right thing.

That's a big change of operating modes to ask of a person, especially if you've been lulled into total trust of the software by many miles of watching it perform well.  Who wouldn't be tempted to watch a movie, or read the paper, or even sleep?

I'm afraid we've got some institutionalized hypocrisy here, of a kind most auto companies are fortunately free of.  But Tesla is a different kind of beast, founded at a time when anybody who installs software either has to lie about having read dozens of pages of legal gobbledegook, or actually has to read them, before clicking the "I Agree" button.

The impression I have of the arrangement between Tesla and Model S owners is that Tesla pretends that they have to keep their hands on the wheel, and the owners pretend that they're following instructions.  And the pretence has made the lawyers happy, I suppose—until now.

Now that the much-anticipated First Fatality has happened, things could go in any of several directions.  The National Highway Traffic Safety Administration, which is investigating the accident, could come out with a bunch of heavy-handed federal regulations that could squash or set back autonomous vehicles in the U. S. for many years.

Joshua Brown's relatives could mount a lawsuit that could cripple Tesla.  Or (and this is the outcome I'm hoping for), Tesla's engineers can learn what went wrong in Mr Brown's case, fix it, and deliver clearer, more practical instructions on how to use the self-driving feature, along with some human-factors engineering that seems to be missing, so that the remaining Tesla drivers can lessen their chances of becoming Fatality No. 2.

Karl D. Stephan is a professor of electrical engineering at Texas State University in San Marcos, Texas. This article has been republished, with permission, from his blog, Engineering Ethics, which is a MercatorNet partner site. His ebook Ethical and Otherwise: Engineering In the Headlines is available in Kindle format and also in the iTunes store. 

Sources:  Many news outlets carried reports of Mr Brown's death.  Tesla's own posting concerning the incident appeared June 30 at https://www.teslamotors.com/blog/tragic-loss.  I referred to reports on Fortune's online version at http://fortune.com/2016/07/02/fatal-tesla-crash-blind-spot/, the New York Times report on Mr Brown's background at http://www.nytimes.com/2016/07/02/business/joshua-brown-technology-enthusiast-tested-the-limits-of-his-tesla.html, the Tesla press kit on its autopilot at https://www.teslamotors.com/presskit/autopilot, and the Wikipedia article on William Huskisson.  Thanks to my wife for notifying me about the incident.




MercatorNet

Motherhood has made a surprising appearance in the contest for the leadership of the British Conservative Party and the Prime Ministership. Andrea Leadsom, who supported "Leave", actually addressed the party's leadership conference about babies' brains and the importance of the first two years of a child's life for its attachment to the mother -- and father, as Shannon Roberts notes today. She realises that there is something wrong with building the workforce around mothers with young children. And legions of mothers agree with her.
Of course she was put in the stocks by the media for talking about her children (3), but Laura Keynes says there were other issues simmering in the background. I don't know whether she is PM material or not, but I hope she has a good run.


Carolyn Moynihan
Deputy Editor,
MERCATORNET

‘Do you feel like a mum in politics?’
Laura Keynes | FEATURES | 11 July 2016
UK media stir up trouble for Conservative leadership candidate Andrea Leadsom.
Digital intoxication is real. Here’s why
Fabrizio Piciarelli | CONNECTING | 11 July 2016
Text claw, torticollis by text, selfie elbow... Enough already!
What would Abraham Lincoln say to Donald Trump about religion, politics and being a ‘Know Nothing’?
Donald Nieman | FEATURES | 11 July 2016
Make America better?
Is Post-Brexit Britain set to focus on babies?
Shannon Roberts | DEMOGRAPHY IS DESTINY | 9 July 2016
Andrea Leadsom's encouraging speech.
