AutoBanter: A Cars Forum



Ghost in Musk's machines: Software bugs' autonomous joy ride



 
 
  #1  
Old October 12th 17, 09:02 PM posted to alt.politics.obama,rec.autos.driving,alt.autos.toyota,sac.politics,alt.society.liberalism
David Fritz

Last year, a dark historical landmark was reached. Joshua Brown became the
first confirmed person to die in a crash where the car was, at least in
part, driving itself. On a Florida highway, his Tesla Model S ploughed
underneath a white truck trailer that was straddling the road, devastating
the top half of the car.

Brown’s crash is well known. But more mundane bugs are finding their way
into increasingly software-dependent, semi-autonomous cars. Software
problems accounted for nearly 15 per cent of US car recalls in 2015, up
from less than five per cent in 2011, according to the most recent report
from financial advisors Stout Risius Ross.

Last year, to name a few examples, Toyota recalled around 320,000 cars after
it found “improper programming” could cause airbags and seatbelt
pretensioners to activate unbidden. Ford had to recall 23,000 cars because
software problems in their electronic windows meant they closed with
excessive force.

Despite the latest wave of excitement about artificial intelligence, the
fear among some of those in the industry is that bugs could prove a
serious hurdle to mass adoption – not least because of the weird,
unexpected nature of the accidents they can cause.

Philip Koopman, an associate professor at Carnegie Mellon University and
an expert on autonomous vehicle safety, told The Reg: “I look at the
errors, and almost always say: ‘Wow, that should not have happened.’ And
the most likely explanation is that they did not follow a safety
standard.”

The “continuous stream” of defects in the car industry signals “underlying
problems: they just don’t want to spend the time and effort to get it
right,” he argues.

Car manufacturers contacted by The Reg were unwilling to talk.

Significantly, many of the companies developing autonomous vehicles are
hiring developers from Silicon Valley whose backgrounds are in
general-purpose software – software that, of course, crashes with reasonable
frequency. They are not hiring from the ranks of the airline safety
industry.

“Knowing how to code is not knowing how to be safe,” Koopman says.

Allegations of poor code go back years. Koopman was an expert witness for
plaintiffs in a 2013 court case in Oklahoma that looked into whether
computer problems had caused a Toyota Camry to accelerate uncontrollably
and crash, killing a passenger in 2007.

He and another investigator found Toyota’s electronic throttle control
system was “just awful.” An 18-month investigation found numerous problems
in the software [PDF], including the potential for stack overflow and no
protection against bit flips – where ambient radiation in the outside
environment can switch a bit. The report concluded Toyota’s code was
“spaghetti.”
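One of the report’s findings – no protection against bit flips – has a
classic embedded-software mitigation: store each safety-critical variable
alongside its one’s complement and verify the pair on every read, so a
single flipped bit in either copy makes the check fail. The C sketch below
illustrates the idea only; all names are invented and it is not Toyota’s
code.

```c
#include <assert.h>
#include <stdint.h>

/* A safety-critical value mirrored by its one's complement.
 * The invariant is: inverse == ~value at all times. */
typedef struct {
    uint32_t value;
    uint32_t inverse;
} guarded_u32;

static void guarded_write(guarded_u32 *g, uint32_t v) {
    g->value = v;
    g->inverse = ~v;
}

/* Returns 1 and stores the value in *out if the pair is consistent,
 * 0 if corruption (e.g. a radiation-induced bit flip) is detected. */
static int guarded_read(const guarded_u32 *g, uint32_t *out) {
    if ((uint32_t)~g->inverse != g->value)
        return 0;               /* mismatch: at least one bit flipped */
    *out = g->value;
    return 1;
}
```

Redundant storage of this kind is one of the memory-corruption
countermeasures discussed in embedded safety guidance such as MISRA C and
ISO 26262, often combined with CRCs over larger memory regions.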

The jury decided the electronics had been at fault and awarded $3m in
compensation. Toyota stands by the safety of its throttle system, a
company spokesman said, pointing out that an earlier official
investigation, partly carried out by NASA, did not find any faults.

Yoav Hollander, founder of Foretellix, a company looking to develop new
ways to find bugs in engineered systems, has for a number of years been
attending conferences on verifying the safety of autonomous cars (and
other autonomous systems). He was not initially impressed by progress,
although he thinks the situation is now “improving.”

One of the issues, Hollander says, is that companies are overly focused on
preventing what he calls “expected bugs” – where engineers anticipate a
problem. This might include making sure that the car cameras can correctly
identify a pedestrian wearing a black coat at night.

But then there are unexpected bugs – problems that no one has thought of,
or situations that have been overlooked. A car designed in the US but
driven in the UK could set off on the right-hand side of the road simply
because its default location is the US – all because a developer forgot to
include an instruction to re-check location after a memory reset.
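Hollander’s hypothetical default-location bug boils down to a few lines.
In the C sketch below (purely illustrative, with invented names), the buggy
version silently falls back to a compiled-in default region after a reset,
while the fixed version only does so when no location fix is available.

```c
#include <assert.h>

typedef enum { REGION_US, REGION_UK } region_t;

/* Compiled-in fallback, reflecting where the car was designed. */
#define DEFAULT_REGION REGION_US

/* Buggy version: after a memory reset, the sensed location is
 * never consulted, so the car always behaves as if it were in
 * the default region. */
static region_t region_after_reset_buggy(int have_gps_fix,
                                         region_t sensed) {
    (void)have_gps_fix;
    (void)sensed;
    return DEFAULT_REGION;      /* forgot to re-check location */
}

/* Fixed version: fall back to the default only when no fix exists. */
static region_t region_after_reset_fixed(int have_gps_fix,
                                         region_t sensed) {
    return have_gps_fix ? sensed : DEFAULT_REGION;
}
```

The point of the example is that both versions behave identically in the
region where the system was developed and tested, which is exactly why
such a bug can survive until the car is driven somewhere else.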

These kinds of “autonomous vehicle only” bugs – mistakes that no human
would ever make – will be big news, Hollander says. “People will say:
‘Hey, I’m at the mercy of the vehicle’.”

The Joshua Brown crash – driving at full speed into a clearly visible
trailer – is arguably one such example as it “would never happen to a
human being,” Hollander says.

After the Florida accident, Tesla reportedly wasn’t immediately sure why
its Autopilot system hadn’t braked. It probed the possibility that the
system deliberately ignored the trailer to avoid braking when approaching
overhead objects such as bridges – an idea supported by an investigation
[PDF] by the National Highway Traffic Safety Administration.

A Reg request for clarification from Tesla went unanswered.

https://www.theregister.co.uk/2017/1...mous_vehicles/
  #2  
Old October 13th 17, 01:28 AM posted to alt.autos.toyota,rec.autos.driving
Your Name[_2_]

On 2017-10-12 20:02:32 +0000, David Fritz said:

> Last year, a dark historical landmark was reached. Joshua Brown became the
> first confirmed person to die in a crash where the car was, at least in
> part, driving itself. On a Florida highway, his Tesla Model S ploughed
> underneath a white truck trailer that was straddling the road, devastating
> the top half of the car.
>
> Brown’s crash is well known.

<snip>

As is the fact that he was to blame, not the Tesla car. No Tesla car is
yet meant to be completely self-driving. He ignored the instructions
and paid the price for it ... as the old saying goes: Read The F'ing
Manual!

 







