Tesla crash on Pennsylvania Turnpike draws attention to driving technology
July 7, 2016 10:38 AM
A Model S Tesla in Autopilot mode. The nonfatal crash on the Pennsylvania Turnpike occurred last Friday.
Joshua Brown, formerly of Murrysville, in a self-driving Tesla vehicle last year.
By Daniel Moore / Pittsburgh Post-Gazette
Days after the first known fatality involving a self-driving vehicle emerged, a second crash — this time on the Pennsylvania Turnpike — has put more pressure on researchers and U.S. auto-safety regulators to keep roads safe while promoting a technology some say could save lives.
The second crash, in which a Tesla Motors sport utility vehicle rolled over on July 1 after slamming into barriers on both sides of the highway about 100 miles east of Pittsburgh in Bedford County, also raised questions about the responsibilities of the driver in vehicles with autonomous driving features. Two people were injured in the incident.
The driver reportedly told Pennsylvania State Police his Tesla Model X was operating in Autopilot, a self-driving feature Tesla rolled out last fall that temporarily takes control of the vehicle. The National Highway Traffic Safety Administration confirmed it is investigating the accident, but could not comment on its findings so far.
The Palo Alto, Calif.-based electric-vehicle maker said in a statement that it had “no reason to believe that Autopilot had anything to do with this accident” based on initial crash data. Beyond a press release with the crash details, state police would not confirm whether the driverless feature had been engaged at the time of the accident.
The Bedford County accident occurred just days after federal authorities launched an investigation of a May 7 crash, also involving a Tesla vehicle, on a highway in Florida. In that crash, Tesla said in a blog post, the car was in self-driving mode. The company said neither the system nor the driver applied the brakes before the car collided with a truck turning in front of it.
The incidents show the current limits of autonomous vehicle technology and how drivers still play a key role in controlling the car. They also reveal the challenges that federal authorities face as they attempt to strike a balance between encouraging innovation in auto safety and keeping roads safe from distracted drivers who may be lulled into a false sense of security.
Driverless vehicle technology has been promoted as a panacea for traffic crashes, which are overwhelmingly caused by human error. Earlier this month, NHTSA released preliminary data showing an 8 percent increase in motor vehicle traffic deaths in 2015. An estimated 35,200 people died in 2015, up from the 32,675 reported fatalities in 2014. NHTSA blamed about 94 percent of those traffic crashes on human error.
But currently, not all driverless technology is fully driverless.
Tesla, which introduced its Model X electric car in 2015 with a base price of $80,000, describes its Autopilot mode as semi-autonomous. Owners can choose to activate it as “an assist feature that requires you to keep your hands on the steering wheel at all times,” the company emphasized in the blog post after the fatal crash in Florida.
By contrast, the self-driving technology that has been under development at Carnegie Mellon University for decades and, for the last year or so, at Uber’s Advanced Technologies Center in the Strip District, aims to make vehicles entirely autonomous.
“Over-reliance creates more risks in using this technology,” said David L. Strickland, a former NHTSA administrator who is leading a newly formed industry group of companies developing fully autonomous vehicle technology. Led by five companies — Google, Uber, Lyft, Ford and Volvo — the Self-Driving Coalition for Safer Streets is pushing for favorable rules ahead of NHTSA’s updated guidance on self-driving cars, expected to come out this summer.
“Drivers’ use and misuse of the technology is something that’s very important, which is why full self-driving is seen as the technology solution,” Mr. Strickland said.
In other words, autonomous vehicle developers have placed blame on the driver’s mistakes — human error that CMU and Uber are seeking to eliminate.
“My member companies and every automaker that’s working on full self-driving technology are absolutely, positively working hard to ensure that when this technology is placed in the hands of consumers, it is going to operate at the highest level of safety,” Mr. Strickland said.
Still, information sharing is at an early stage, and developers are feverishly scaling up their versions of the technology. Uber researchers can be seen regularly testing vehicles on the North Shore and Downtown, and CMU researchers showed off their own car last month.
John M. Dolan, principal systems scientist at CMU’s Robotics Institute, said even if driverless vehicle data were submitted for analysis, it would be difficult to know how the vehicles performed.
“If I drive 1,000 miles with Tesla Autopilot and carefully oversee the driving as I’m supposed to, and if I intervene several times during that stretch in order to prevent an accident or unsafe driving, are those interventions recorded somehow?” he said, giving a hypothetical example. “And even if they are, is there enough information on the circumstances surrounding the intervention to allow me to understand what improvements have to be made?”
“Without standardized platforms,” he added, “it's not always easy to directly apply lessons learned in one context to a different context.”
A news release issued by the state police in Everett, Pa., stated the Tesla Model X in the crash last week was traveling east on the turnpike when it collided with a guardrail on the right side of the road, then a concrete barrier on the left.
It rolled over and came to a stop in the middle of the road.
The driver and a passenger were injured and taken to UPMC Bedford for treatment. The hospital would not release information about their condition.
A second car nearby was struck by debris. The driver and a passenger in that car were not injured.
Daniel Moore: firstname.lastname@example.org, 412-263-2743 and Twitter @PGdanielmoore.