Here’s What Cops Have To Say When Teslas On Autopilot Crash

Three road incidents from the last week have made it clear that local law enforcement agencies will have to deal with Teslas, in addition to their drivers, on an increasingly regular basis. Cops and firefighters took to social media after each incident to make fun of the errant Tesla drivers, but they made it clear that regardless of driver-assist software, they plan to treat Tesla drivers like anyone else on the road.

After his Tesla ended up in a creek bed, one inebriated driver 70 miles south of San Francisco in Morgan Hill blamed a deer for the accident. And the Morgan Hill Police Department mocked him on Facebook, saying the driver "thought they'd finally found the vehicle capable of driving underwater… turns out they hadn't."

But in the other two incidents, only one of which involved a collision, the drivers blamed another phantom: Tesla's Autopilot.

After police questioned one driver who had stopped on the Bay Bridge between San Francisco and Oakland and was literally asleep at the wheel, he claimed his car's Autopilot software was on at the time. The other driver, in Culver City, California, slammed his Tesla into the back of a fire truck while going 65 miles per hour. That driver also blamed the incident on Autopilot.

Local law enforcement was quick to take to social media to mock these drivers, too.

The Culver City Fire Department shared the incident on Twitter.

And the California Highway Patrol joked that, after they found a drunk driver passed out in his car, the Tesla "didn't drive itself to the tow yard."

But beyond roasting them on social media, what are local law enforcement officials supposed to do when law-breaking drivers claim the vehicle was in control?

The California Highway Patrol is still investigating what caused the crash in Culver City, according to an email from Officer Mike Martis Jr. But the agency told BuzzFeed News it has no special advice for drivers of semiautonomous cars, except a reminder that people driving these cars are ultimately responsible for what happens on the road.

"As changes and advances continue, it is important to keep in mind, whether the driving operations are performed by a person or with the assistance of technology, the driver/operator is still responsible for the safe operation of the vehicle at all times and is required to abide by all existing rules of the road," Martis told BuzzFeed News.

The National Transportation Safety Board and the National Highway Traffic Safety Administration have each dispatched investigators to Culver City to explore the cause of the crash.

Meanwhile, Officer Rueca with the San Francisco Police Department said there are no laws on the books excusing drivers of semiautonomous or driver-assisted vehicles from bad behavior.

"The person is still supposed to be in control of the vehicle. They're still considered driving," he said. "Even though this technology is out there, there aren't any changes in the law." In the event of an accident, Rueca said the circumstances of a Tesla crash would be investigated the same way as those of a normal car.

The three recent incidents are not the first of their kind. There have been a handful of accidents in the past where drivers blamed Autopilot: in Dallas, in California, and in Florida.

In 2016, a Tesla driver was killed while operating a vehicle equipped with Autopilot. The National Highway Traffic Safety Administration investigated, and not only exonerated Tesla, but found that the company had tried to prevent customers from becoming distracted and over-relying on Autopilot, according to the Verge. "It appears that Tesla's evaluation of driver misuse and its resulting actions addressed the unreasonable risk to safety that may be presented by such misuse," the report said.

Tesla told the Washington Post, "Autopilot is intended for use only with a fully attentive driver."

But in August, Adrian Lund, the president of the Insurance Institute for Highway Safety, told Bloomberg that there's real concern over how semiautonomous vehicles make it easier for drivers to zone out. "Everything we do that makes the driving task a little easier means that people are going to pay a little bit less attention when they're driving," he said at the time.

Even though carmakers have made it clear that drivers still need to be vigilant behind the wheel of semiautonomous cars, and driver-assist software tends to reduce accidents overall, it's possible that the existence of the software is making drivers less alert.

Exactly how fully autonomous vehicles on the road in California will be regulated is still being hashed out. But already, Google's self-driving operation Waymo is working with local law enforcement on how to "recognize and then access" a self-driving car following a collision, and communicate with Waymo about the incident.

For the semiautonomous cars that are already on the road, their drivers, including all owners of brand-new Model 3s, should continue to follow the existing rules of the road.
