

Is Automated Driving Safe Yet?

Tech News: 7th August 2024


In light of two recent reports of motorcyclists being killed in collisions with Tesla vehicles in Autopilot mode, we take a close look at whether automated driving is really safe.

The Deaths of Two Motorcyclists

If automated driving is a factor in deaths on the roads, it seems legitimate to ask the question ‘is automated driving safe?’, especially since it’s a very new technology. In fact, there have been some widely publicised reports of deaths linked to vehicles operating in Autopilot mode in the US in recent times. For example:


– In 2022, a 34-year-old motorcyclist was killed in Utah when his Harley-Davidson was hit by a Tesla Model 3 on Autopilot, reportedly travelling at 75-80 miles per hour. The parents of the victim have reportedly now sued Tesla and the vehicle’s driver, claiming that the driver-assistance software and other safety features are “defective and inadequate.”

– In April this year, a 28-year-old motorcyclist from Stanwood, Washington, was struck by a 2022 Tesla Model S. It’s been reported that the driver of the Tesla told first responders that he had been looking at his phone while the car was driving itself (in Full Self-Driving mode – FSD) when the car suddenly lurched forward, hitting the motorcycle. Washington State doesn’t actually permit self-driving vehicles to operate on the roads unless they have a testing arrangement with the Department of Licensing.

Other Instances 

There have been other serious recent accidents involving vehicles driving in Autopilot mode, including one in the US this month (August 2024), in which the driver of a Tesla was killed after the vehicle failed to navigate a highway ramp while on Autopilot, leading to a collision.

In fact, it’s been reported that the National Highway Traffic Safety Administration (NHTSA) in the US has identified 13 fatal crashes related to Tesla’s Autopilot.

Not Just Teslas With Driver-Assistance Systems

Although Tesla vehicles have reportedly been involved in several incidents, and Teslas have a feature known as ‘Autopilot’ (which includes Traffic-Aware Cruise Control and Autosteer) as well as a ‘Full Self-Driving (FSD) Package’, Tesla is not the only brand of vehicle with advanced driver-assistance systems.

Other examples include:

– The Ford Mustang Mach-E and F-150 Lightning have ‘BlueCruise’, which offers hands-free driving on pre-mapped highways, adaptive cruise control, lane-keeping, and speed sign recognition.

– The General Motors (GM) Chevrolet Bolt EV, Cadillac LYRIQ, and GMC Hummer EV have ‘Super Cruise’, which offers hands-free driving on compatible highways, lane change on demand, and automatic lane-centering.

– BMW iX and i4 models have ‘Driving Assistance Professional’, which offers adaptive cruise control, lane-keeping assist, and Traffic Jam Assist.

– Mercedes-Benz EQS and EQE models feature ‘Drive Pilot’, a Level 3 autonomous driving system that operates in specific conditions, primarily on highways.

– Audi e-tron and Q4 e-tron models feature Traffic Jam Pilot (available in limited markets), which offers Level 3 autonomous driving in traffic jams on certain roads.

– Other makes/models with similar driver-assistance features include Nissan (Ariya), Hyundai/Kia (Hyundai Ioniq 5 and Kia EV6), Lucid Motors (Lucid Air), and Rivian (R1T and R1S).

No Vehicle On The Market Is Fully Self-Driving

Despite many makes/models offering advanced driver-assistance systems, some marketed as having full autopilot or full self-driving capabilities, it’s essential to note that no vehicle on the market is truly “full self-driving” as defined by the highest levels of autonomous driving (Level 4 or 5), where no human intervention is required. Most systems are classified as Level 2 or Level 3, which still require driver supervision.

Levels 

To briefly summarise what each driver-assistance level actually means (a short illustrative sketch follows the list):

– Level 0 means no automation, i.e. the human driver is entirely responsible for controlling the vehicle. 

– Level 1 (Driver Assistance) is where the vehicle can assist with either steering or acceleration/deceleration using information about the driving environment, but not both simultaneously – e.g. cruise control. 

– Level 2 (Partial Automation) means the vehicle can control both steering and acceleration/deceleration, but the human driver must monitor the driving environment and be ready to take control at any time. One important and relevant example of this is Tesla’s Autopilot. 

– Level 3 (Conditional Automation) refers to the vehicle being able to handle all aspects of driving in certain conditions but the human driver must be ready to intervene when requested, e.g. Audi’s Traffic Jam Pilot. 

– Level 4 (High Automation) means the vehicle can perform all driving tasks in specific conditions without human intervention. Human driver control is only needed outside these conditions. 

– Level 5 (Full Automation), which no vehicle on the market currently has, means the vehicle can handle all driving tasks under all conditions, without any human intervention. 
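For readers who like to see the distinction in a more concrete form, below is a minimal, purely illustrative Python sketch (not taken from any manufacturer’s software) that encodes these levels and the key practical question raised above: does the human driver still have to supervise?

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """Simplified summary of the SAE driving automation levels described above."""
    NO_AUTOMATION = 0           # human driver does everything
    DRIVER_ASSISTANCE = 1       # steering OR acceleration/braking support, not both
    PARTIAL_AUTOMATION = 2      # steering AND acceleration/braking, driver must supervise
    CONDITIONAL_AUTOMATION = 3  # system drives in certain conditions, driver must take over on request
    HIGH_AUTOMATION = 4         # no human intervention needed within specific conditions
    FULL_AUTOMATION = 5         # no human intervention needed at all (not yet on the market)

def driver_must_supervise(level: SAELevel) -> bool:
    """At Levels 0-2 the human remains responsible for monitoring the road at all times."""
    return level <= SAELevel.PARTIAL_AUTOMATION

# Example: Tesla's Autopilot and FSD are classed as Level 2, so supervision is still required.
print(driver_must_supervise(SAELevel.PARTIAL_AUTOMATION))  # True
```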

Tesla’s Autopilot & FSD 

As stated above, Tesla’s Autopilot, which was reportedly being used by some drivers involved in fatal collisions, is only Level 2 automation, i.e. partial automation, where the driver must monitor what’s happening and be ready to take control.

The so-called Full Self-Driving (FSD) Package from Tesla includes advanced features like Navigate on Autopilot, Auto Lane Change, Autopark, Summon, and Traffic Light and Stop Sign Control. It is an upgrade over the standard Autopilot, which includes basic adaptive cruise control and lane-keeping. However, despite its name, FSD is NOT fully autonomous and still requires driver supervision. In fact, it is considered Level 2 automation, where the system can control both steering and acceleration/deceleration, but the driver must remain attentive and ready to take control.

Tesla says that its Autopilot mode is “intended to be used only with a fully attentive driver. It does not turn a Tesla into a self-driving vehicle, nor does it make a vehicle autonomous”. Tesla also states that “before enabling Autopilot”, you must agree to “keep your hands on the steering wheel at all times” and to always “maintain control and responsibility for your vehicle.” 

Driver Error? 

With Tesla’s Autopilot and FSD clearly not making a vehicle fully autonomous, despite feature names that some drivers may assume suggest greater autonomy, many of the reports of accidents do appear to show drivers doing other things and not paying attention at the wheel. For example, in the case of Jeff Nissen’s death (the 28-year-old motorcyclist from Stanwood, Washington, reportedly hit by a Tesla in Full Self-Driving mode), it was reported that, by the driver’s own admission, his attention was elsewhere (checking his phone). Also, in a crash in March 2018 involving Apple engineer Walter Huang, who was driving his Tesla Model X on Autopilot, it was reported that the driver was playing a video game on his phone at the time, suggesting that a lack of driver attention may have contributed to the severity of the crash.

It was also reported in March that six weeks before the first fatal U.S. accident involving Tesla’s Autopilot in 2016, Tesla’s president Jon McNeill tried a Model X and emailed feedback to automated-driving chief Sterling Anderson (cc’ing Elon Musk) saying (March 25, 2016): “I got so comfortable under Autopilot, that I ended up blowing by exits because I was immersed in emails or calls (I know, I know, not a recommended use)”.  

It should also be noted that Tesla vehicles today use a combination of visual and audible alerts to prompt the driver to pay attention to the road. In fact, recent updates have also incorporated driver monitoring through the cabin-facing camera to detect if the driver is looking away from the road for too long.

 

Systems At Fault? 

Some people blame the vehicle manufacturers for perhaps leading drivers to be overconfident in driver-assistance systems, while others have suggested that the systems themselves may not work as they should. For example, in the case of the motorcyclist killed in a 2022 crash involving a Tesla Model 3 on Autopilot in Utah, the motorcyclist’s parents have sued Tesla (and the vehicle’s driver), claiming that the driver-assistance software and other safety features are “defective and inadequate”.

Why? 

A recent Wall Street Journal investigation, “The Hidden Autopilot Data That Reveals Why Teslas Crash”, looked at alleged safety concerns with Tesla’s Autopilot. It highlighted issues with Tesla’s camera-based system, showing that it struggles with low visibility and obstacle detection. According to the article, an analysis of over 200 crashes revealed problems like sudden veering and failure to stop. Driver over-reliance on Autopilot and “phantom braking” were also highlighted as significant concerns. The investigation used data and video from crashes and emphasised a need for improved safety measures and transparency from Tesla.

Driverless Services

As mentioned, fully autonomous vehicles are not yet (widely) available to the general public. However, some pilot programs and limited deployments of driverless services are taking place in certain locations. Here are some notable examples:

Waymo (a subsidiary of Alphabet Inc.) has been testing fully autonomous vehicles in several locations, including Phoenix, Arizona. It offers a limited public ride-hailing service called Waymo One, where some rides are conducted without a safety driver in the vehicle. However, it has encountered some issues recently, leading to increased scrutiny and regulatory action. The NHTSA launched an investigation into Waymo following 22 incidents in which its autonomous vehicles were involved in collisions or potentially violated traffic laws.

These incidents included crashes with objects such as gates and parked cars, and instances where the vehicles’ automated driving system appeared to disregard traffic control devices. Despite these issues, the company has stated that it is proud of its safety record, having driven tens of millions of autonomous miles.

Waymo is cooperating with the NHTSA to address these concerns, although it has also been subject to recalls. In June 2024, the NHTSA obtained a voluntary software update from Waymo to address a defect that affected the vehicles’ ability to accurately detect and respond to poles near the driving path. This recall was part of the regulatory body’s approach to ensuring the safety of automated driving systems.

Cruise (backed by General Motors) operated autonomous vehicles in San Francisco but recently faced significant challenges, leading to a nationwide suspension of its driverless car operations. This decision came after a series of incidents involving Cruise vehicles, including a notable accident in San Francisco in which a pedestrian was injured. The NHTSA launched a federal investigation into these incidents, prompting Cruise to pause operations to reassess and improve its safety protocols. The company announced that it is taking a proactive approach to rebuild public trust by examining its processes and ensuring safety is prioritised.

Apollo (part of Baidu) offers the ‘Apollo Go’ service in China (in cities like Beijing, Changsha, Cangzhou and Wuhan), providing fully autonomous rides in specific zones. Apollo has experienced some issues as it continues to expand its robotaxi services. For example, in Wuhan, Baidu has deployed a large fleet of robotaxis, and while these vehicles make up only a small portion of the city’s total taxis, they have been causing significant traffic problems.

The robotaxis are reported to drive too cautiously, leading to traffic jams and frustration among residents. Despite these issues, Baidu has made substantial progress in scaling its operations and has been expanding rapidly (it was the first to offer a 24/7 service in China), with an aggressive rollout strategy that includes heavily discounting rides to compete with traditional taxis. However, this approach has raised concerns about the long-term viability of its business model. Additionally, there have been incidents involving Apollo Go vehicles, such as minor accidents with pedestrians, which have stirred public debate and highlighted ongoing safety and integration challenges.

Overall, while Baidu’s Apollo program is advancing rapidly, it faces challenges related to traffic integration, safety, and economic sustainability as it works to improve its autonomous vehicle technology.

Zoox (an Amazon subsidiary) has been testing its custom-built autonomous vehicles for public transport in specific areas, and it has recently faced regulatory scrutiny and challenges in its operations. The NHTSA has opened an investigation into Zoox following incidents involving unexpected braking, which may pose rear-end crash risks. These incidents involved Zoox vehicles equipped with its automated driving system and occurred during daylight within the operational limits of the system. The investigation aimed to assess the performance of Zoox’s Automated Driving System, particularly concerning crosswalk behaviour and rear-end collision scenarios. In response, Zoox has stated its commitment to transparency and collaboration with regulators to address these concerns.

Zoox has been expanding its vehicle testing in various locations, including California, Nevada, Austin, and Miami. Despite the investigation, the company continues to explore new markets and refine its technology. However, as Zoox expands, it faces the ongoing challenge of integrating autonomous vehicles into urban environments.

Motional (a joint venture between Hyundai and Aptiv) is testing fully autonomous vehicles in Las Vegas and plans to offer a commercial service in partnership with Lyft. However, the company laid off about 40% of its workforce and announced plans to pause some of its robotaxi deployments, including those with Uber and Lyft, amid restructuring efforts.

Despite these setbacks, Hyundai plans to invest nearly $1 billion to support Motional, aiming to keep the company viable as it works toward launching a robotaxi service using driverless Hyundai Ioniq 5 vehicles. Motional continues to test its vehicles in multiple cities, including Boston, Las Vegas, and Los Angeles.

These examples illustrate that, while fully autonomous vehicles are being tested and deployed in certain controlled environments and some fully autonomous driverless services are available, completely autonomous vehicles are not yet widely available for everyday public purchase. Regulatory, technological, and safety challenges still need to be addressed before such vehicles can be bought by mainstream consumers and run fully independently.

What Does This Mean For Your Business? 

The debate over whether automated driving is safe has significant implications for various stakeholders. For manufacturers, the mounting incidents could suggest the need for rigorous testing and transparent communication about the capabilities and limitations of automated systems. While the technology promises convenience and supposedly greater safety on the roads, it may now be worth looking more closely at the apparent shortcomings identified in real-world use, such as poor obstacle detection in low visibility and questions over system reliability. This may involve not just refining the technology but also setting realistic expectations for consumers to prevent misuse and over-reliance.

For businesses relying on these technologies, such as logistics and ride-sharing companies, understanding that no current vehicle is fully autonomous (Level 4 or 5) is vital. The systems available (primarily Level 2 and some Level 3) require constant driver supervision. Educating drivers about their responsibilities and ensuring adherence to safety protocols could therefore mitigate risks. Anyone using these vehicles must always remain vigilant, keeping their hands on the wheel and their attention on the road at all times, as highlighted by the frequent accidents due to driver inattention. 

In the UK, fully autonomous vehicles are not yet permitted on the roads without a special testing arrangement. The Automated Vehicles Act 2024, for example, creates a framework for the deployment and insurance of automated vehicles, which could influence future regulations and safety standards.

For the courts, the increasing number of incidents involving Tesla’s Autopilot is prompting deeper scrutiny. Legal cases are examining whether the marketing of these systems leads to driver overconfidence and misuse. The outcomes of these cases could set precedents affecting how manufacturers communicate the capabilities of their automated systems and the degree of responsibility they bear. 

While automated driving systems have advanced significantly, with manufacturers claiming enhanced safety compared to human drivers, the technology is apparently not without flaws. The current systems require human supervision, and accidents often appear to involve a combination of factors such as driver attention, potential system faults, and road conditions. As such, while automated driving can offer safety benefits, it is not yet foolproof, and users must remain actively engaged.

Automated driving, therefore, presents both opportunities and challenges. The technology is advancing, but it demands responsible use, continued innovation, and comprehensive regulatory frameworks to ensure it truly enhances road safety. The growing body of evidence from incidents and legal actions suggests a cautious and informed approach is necessary to navigate the path towards fully autonomous driving.


For more information on our services, give us a call on 01603 859669 or send us an enquiry.
