MEDICAL MALPRACTICE AND PERSONAL INJURY LAW BLOG


Autonomous Vehicles Still Spark Fear

Like thousands of pedestrians in the United States each year, Elaine Herzberg was struck and killed by a car. In the year she was struck, 2018, the Governors Highway Safety Association reported an estimated 6,227 pedestrian deaths in motor vehicle crashes across the country. Her death, however, was unique and led to a massive outcry and accompanying policy changes: Ms. Herzberg was struck and killed by an Uber autonomous vehicle.

Her death prompted citizens across the country, especially those in the cities where Uber was “testing” its fleet of autonomous vehicles, to question the practice of allowing a corporation to turn a city into its testing grounds. That practice jeopardizes pedestrians, cyclists, and other drivers who never consented to take part in the autonomous vehicle experiment.

At the time of Elaine Herzberg’s death, autonomous vehicles were being tested in Tempe, Pittsburgh, San Francisco, and Toronto. Uber removed all of its self-driving vehicles from the roads in the aftermath of the pedestrian fatality. However, self-driving vehicles have since made their way back to city streets, though not with the full, unregulated access they once enjoyed. In Pittsburgh, for example, autonomous Uber vehicles are permitted to use the roadways, but an executive order from the mayor requires that two individuals be inside the vehicle at all times. Additionally, the vehicles are no longer allowed to ferry customers of the ride-sharing app, though Uber has announced that it will eventually reintroduce the self-driving cars to its Pittsburgh passenger fleet.

Last week, the National Transportation Safety Board (NTSB) released a report examining the Tempe crash. The report found that the software in the self-driving vehicles was not equipped to handle some situations it would likely encounter, such as a person jaywalking. Furthermore, the report found that the software was unable to distinguish types of objects it might encounter on or near the roadways.

Self-driving cars currently use a combination of radar, LiDAR (sensors that use laser light to map the surrounding environment), and high-definition cameras to map their surroundings. When a vehicle encounters a new object, images are rapidly processed by the vehicle’s artificial intelligence, which compares them against a repository of labeled reference images to determine how the vehicle should react to the object.
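
To illustrate the kind of gap the NTSB describes, here is a highly simplified sketch, in Python, of a classify-then-react loop. It is an illustration only; the labels, distance thresholds, and function names are hypothetical assumptions for this example and are not drawn from Uber’s actual software.

    # Illustrative sketch only: a toy version of the classify-then-react
    # loop described above. Labels, thresholds, and names are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Detection:
        label: str        # e.g. "pedestrian", "cyclist", "vehicle", "unknown"
        confidence: float # classifier confidence, 0.0 to 1.0
        distance_m: float # distance from the vehicle in meters

    def plan_reaction(detection: Detection) -> str:
        """Map a classified object to a driving response."""
        if detection.label in {"pedestrian", "cyclist"} and detection.distance_m < 30:
            return "emergency_brake"
        if detection.label == "vehicle" and detection.distance_m < 15:
            return "slow_down"
        # Objects the classifier cannot identify fall through to the least
        # cautious response.
        return "continue"

    # A nearby person the classifier fails to label gets no braking response.
    print(plan_reaction(Detection("unknown", 0.4, 12.0)))  # prints "continue"

In this toy example, anything the system cannot label falls through to the least cautious response, which mirrors the failure mode the report highlights for situations such as a jaywalking pedestrian.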

Currently, the vehicles do not appear able to distinguish between many different types of objects, which means they may not be programmed to react correctly. The NTSB investigation revealed that the vehicles cannot tell the difference between a cyclist and a pedestrian. Many advocacy groups have expressed concern over the findings of the NTSB report, pointing to the dangerous outcomes that could arise from the inability to distinguish objects.

About the Author

Charles Gilman

As managing partner and co-founder of Gilman & Bedigian, it is my mission to help our clients recover and get their lives back on track. I strongly believe that every person who is injured by a wrongful act deserves compensation, and I will do my utmost to bring recompense to those who need and deserve it.
