Can autonomous vehicles be trained to be safer than human-controlled cars?
The autonomous vehicle is the next big thing for consumers in 2019. A handful of companies are already developing self-driving technology, which has led many to believe that autonomous cars will begin hitting retail next year. But what type of autonomous car will reach the market in 2022? There are three types: fully autonomous (driverless) cars, partially autonomous cars, and semi-autonomous cars. Each type has its own technical features. For example, semi-autonomous cars equipped with a camera and a laser scanner can detect their surroundings more accurately, allowing the vehicle to move around with little human monitoring. Other advanced features, such as automated parking and automatic lane changes, will also make it possible for people to ride in an autonomous cab without having to sit behind the wheel and watch the road.
As mentioned above, fully autonomous (driverless) cars can be configured to be safer than human-controlled cars. One such system is collision avoidance: the software can train itself to avoid collisions using data recorded while human drivers were at the wheel. It can also monitor weather conditions and account for how the weather might affect the vehicle. Since no human driver is involved, the possibility of accidents in such situations is reduced, resulting in lower risk.
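The collision-avoidance idea above can be sketched in a few lines. This is a toy illustration, not any manufacturer's actual system: the function names, the 2-second baseline margin, and the friction scaling are all invented assumptions.

```python
# Hypothetical collision-avoidance check. All names and thresholds
# here are illustrative assumptions, not a real vendor's logic.

def time_to_collision(gap_m, closing_speed_mps):
    """Seconds until impact if nothing changes; infinity if the gap is opening."""
    if closing_speed_mps <= 0:
        return float("inf")
    return gap_m / closing_speed_mps

def should_brake(gap_m, closing_speed_mps, friction=1.0):
    """Brake when time-to-collision drops below a weather-adjusted margin.

    `friction` scales from 1.0 (dry road) down toward 0.4 (ice); lower
    friction lengthens stopping distance, so the car brakes earlier.
    """
    safety_margin_s = 2.0 / friction  # assumed 2 s baseline on dry roads
    return time_to_collision(gap_m, closing_speed_mps) < safety_margin_s

# Dry road, 40 m gap, closing at 10 m/s: TTC is 4 s, margin is 2 s.
should_brake(40.0, 10.0)                # False - no braking needed yet
# Same gap on a slick road widens the margin to 5 s, so brake now.
should_brake(40.0, 10.0, friction=0.4)  # True
```

The point of the weather term is the one the paragraph makes: the same geometry that is safe in the dry can demand braking in the rain.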
While autonomous vehicles are being developed by various leading car companies, they do not yet have all the functionality of human-controlled ones, partly because of size and battery consumption. Many manufacturers want their products to be affordable, but that does not mean the cost of maintaining them will drop dramatically. Some automakers, however, think differently: they are offering high-end consumer-level solutions in which the vehicle can be operated by passengers with minimal supervision. Another reason the average person does not prefer fully autonomous cars is privacy concerns. For such vehicles to become a reality, we need infrastructure that allows everyone to use the same data to control the vehicles.
As far as public safety and regulation are concerned, autonomous cars should follow established guidelines. Governments should set standards to protect passengers' lives in such cars. If an accident happens, the government is responsible for obtaining compensation from the manufacturer or owner of the vehicle. Automated machines and robots can also help us learn faster and better how to drive our own cars; in other words, they can give us useful hints on how to deal with unforeseen situations. This way, we will be in a much safer position and can make informed decisions about road transportation.
The NTSB also noted that individual states are creating their own regulations in the absence of federal ones. Arizona, for example, has minimal restrictions, which the NTSB said were at least partially responsible for a pedestrian fatality caused by an Uber ADS in 2018. Florida statute 316.85 specifically allows the operation of autonomous vehicles and explicitly states that a driver does not need to pay attention to the road in a self-driving vehicle (e.g., the driver can watch movies). It also explicitly permits autonomous vehicle operation without a driver even present in the vehicle. And there are no requirements for manufacturers to pass safety tests beyond those that were in place before self-driving capabilities existed.
AVs: safety testing the components
There are three very different types of safety testing needed for autonomous vehicles. The first is to ensure that all the components that feed information into AV decision-making are working correctly. In Taiwan two years ago, a Tesla Model 3 in Autopilot mode crashed into an overturned tractor-trailer while going 70 mph. The crash was reportedly caused by a software failure in the car's forward-facing sensor array, which prevented the automatic braking from working properly. Sensor testing should be adequate to prevent this type of failure.
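One common defense against a single failed sensor is cross-checking independent inputs before trusting them. The sketch below is an invented illustration of that idea, not Tesla's or anyone's actual design; the function names and the 5-meter tolerance are assumptions.

```python
# Illustrative sensor cross-check (invented for this article): only
# trust automatic braking when two independent range estimates agree.

def sensors_agree(camera_m, radar_m, tolerance_m=5.0):
    """True only when both sensors report a target and roughly agree."""
    if camera_m is None or radar_m is None:
        return False  # a silent sensor is itself a failure signal
    return abs(camera_m - radar_m) <= tolerance_m

def braking_mode(camera_m, radar_m):
    """Fall back to a conservative mode on sensor disagreement."""
    if sensors_agree(camera_m, radar_m):
        return "auto-brake enabled"
    return "degraded: alert driver, reduce speed"

braking_mode(30.0, 32.0)  # "auto-brake enabled"
braking_mode(30.0, 80.0)  # "degraded: alert driver, reduce speed"
```

Had a check like this flagged the failed forward-facing sensor, the system could have alerted the driver instead of silently losing its automatic braking.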
Edge cases: will self-driving vehicles be safer?
The biggest issue for the automotive industry revolves around handling unexpected situations that arise from edge cases. Two new automotive safety standards, ISO 26262 and UL 4600, attempt to address these edge cases. However, these standards are not prescriptive, and regulatory agencies are not requiring compliance with these or any other standard for autonomous vehicles. Worse, as I'll explain below, there are good reasons to believe that some types of autonomous vehicles may not be capable of handling these edge cases.
Teslas: driver assistance, not autonomy
Many of the consumer vehicles on the road today, such as Teslas, have driver-assistance capabilities. They can keep the vehicle centered in its lane, and they can accelerate and brake automatically. However, it is not safe for the driver to read a book or watch a movie: these are Level 2 vehicles, not ADSs, and the driver must constantly monitor the road and be ready to take over control instantly. For example, I was once driving a Tesla on the highway in Autopilot mode when it hit a big bump. The Tesla swerved and went ding-ding-ding, which means "hey, you are on your own," and I had to react quickly to steer it back into my lane.
Level 4 vehicles: a specific ODD?
Level 4 vehicles are restricted to a specific ODD. This usually dramatically reduces the number of edge cases compared to Level 5 vehicles, which have no ODD. For example, we are already seeing Level 4 point-to-point shuttles on corporate campuses that drive very slowly. These vehicles are unlikely to encounter many edge cases because there are not many unexpected things that can happen on a single road between two locations. And if something were to happen, the shuttles travel so slowly that there is little risk for passengers or pedestrians.
The difference between Levels 4 and 5 is that Level 4 vehicles are restricted to an Operational Design Domain (ODD), which usually includes restricted geography (e.g., a limited set of streets in a city) and may include other restrictions based on weather, time of day, precipitation, road grade and curvature, and other factors. Level 5 vehicles can drive anywhere with no restrictions and are theoretically effective replacements for consumer vehicles and commercial trucks.
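An ODD of the kind described above amounts to a gate the vehicle checks before (and while) operating autonomously. The toy sketch below invents a tiny ODD with a few of the restriction types named in the paragraph; real ODDs are far richer, and every field and limit here is an assumption for illustration.

```python
# Toy ODD gate for a hypothetical Level 4 vehicle. The streets,
# speed limit, and weather list are invented for this example.

ODD = {
    "allowed_streets": {"Main St", "1st Ave", "Campus Loop"},
    "max_speed_mph": 25,
    "daylight_only": True,
    "allowed_weather": {"clear", "cloudy"},
}

def within_odd(street, speed_mph, is_daylight, weather, odd=ODD):
    """True only if every ODD restriction is satisfied at once."""
    return (street in odd["allowed_streets"]
            and speed_mph <= odd["max_speed_mph"]
            and (is_daylight or not odd["daylight_only"])
            and weather in odd["allowed_weather"])

within_odd("Main St", 20, True, "clear")      # True - inside the ODD
within_odd("Highway 101", 20, True, "clear")  # False - off the map
within_odd("Main St", 20, False, "clear")     # False - after dark
```

When any check fails, a Level 4 ADS must hand control back or reach a safe stop; a Level 5 vehicle, by definition, has no such gate, which is exactly why it faces so many more edge cases.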
Level 4 taxis: a larger but still bounded domain
Level 4 self-driving taxis, for which the ODD is limited to specific city streets, will encounter more edge cases than corporate shuttles, but probably nowhere near the number of edge cases consumer vehicles might encounter. By limiting the driving domain to specific streets, it is possible to maintain detailed maps (e.g., of traffic lights and construction zones). In contrast, a Level 5 vehicle must be able to drive on every street in the world, or at least in the consumer's country. The Washington Post counted over one million roads in the US alone.
High-tech innovations improve safety
Innovations in the jungle of high-tech also improve the safety of autonomous vehicles. For instance, the use of lidar and assistive driving technologies improves the capability of self-driving systems. New assessment programs by agencies like the National Highway Traffic Safety Administration also address autonomous vehicle safety to make these new vehicles safer on the road. By distinguishing between passenger vehicles and freight vehicles, regulators can focus on creating top-tier safety solutions for the self-driving vehicles that will have people inside them.
Perception problems beyond hacking
The differences in how self-driving cars perceive the world lead to concerns far beyond hackers. For example, in real-world driving, many Tesla owners have reported that shadows, such as those of tree branches, are often treated by their cars as real objects. In the case of the Uber test car that killed a pedestrian, the car's object-recognition software first classified the pedestrian as an unknown object, then as a vehicle, and finally as a bicycle.
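The Uber incident above illustrates a flickering-classification problem: the label kept changing frame to frame. One mitigation, sketched below with invented names (this is not Uber's or Tesla's actual pipeline), is to react to "something is there" immediately while waiting for the label to stabilize before making label-specific decisions.

```python
# Sketch of handling flickering object labels. Invented for this
# article; not any real perception stack's design.
from collections import Counter

def stable_label(history, min_agreement=3):
    """Return a label only once it has persisted across enough frames."""
    label, count = Counter(history).most_common(1)[0]
    return label if count >= min_agreement else None

def should_react(history):
    """Brake or slow for ANY detected object, known or not - do not
    wait for a confident label the way a label-first pipeline might."""
    return len(history) > 0

frames = ["unknown", "vehicle", "bicycle", "bicycle", "bicycle"]
stable_label(frames)  # "bicycle" - 3 of 5 frames agree
should_react(frames)  # True from the very first detection
```

The design point is that safety-critical reactions should not be gated on classification: an "unknown object" in the vehicle's path is reason enough to slow down.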
Thanks for reading!