What is needed to achieve "Vision Zero" for the car

About the author: Chris Jacobs is Vice President of the Autonomous Transportation and Vehicle Safety business unit at Analog Devices. He joined ADI in 1995 and has held various positions in design engineering, design management, and business management across the company's consumer, communications, industrial, and automotive businesses. Previously, Jacobs was General Manager of Automotive Safety, Director of Precision Converter Products and Technology, and Product Line Director for High Speed Converters and Isolation Products.

Traditional driving may soon be considered archaic. A disruptive shift is under way from human-operated vehicles to autonomous ones, one that requires a global ecosystem to stimulate development and will drive a monumental structural transformation of a large share of the global economy. Safety, however, remains a major hurdle this ecosystem must clear before driverless mobility becomes a reality. Every day, more than 3,000 people die in traffic accidents around the world. Removing humans from the equation is one way to solve this problem. As a result, technology providers, leading suppliers, original equipment manufacturers (OEMs), and automakers are embracing new business models and placing big bets on accelerating the maturation of key autonomous driving technologies. The goal is Vision Zero, the elimination of loss of life caused by vehicles, and it is in serving that goal that autonomous deployments can reach their full potential.

Core sensor technologies help vehicles reach higher levels of autonomy

Vehicle intelligence is commonly expressed in levels of autonomy. Levels 1 and 2 are largely warning systems, while from Level 3 onward the vehicle can act to avoid accidents. At Level 5, the steering wheel is removed and the car drives itself. In the first generations of systems, when vehicles began to gain Level 2 functionality, the sensor systems operated independently of one another. To reach the fully autonomous cognitive vehicle, the number of sensors must increase greatly, and their performance and response times must improve significantly. Vehicles equipped with external sensors are more aware of their surroundings and therefore safer. Critical technologies in the AI systems capable of navigating an autonomous vehicle include cameras, LiDAR, RADAR, inertial microelectromechanical systems (MEMS), ultrasound, and GPS. In addition to supporting an autonomous vehicle's perception and navigation systems, these sensors allow better monitoring of mechanical conditions (tire pressure, weight changes, and so on), as well as other maintenance factors that can affect functions such as braking and handling.

Although such sensors and sensor fusion algorithms can contribute to the realization of Vision Zero, several factors must be taken into account, the first of which is object classification. Current systems cannot achieve the resolution required for object classification, but RADAR, thanks to its micro-Doppler capabilities, performs better in this area. Already a mainstream feature, RADAR will become even more common as the AEB (automatic emergency braking) mandate becomes a reality in the early 2020s. LiDAR, for its part, is not a standard feature in today's cars because its cost and performance do not yet justify wider adoption. However, LiDAR offers roughly ten times the image resolution of RADAR, which is needed to discern more nuanced scenes. The key enabling technologies here are solid-state beam steering and the photodetector: a high-quality, high-sensitivity detector with low dark current and low capacitance, at an acceptable cost, is what will open up the LiDAR market at 1500 nm and lead to its wider adoption.
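To make the detector requirement concrete, LiDAR measures range by timing a reflected light pulse. The sketch below shows the basic time-of-flight arithmetic; it is generic physics for illustration, not an ADI design, and the 667 ns sample value is hypothetical:

```python
# Basic time-of-flight range arithmetic (generic physics, not an ADI design).
# A LiDAR pulse travels to the target and back, so range = c * t / 2. Detector
# sensitivity, dark current, and capacitance determine how weak and fast a
# return can still be timed reliably, which is why the photodetector gates
# the 1500 nm LiDAR market.

C_MPS = 299_792_458.0  # speed of light, m/s

def lidar_range_m(round_trip_s: float) -> float:
    """Convert a measured pulse round-trip time into target range."""
    return C_MPS * round_trip_s / 2.0

# A return arriving ~667 ns after the pulse corresponds to roughly 100 m.
print(f"{lidar_range_m(667e-9):.1f} m")
```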

Camera systems, now common in new vehicles, are the cornerstone of Level 2 autonomy. However, these systems do not perform well in all use cases (i.e., at night and in bad weather). Ultimately, all of these sensing technologies are needed to provide the most comprehensive data set to the systems designed to ensure the safety of vehicle occupants.

Although often overlooked, IMUs measure gravity, which is constant regardless of environmental conditions. As such, they are very useful for dead reckoning. In the temporary absence of a GPS signal, dead reckoning uses data from sources such as the speedometer and the IMUs to determine the distance traveled and the direction of travel, and overlays that estimate on high-definition maps (illustrated in the first sketch below). This keeps the cognitive vehicle on the correct trajectory until the GPS signal can be recovered.

Sensor fusion can compensate for the weaknesses of the individual perception sensors. It requires an intelligent balance between edge processing and central processing of the data passed to the fusion engine. Cameras and LiDAR sensors offer excellent lateral resolution, but even the best machine learning algorithms need approximately 300 ms to perform lateral motion detection with a low false-alarm rate. In current systems, approximately 10 successive frames are required for reliable detection with a sufficiently low false-alarm rate. This must be reduced to one or two successive frames to give the vehicle more time to take preventive action (the second sketch below shows the latency arithmetic). New technologies that enable advanced perception at high speed must be developed to allow fully autonomous driving in both urban and highway conditions, and the more we work on the problem, the more complex use cases we will identify that must be covered. Furthermore, inertial navigation will remain an essential aspect of autonomous vehicles, as these systems are insensitive to environmental conditions and complement the perception sensors, which can be compromised in certain situations.
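As a rough illustration of the dead reckoning described above, the following sketch propagates a 2D position from wheel-speed and gyro samples while GPS is unavailable. It is a minimal, hypothetical example; the function name, sample rate, and values are ours, not from the article or any ADI product:

```python
import math

# Minimal dead-reckoning sketch (illustrative only, not ADI's implementation):
# propagate a 2D position from wheel-speed and IMU yaw-rate samples while GPS
# is unavailable.

def dead_reckon(x_m, y_m, heading_deg, samples):
    """Advance (x_m, y_m) through (speed_mps, yaw_rate_dps, dt_s) samples."""
    for speed_mps, yaw_rate_dps, dt_s in samples:
        heading_deg += yaw_rate_dps * dt_s               # integrate gyro yaw rate
        heading_rad = math.radians(heading_deg)
        x_m += speed_mps * dt_s * math.cos(heading_rad)  # advance along heading
        y_m += speed_mps * dt_s * math.sin(heading_rad)
    return x_m, y_m, heading_deg

# Example: 2 s at 20 m/s through a gentle 2 deg/s curve, sampled every 100 ms.
# The resulting estimate would then be matched against a high-definition map.
samples = [(20.0, 2.0, 0.1)] * 20
print(dead_reckon(0.0, 0.0, 0.0, samples))
```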
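The second sketch makes the frame-count argument concrete. Assuming a hypothetical 20 Hz sensor frame rate (the article does not specify one), confirming a detection over 10 frames costs 500 ms, while one or two frames cut that to 50-100 ms, time the vehicle can use to brake or steer:

```python
# Illustrative arithmetic only; the 20 Hz frame rate is an assumption, not a
# figure from the article. Confirming a detection over N successive frames
# trades false alarms against reaction time.

FRAME_RATE_HZ = 20.0

for frames_needed in (10, 2, 1):      # today's ~10 frames vs. the 1-2 target
    latency_ms = frames_needed / FRAME_RATE_HZ * 1000.0
    print(f"{frames_needed:>2} frames -> {latency_ms:5.0f} ms to confirm")
```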

The role of ADAS and full autonomy

Another important, non-technical factor in achieving the Vision Zero goal is striking a balance between what technology can do and what legislation will allow. Today, industry leaders are pursuing two tracks: advanced driver assistance systems (ADAS) and fully autonomous vehicles. Although the auto industry considers ADAS a safer bet than full autonomy, ADAS technology is still not perfect. Automakers and Tier 1 suppliers are currently focusing on Level 2 and Level 3 autonomy because they see good business opportunities there. Legislation for highly autonomous vehicles has yet to be defined, and other areas, such as insurance and regulation, need further development to establish a proper framework. Robo-taxis, for example, are about to debut in several cities in the United States; these vehicles will likely be deployed alongside the much larger base of existing Level 2 and 3 applications. Much remains to be done to improve the performance of specific sensing technologies such as RADAR and LiDAR, as well as the algorithms that act on their data across driving conditions. As we get to 2020 and beyond, when AEB becomes a standard feature in cars, we officially begin the move to Level 3 autonomy. Even so, more improvement is needed to get from where automakers are today to where they need to be.

OEMs are embracing this two-track dynamic. With robo-taxis, for example, they recognize that the economics of that business are entirely different from those of the traditional automobile market, since it revolves around vehicle-sharing services. Another dynamic of this market is that it lets OEMs prove out advanced technologies in these vehicles as they develop the hardware, software, and sensor fusion framework. And while OEMs have more confidence in ADAS, in a growing number of cases separate companies have been created to pursue the higher levels of autonomy. Some OEMs, however, lack the research and development capital to take this course and instead partner with companies that specialize in autonomous driving technologies.

In the middle of this two-track system sits Level 3+ autonomy. Although not fully autonomous, Level 3+ is more advanced than existing ADAS systems, combining advanced performance with practical functionality. Much better sensors are needed to support Level 3+ applications such as high-speed autopilot and AEB+, in which the vehicle not only brakes but also steers to avoid an accident. Level 3+ draws on highly autonomous technologies, including a critical sensor framework that lays the foundation for future fully autonomous vehicles. Although it is not full autonomy, Level 3+ automation brings us closer to the Vision Zero goal, merging the functionality and performance developments of the two tracks into a safe transportation ecosystem. This is the tipping point at which autonomous technology becomes far more capable and available to the public.

Journey to Vision Zero

However differently industry leaders approach Vision Zero, a variety of high-performance navigation and perception sensors will help us achieve it, and the high-quality data these sensors generate helps ensure that the decision-making software makes the right decision every time. The journeys toward Vision Zero and full autonomy follow the same path. Every player in the ecosystem should keep this in mind in the coming years, because the goal of autonomous vehicle development is not only to usher in a new era of technology and business models, but to save lives.

Chris Jacobs, Vice President of Autonomous Transportation and Automotive Safety at Analog Devices