
Microcontroller Tips

Microcontroller engineering resources, new microcontroller products and electronics engineering news


ADAS developers contemplate sensor fusion

August 8, 2017 By Lee Teschler

Semiconductor makers now field sensor equipment complete with built-in safety systems that specifically target autonomous driving applications.

LANCE WILLIAMS | ON Semiconductor

Increasingly sophisticated advanced driver assistance systems (ADAS) are helping cars move up through defined levels of autonomy toward the day when fully autonomous driving is a reality. ADAS is an automotive megatrend that will enable the biggest change in how the world’s population gets from A to B since the inception of commercial flight.

In the automotive sector, almost all the discussion is about the fully autonomous or driverless vehicle. Safety is the key driver in the march to make vehicles more autonomous. WHO 2017 figures show 1.25 million people die in road traffic accidents each year and a further 20 to 50 million people are injured or disabled. The cause of these road traffic accidents is overwhelmingly human error. The U.S. Dept. of Transportation has found that 94% of the two million accidents happening annually were caused by driver mistakes.

ADAS view
Here’s how an ADAS might size up a view through a typical windshield. Cameras on the vehicle would feed the view to image processors that would recognize vehicles, lane divider lines, and road signs. Radar sensors would gauge factors such as the relative speeds and trajectories of the recognized vehicles. The ADAS processor would fuse this information with other data such as the distance to a destination to make decisions about how to guide the vehicle.

Research began in the 1980s and today there are about 50 companies actively developing autonomous vehicles, as well as numerous university projects pushing boundaries. Autonomous emergency braking, lane departure warning systems, active cruise control, as well as blind spot monitoring are common features on vehicles produced today, and automated parking is proliferating at pace. These systems provide valuable input to the driver, who ultimately remains in control, for now.

Autonomous driving levels
Progression through the levels of ADAS has brought us to a tipping point for the industry.

As ADAS evolves, the industry is reaching a tipping point where the vehicle itself provides integrated monitoring. Companies such as ON Semiconductor are addressing the trend through hardware and software innovation and through support for related ADAS development work.

Sensor fusion is the key to passing this tipping point. Diverse systems in the vehicle are becoming linked, boosting the ability to make more complex, safety-critical decisions and providing redundancy that will help prevent errors that could lead to accidents.

SENSOR FUSION

Vision is an increasingly important facet of vehicle technology. Vision sensors now support active safety features that include everything from rear-view cameras to forward-looking and in-cabin ADAS. Engineers combine and process the data from multiple types of sensors to ensure correct decisions, responses and adjustments. This approach may include combinations of image sensors, radar, ultrasound and lidar.
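To make the payoff of combining sensors concrete, here is a minimal Python sketch that fuses two independent distance estimates of the same object, say one from radar and one from a camera, by inverse-variance weighting. This is a simplified form of the update step used in Kalman-filter-based fusion; the distances and variances below are invented purely for illustration.

```python
# Toy illustration of sensor fusion by inverse-variance weighting.
# All figures are made up for illustration, not from any real sensor.

def fuse(est_a, var_a, est_b, var_b):
    """Combine two independent noisy estimates of the same quantity.

    The fused variance is always smaller than either input variance,
    which is the statistical payoff of fusing diverse sensors.
    """
    w_a = 1.0 / var_a  # weight each estimate by its inverse variance
    w_b = 1.0 / var_b
    fused_est = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused_est, fused_var

# Radar: 50.0 m with variance 0.25 m^2; camera: 52.0 m with variance 1.0 m^2.
dist, var = fuse(50.0, 0.25, 52.0, 1.0)
print(f"fused distance = {dist:.2f} m, variance = {var:.2f} m^2")
# -> fused distance = 50.40 m, variance = 0.20 m^2
```

Note that the fused variance (0.20 m²) is below even the radar's 0.25 m², which is why a camera reading, though less precise, still improves the estimate rather than degrading it.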

There is a safety standard called ISO 26262 that applies to ADAS. ISO 26262 covers the functional safety of vehicle electronic systems. It is the automotive-specific adaptation of the broader functional safety standard IEC 61508. Functional safety focuses primarily on risks arising from random hardware faults as well as from systematic faults introduced during system design, hardware and software development, production, and so forth.

Under this standard, manufacturers identify hazards for each system, then determine a safety goal for each of them. Each safety goal is then classified according to one of four possible safety classes, called Automotive Safety Integrity Levels (ASIL), ranging from ASIL-A (lowest) to ASIL-D (highest). An ASIL is determined by three factors: the severity of a potential failure, the probability of exposure to the driving situation in which it could occur, and the degree to which the driver can control the outcome.
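The three ratings feed a classification table in the standard (severity S1 to S3, exposure E1 to E4, controllability C1 to C3). A frequently quoted simplification of that table is that the sum S + E + C maps directly onto QM/A/B/C/D; the sketch below uses that shortcut for illustration, and the standard itself remains the authoritative reference.

```python
# Sketch of ASIL classification from the three ISO 26262 ratings:
# severity S1-S3, exposure E1-E4, controllability C1-C3.
# Uses the common simplification that the standard's classification
# table is equivalent to mapping the sum S+E+C onto QM/A/B/C/D.
# Consult ISO 26262-3 itself for authoritative classification.

def asil(severity: int, exposure: int, controllability: int) -> str:
    assert 1 <= severity <= 3 and 1 <= exposure <= 4 and 1 <= controllability <= 3
    total = severity + exposure + controllability
    # S3 + E4 + C3 = 10 is the worst case and yields ASIL-D;
    # each one-step reduction in any factor drops one level.
    mapping = {10: "ASIL-D", 9: "ASIL-C", 8: "ASIL-B", 7: "ASIL-A"}
    return mapping.get(total, "QM")  # sums of 6 or below: quality-managed only

print(asil(3, 4, 3))  # worst case -> ASIL-D
print(asil(3, 2, 2))  # severe but rarely encountered and controllable -> ASIL-A
```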

ADAS latency levels
Latency is a critical parameter in fault detection processes.

Functional safety starts at the sensor. Issues such as latency and high-speed fault detection receive close attention from automotive OEMs, Tier-One suppliers and sensor makers alike. There can be catastrophic implications if faults in image sensors used for ADAS go undetected, especially for systems such as adaptive cruise control, collision avoidance and pedestrian detection.

The process of detecting one of the thousands of potential failure modes is processor-intensive, requiring an algorithm for each fault. In fact, some faults are impossible to detect at the system level. Latency within fault-tolerant systems is a primary concern for all system designers – simply put, this is the time between when a fault occurs and when the system returns to a safe state. For safety, the fault must be detected and addressed before it leads to a dangerous event.
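This timing budget can be expressed very simply: the fault detection latency plus the time to reach a safe state must fit inside the fault-tolerant time interval (FTTI) allotted to the hazard. The sketch below captures just that check; every millisecond figure in it is illustrative, not drawn from any real system.

```python
# Toy timing-budget check for the latency concern described above:
# a fault must be detected and the system brought to a safe state
# within the fault-tolerant time interval (FTTI). All millisecond
# figures below are illustrative, not taken from any real system.

def fault_handled_in_time(detection_ms: float, reaction_ms: float,
                          ftti_ms: float) -> bool:
    """True when detection plus reaction latency fits inside the FTTI."""
    return detection_ms + reaction_ms <= ftti_ms

# Hypothetical pedestrian-detection camera: a stuck-pixel fault is
# flagged within 33 ms (one frame) and the system degrades to a safe
# state in 50 ms, against a 100 ms FTTI budget.
print(fault_handled_in_time(33.0, 50.0, 100.0))   # True: budget met
print(fault_handled_in_time(120.0, 50.0, 100.0))  # False: fault outlives the budget
```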

A typical camera system designed for prototyping aspects of ADAS vision systems might take the form of this Modular Automotive Reference System (MARS) from ON Semiconductor. Most camera systems targeting use in vehicles have a configuration resembling that of MARS: Here a camera module contains the camera and lens, and a serializer for converting camera data to either LVDS (low-voltage differential signaling) signals or Automotive Ethernet. In applications where humans will view the image, the camera module might also contain an image coprocessor for operations such as correcting lens distortions, tone mapping, and other image pipeline functions. Camera data gets sent back to a controller via serial lines because the camera module might lie 10 m or more away from the controller, making bus connections impractical. At the controller, data from multiple cameras get aggregated and deserialized.
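To make the serializer's job more tangible, here is a toy Python sketch of the kind of framing a link might apply before sending camera data back to the controller. The field layout (camera id, frame counter, payload length, CRC) is entirely hypothetical and invented for illustration; real LVDS and Automotive Ethernet links define their own framing and error detection.

```python
# Hypothetical link framing for camera data on a serial connection.
# The header layout here is invented for illustration only.
import struct
import zlib

HEADER = struct.Struct(">BHI")  # camera id, frame counter, payload length

def serialize(camera_id: int, frame_no: int, payload: bytes) -> bytes:
    """Wrap one camera frame with a header and a CRC-32 trailer."""
    header = HEADER.pack(camera_id, frame_no, len(payload))
    crc = zlib.crc32(header + payload)
    return header + payload + struct.pack(">I", crc)

def deserialize(packet: bytes):
    """Recover (camera_id, frame_no, payload), rejecting corrupted frames."""
    header, rest = packet[:HEADER.size], packet[HEADER.size:]
    camera_id, frame_no, length = HEADER.unpack(header)
    payload, (crc,) = rest[:length], struct.unpack(">I", rest[length:])
    if zlib.crc32(header + payload) != crc:
        raise ValueError("CRC mismatch: corrupted frame")
    return camera_id, frame_no, payload

# The controller aggregates and deserializes frames from several cameras.
pkt = serialize(camera_id=2, frame_no=1001, payload=b"\x00" * 16)
assert deserialize(pkt) == (2, 1001, b"\x00" * 16)
```

The CRC trailer illustrates why error detection belongs in the link itself: with the camera module 10 m from the controller, bit errors on the cable must be caught before the data reaches the fusion processor.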

Vision sensors are becoming more advanced, and functional safety fault detection is moving from the ADAS to the sensor itself. Detection is built-in and faults are identified by design. The benefit of sensor-based detection is better fault detection as well as less pressure on the ADAS processing capacity.

Even today, many ADAS struggle to meet ASIL-B compliance. In the near term, the number of systems required to meet ASIL-B will rise dramatically, and future ADAS will need to meet ASIL-C and ASIL-D if widespread use is to become reality. ON Semiconductor is already active in this area, equipping many of its image sensors with sophisticated built-in safety mechanisms that support functional safety.

Driven by the stringent demands of automotive operating environments, image sensors for ADAS are adding features such as light flicker mitigation (LFM), which prevents scenes from being misinterpreted because of pulsed LED lighting (front or rear) in the sensor's field of view; superior infrared performance; and the ability to work in either extremely bright or low-light conditions.

All in all, advanced and functionally safe sensors will be at the heart of ADAS. The fusing of these sensors with other in-vehicle technology and ensured cyber security will move the industry past the tipping point where vehicles can become truly autonomous.

Filed Under: Applications, Automotive, Embedded, Featured Tagged With: onsemiconductor



Copyright © 2025 · WTWH Media LLC and its licensors. All rights reserved.
The material on this site may not be reproduced, distributed, transmitted, cached or otherwise used, except with the prior written permission of WTWH Media.
