Data injection — Testing autonomous sensors through realistic simulation

August 14, 2019 By Lee Teschler

Proof that autonomous vehicles are safe to put on the road begins with rigorous models of the sensors that guide them.

Holger Krumm | dSPACE GmbH
Onboard sensors are critical for autonomous vehicles to navigate the open road, but the real environment is full of obstacles. Everyday interferences such as reflective surfaces, whiteout conditions, fog, rain, traffic congestion, and objects (pedestrians, parked vehicles, buildings, signs) clutter the environment and can result in sensor misreads or false targets.

Simulations of radar sensors produce data representing the presence of objects, their distance from the vehicle, and their relative speed and direction.

The sheer volume of testing necessary to evaluate every possible driving scenario makes it impractical to conduct real test drives on the road. The solution lies in creating virtual driving scenarios and sensor-realistic simulation, which can take place in the safety of the lab.

An efficient approach is to validate autonomous driving functions using realistic, off-the-shelf simulation models (such as dSPACE Automotive Simulation Models (ASM)) and a virtual testing platform. With a setup like this, entire test scenarios can be reproduced virtually, including the environment sensors (camera, radar, Lidar, etc.), the vehicle under test, traffic, roads, driving maneuvers, and the surrounding environment.

Sensor-realistic simulation is the most efficient way of verifying and validating the environment sensors onboard autonomous vehicles. The basic premise behind sensor-realistic simulation is that sensor models mimic real sensors by generating the same kinds of signals. The sensor models use a geometrical approach to calculate the distance, velocity, acceleration, and horizontal and vertical angles to the nearest point of every detected object. The software models generate the raw sensor data for the simulated environment (traffic objects, weather, lighting conditions, etc.), mimicking the feedback a real vehicle would receive.
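
To make the geometric approach concrete, here is a minimal Python sketch of the kind of calculation an ideal sensor model performs for a single target: range, radial (closing) speed, and horizontal and vertical angles computed from ego and target states. The State class and field names are illustrative placeholders, not part of any dSPACE API.

```python
# Minimal sketch of the geometric calculation an ideal sensor model performs:
# range, radial speed, azimuth and elevation to the reference point of a target.
import math
from dataclasses import dataclass

@dataclass
class State:
    x: float   # position, m
    y: float
    z: float
    vx: float  # velocity, m/s
    vy: float
    vz: float

def ideal_detection(ego: State, target: State):
    # Vector from the ego sensor origin to the target reference point
    dx, dy, dz = target.x - ego.x, target.y - ego.y, target.z - ego.z
    rng = math.sqrt(dx*dx + dy*dy + dz*dz)
    # Relative velocity projected onto the line of sight (radial / closing speed)
    rvx, rvy, rvz = target.vx - ego.vx, target.vy - ego.vy, target.vz - ego.vz
    radial_speed = (rvx*dx + rvy*dy + rvz*dz) / rng if rng > 0 else 0.0
    azimuth = math.atan2(dy, dx)                          # horizontal angle
    elevation = math.asin(dz / rng) if rng > 0 else 0.0   # vertical angle
    return {"range_m": rng, "radial_speed_mps": radial_speed,
            "azimuth_rad": azimuth, "elevation_rad": elevation}

# Example: a vehicle about 50 m ahead, closing at roughly 5 m/s
print(ideal_detection(State(0, 0, 0, 20, 0, 0), State(50, 2, 0, 15, 0, 0)))
```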

The sensor simulation process involves the following stages:

Sensing – The simulated sensors receive a signal representative of one or more objects. The sensors detect the virtual targets as they would real objects. The sensors begin to capture real-time information such as distance, angular position, range and velocity.

Perception – Through imaging or signal processing, the sensors recognize the presence of the object(s).

Data fusion – The validation process begins as the raw data collected from the various sensors feeds to the CPU of the electronic control unit (ECU). Here, the information is combined and processed (this is also known as sensor fusion) in real-time to create a target list (or point cloud) of objects, both static and moving.

Application – The object list is run through a perception algorithm where object classification, situation analysis, trajectory planning and decision-making activities take place. Based on the outcome, the ECU determines what actions the autonomous vehicle should take.

Actuation – The ECU sends output signals to the appropriate actuators to carry out the desired action.

Sensor simulation is often divided into five stages: sensing, perception, data fusion, application, and actuation.
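
The sketch below walks a toy scene through the five stages to show how data flows from simulated sensors to an actuator command. Every function, threshold and data shape is a simplified stand-in for illustration, not production perception code or any dSPACE interface.

```python
# A toy, end-to-end pass through sensing, perception, data fusion,
# application and actuation.
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str
    range_m: float
    azimuth_rad: float
    rel_speed_mps: float

def sensing(scene):
    # In this toy, the virtual "scene" already holds raw (range, azimuth, speed)
    # readings per sensor; a real model would compute them from scene geometry
    return dict(scene)

def perception(raw):
    # Turn raw per-sensor tuples into typed detections
    return [Detection(s, *r) for s, rs in raw.items() for r in rs]

def data_fusion(detections, t):
    # Merge per-sensor detections into one time-stamped target list
    return {"t": t, "targets": sorted(detections, key=lambda d: d.range_m)}

def application(target_list):
    # Trivial "situation analysis": brake if the nearest target is close and closing
    nearest = target_list["targets"][0]
    return "BRAKE" if nearest.range_m < 30 and nearest.rel_speed_mps < 0 else "CRUISE"

def actuation(decision):
    # The ECU translates the decision into an actuator command
    return {"brake": 0.6 if decision == "BRAKE" else 0.0}

scene = {"radar": [(25.0, 0.02, -6.0)], "camera": [(26.1, 0.03, -5.5)]}
print(actuation(application(data_fusion(perception(sensing(scene)), t=0.04))))
```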

For validation purposes, the sensor data gathered during the testing process must be recorded and stored in a time-correlated manner (i.e. time-stamped, tagged, synchronized) so it can be played back in the laboratory later.
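
As a rough illustration of time-correlated recording and playback, the following sketch stamps every sample against a common clock and replays the log with its original timing. The JSON-lines record layout (timestamp, sensor tag, payload) is an assumption chosen for clarity, not a dSPACE format.

```python
# Sketch: time-stamped, tagged recording and timestamp-ordered playback
import json, time

def record(f, t0, sensor_id, payload):
    # Stamp every sample against a common clock so all streams replay in sync
    entry = {"t": time.monotonic() - t0, "sensor": sensor_id, "data": payload}
    f.write(json.dumps(entry) + "\n")

def playback(path):
    # Replay entries in timestamp order, honoring the original inter-sample gaps
    with open(path) as f:
        entries = sorted((json.loads(line) for line in f), key=lambda e: e["t"])
    start = time.monotonic()
    for e in entries:
        time.sleep(max(0.0, e["t"] - (time.monotonic() - start)))
        yield e

with open("drive_log.jsonl", "w") as f:
    t0 = time.monotonic()
    record(f, t0, "radar_front", {"range_m": 42.0, "rel_speed_mps": -3.1})
    record(f, t0, "camera_front", {"lane_offset_m": 0.12})

for entry in playback("drive_log.jsonl"):
    print(entry["t"], entry["sensor"], entry["data"])
```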

It takes detailed and realistic models to address the high-complexity needs associated with autonomous sensor systems (decision algorithms, motion control algorithms). The more realistic the sensor model, the better the results. Depending on the level of complexity, sensor models can be grouped into three general types: ideal ground truth/probabilistic, phenomenological/physical, and real/over-the-air (OTA).

Ideal ground truth/probabilistic sensor models are technology independent. They are primarily used for object-list-based injection (i.e. 3D and 2D sensors used to detect traffic lights, traffic signs, road objects, lanes, barriers, pedestrians, etc.). These kinds of models help check whether an object is detectable within a set range.

In a sensor simulation experiment, ground truth/probabilistic sensor models provide ideal data (ground-truth information), which can optionally be superimposed with probabilities of events (probabilistic effects). For example, superimposition is used to simulate the typical measurement noise of radar. The simulation returns a list of classified objects (vehicles, pedestrians, cyclists, traffic signs, etc.) as well as their coordinates and motion data (distance, relative speed, relative acceleration, relative azimuth and elevation angle).
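
A minimal sketch of this superimposition might look like the following: ground-truth objects pass through a probabilistic layer that adds Gaussian measurement noise and occasional missed detections. The noise parameters and detection probability are illustrative values, not characteristics of any real radar.

```python
# Sketch: superimposing probabilistic effects on ideal ground-truth data
import random

def probabilistic_radar(ground_truth_objects, p_detect=0.95,
                        range_sigma_m=0.15, speed_sigma_mps=0.1, angle_sigma_rad=0.003):
    detections = []
    for obj in ground_truth_objects:
        if random.random() > p_detect:     # occasional missed detection
            continue
        detections.append({
            "class": obj["class"],
            "range_m": random.gauss(obj["range_m"], range_sigma_m),
            "rel_speed_mps": random.gauss(obj["rel_speed_mps"], speed_sigma_mps),
            "azimuth_rad": random.gauss(obj["azimuth_rad"], angle_sigma_rad),
        })
    return detections

truth = [{"class": "vehicle", "range_m": 48.2, "rel_speed_mps": -2.4, "azimuth_rad": 0.01},
         {"class": "pedestrian", "range_m": 12.7, "rel_speed_mps": 0.8, "azimuth_rad": -0.30}]
print(probabilistic_radar(truth))
```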

Ideal ground truth/probabilistic sensor models are typically validated via software-in-the-loop (SIL) simulations, which run faster than real time, and in hardware-in-the-loop (HIL) simulations, which take place in real time. They can also be deployed on cluster systems to conduct high volumes of tests.

Within the dSPACE toolchain, these sensor models are part of the ASM tool suite (e.g. ASM Ground Truth sensor models). They run on a CPU together with the vehicle, traffic and other relevant environment models. These models are easy to configure, and simulation always takes place synchronously.

Phenomenological/physical sensor models are physics-based. They are built on the measurement principles of the sensor (e.g. camera imaging, radar wave propagation) and play a role in simulating phenomena such as haze, glare effects or precipitation. They can generate raw data streams, 3-D point clouds, or target lists.
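
As a simple illustration of a phenomenological effect, the sketch below attenuates Lidar return intensity with range and precipitation, dropping returns that fall below a detection threshold. The exponential attenuation model and its coefficients are assumptions chosen for clarity, not physical constants from any sensor datasheet.

```python
# Sketch: rain-dependent attenuation applied to a simulated Lidar point cloud
import math

def attenuate_returns(point_cloud, rain_rate_mm_h=10.0, threshold=0.05):
    # A crude extinction coefficient that grows with rain rate (assumption)
    alpha = 0.002 + 0.0005 * rain_rate_mm_h          # 1/m
    visible = []
    for (x, y, z, intensity) in point_cloud:
        rng = math.sqrt(x*x + y*y + z*z)
        received = intensity * math.exp(-2.0 * alpha * rng)   # two-way path loss
        if received >= threshold:                             # below threshold: no return
            visible.append((x, y, z, received))
    return visible

cloud = [(30.0, 1.0, 0.2, 0.9), (120.0, -4.0, 0.5, 0.8), (8.0, 0.3, -0.1, 0.95)]
print(attenuate_returns(cloud))
```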

Because these models address physical effects, they can be quite complicated. Calculation typically takes place on a graphics processing unit (GPU). These models also are typically validated in a SIL or HIL test setup.

Within the dSPACE toolchain, phenomenological/physical sensor models are visualized in MotionDesk and calculated on a dSPACE Sensor Simulation PC, which incorporates a high-performance GPU card and can facilitate deterministic, real-time sensor simulations with a high degree of realism.

Real/over-the-air sensor models are also physics-based models. They are used in tests with real physical signals and real sensor ECUs to analyze real-world sensor behavior.

Sensor models ranging from ideal ground truth to real are used to validate autonomous sensor systems.

Validation can take place by stimulating the whole sensor over the air on a radar test system (such as dSPACE Automotive Radar Test Systems, DARTS). This approach is useful for object detection scenarios. Alternatively, validation can take place on a complete radar test bench when there is a need to integrate other vehicle components (e.g. front bumper, chassis).

Sensor-realistic simulations rely on a bus system like CAN, CAN FD, FlexRay, LIN or Ethernet that carries signals and vehicle network traffic. So any autonomous vehicle simulations must include simulations of bus behavior, ranging from simple communication tests and rest bus simulation to complex integration tests.
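
For a feel of what even the simplest restbus-style simulation involves, the sketch below broadcasts a periodic frame on a virtual CAN bus using the open-source python-can package. A real HIL setup would use the simulator's own bus tooling; the message ID and payload here are made up for illustration.

```python
# Sketch: a periodic "restbus" frame on python-can's virtual bus
import can

tx_bus = can.Bus(interface="virtual", channel="sim")
rx_bus = can.Bus(interface="virtual", channel="sim")

# Periodically broadcast a (made-up) vehicle-speed frame every 20 ms, standing
# in for the messages an absent ECU would normally put on the bus.
speed_frame = can.Message(arbitration_id=0x1A0, is_extended_id=False,
                          data=[0x00, 0x64, 0, 0, 0, 0, 0, 0])
task = tx_bus.send_periodic(speed_frame, period=0.02)

for _ in range(5):
    msg = rx_bus.recv(timeout=1.0)   # the device under test would consume these
    print(msg)

task.stop()
tx_bus.shutdown()
rx_bus.shutdown()
```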

Additionally, the sensor model must interface to the device-under-test to inject data for simulation testing. A high-performance FPGA can be used to feed raw sensor data, target lists and/or object lists into the sensor ECU in a synchronized manner. For example, the dSPACE Environment Sensor Interface (ESI) Unit was designed for this role. It receives raw sensor data, separates it according to the individual sensors, and then inserts the time-correlated data into a digital interface behind the respective sensor front end.
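
The sketch below illustrates, purely in software, the data-injection idea: a combined raw-data stream is separated per sensor and handed to each digital interface in time order. The real ESI Unit performs this on an FPGA behind the sensor front end; the frame format and interface callables used here are assumptions.

```python
# Sketch: split a combined raw-data stream per sensor and inject it in time order
def inject(combined_stream, interfaces):
    """combined_stream: iterable of (timestamp_s, sensor_id, raw_bytes) tuples.
    interfaces: dict mapping sensor_id -> callable that accepts raw_bytes."""
    # Order frames by timestamp so each interface receives time-correlated data
    for t, sensor_id, raw in sorted(combined_stream, key=lambda f: f[0]):
        interfaces[sensor_id](raw)   # push the frame into the matching interface

frames = [(0.020, "radar_front", b"\x01\x02"),
          (0.010, "camera_front", b"\xaa"),
          (0.030, "camera_front", b"\xbb")]
inject(frames, {"radar_front":  lambda b: print("radar  <-", b),
                "camera_front": lambda b: print("camera <-", b)})
```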

Other interfaces that support autonomous driving development include FMI, XIL-API, OpenDrive, OpenCRG, OpenScenario and the Open Sensor Interface. These interfaces give engineers the option of integrating valuable data from accident databases or traffic simulation tools for co-simulation activities.
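
As one example of what such an interface enables, the sketch below loads and simulates an FMI-packaged model with the open-source FMPy package. The FMU file name and output variable are hypothetical placeholders for whatever a partner tool would actually export.

```python
# Sketch: running an FMI-packaged model with FMPy for co-simulation work
from fmpy import dump, simulate_fmu

fmu = "traffic_scenario.fmu"     # hypothetical FMU exported by a traffic-simulation tool
dump(fmu)                        # print the model description and its variables

result = simulate_fmu(fmu,
                      start_time=0.0,
                      stop_time=10.0,
                      output=["lead_vehicle_distance"])   # hypothetical variable name
print(result["lead_vehicle_distance"][-1])
```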

With regard to cameras, the preferred practice today is to use over-the-air stimulation to feed raw image data directly into the camera’s image processing unit. As the camera sensor captures the image data stream, the animated scenery is displayed on a monitor and engineers can examine the sensor outputs to the nearest point of an object (distance, relative velocity, vertical and horizontal angles, etc.), as well as sensor timings (cycles, initial offset, output delay time).

To validate camera-based sensors, simulations must account for different lens types and distortions, such as fish-eye distortion, vignetting and chromatic aberration. Additionally, test scenarios must factor in the use of multiple image sensors, as well as sensor characteristics (monochromatic representation, Bayer pattern, HDR, pixel errors, image noise, etc.).
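
The sketch below applies two of those lens effects, a one-coefficient radial (fish-eye-like) distortion and vignetting, to a synthetic grayscale frame using numpy. The distortion coefficient and vignetting falloff are illustrative values, not calibration data for any real lens.

```python
# Sketch: radial lens distortion plus vignetting on a synthetic camera frame
import numpy as np

def apply_lens_effects(img, k1=-0.3, vignette_strength=0.5):
    h, w = img.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w].astype(np.float32)
    # Normalized coordinates centered on the optical axis
    x = (xx - w / 2) / (w / 2)
    y = (yy - h / 2) / (h / 2)
    r2 = x * x + y * y
    # One-coefficient radial distortion model: r' = r * (1 + k1 * r^2)
    xd = x * (1 + k1 * r2)
    yd = y * (1 + k1 * r2)
    # Map back to pixel indices and sample the source image (nearest neighbor)
    src_x = np.clip(((xd + 1) * w / 2).astype(int), 0, w - 1)
    src_y = np.clip(((yd + 1) * h / 2).astype(int), 0, h - 1)
    warped = img[src_y, src_x]
    # Vignetting: darken pixels with distance from the image center
    vignette = 1.0 - vignette_strength * np.clip(r2, 0, 1)
    return (warped * vignette).astype(img.dtype)

frame = np.full((480, 640), 200, dtype=np.uint8)   # stand-in for a rendered frame
print(apply_lens_effects(frame).shape)
```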

The next generation of sensor simulation products is capable of producing highly realistic visualization using technologies such as 3-D remodeling, physics-based rendering, ray tracing, and dynamic lighting. The detail of different terrains, environmental lighting (e.g. haze, shadows), lens flare, glare effects, dynamic materials (e.g. rain, snow, fog), and much more can have photo-realistic 3D qualities that help boost simulation accuracy.

Simulations of sensors such as Lidar and radar generally detect objects by computing the path of reflections for Lidar signals or how electromagnetic waves will propagate for radar. The procedure involves sending beams into a 3D scene and capturing their reflections, which allows for the integration of physical effects such as multipath propagation into the modeling. The result is a physically correct simulation of the propagation of radar waves or a near-infrared laser beam.

Values for parameters such as reflection points, angles, distance, range, Doppler speed, diffuse scattering, multipath propagation, and so forth are collected and processed to calculate the vehicle’s distance from an object and to describe the surrounding environment (i.e. in the form of a point cloud). Gathered data is used to generate a target list that includes information on distance and the intensity of the reflected light (for Lidar sensors) or the frequency of an echo signal (for radar). The resulting sensor-realistic simulations help validate the behavior of the sensor path.
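
To illustrate how beams become a target list, the toy sketch below sweeps rays across a 2-D scene of circular targets and records range, Doppler speed and a simple intensity value for the nearest reflection on each beam. The scene, reflectivity values and 1/r^2 intensity model are assumptions for illustration, not a physically validated radar or Lidar model.

```python
# Toy ray-casting sketch: beams swept over a 2-D scene produce a target list
import math

# Each target: center (x, y), radius (m), radial velocity (m/s), reflectivity
targets = [(40.0, 0.0, 1.0, -5.0, 0.9), (15.0, 6.0, 0.8, 2.0, 0.4)]

def ray_hit(angle, tx, ty, radius):
    # Distance along a ray from the origin at `angle` to a circle, or None if missed
    dx, dy = math.cos(angle), math.sin(angle)
    proj = tx * dx + ty * dy                       # closest approach along the ray
    closest2 = tx * tx + ty * ty - proj * proj     # squared distance from ray to center
    if proj <= 0 or closest2 > radius * radius:
        return None
    return proj - math.sqrt(radius * radius - closest2)

target_list = []
for i in range(-90, 91):                           # sweep beams over +/- 90 degrees
    angle = math.radians(i)
    hits = [(ray_hit(angle, tx, ty, r), v, refl) for tx, ty, r, v, refl in targets]
    hits = [h for h in hits if h[0] is not None]
    if not hits:
        continue
    rng, doppler, refl = min(hits)                 # nearest reflection wins
    intensity = refl / (rng * rng)                 # simple power-falloff stand-in
    target_list.append({"azimuth_deg": i, "range_m": round(rng, 2),
                        "doppler_mps": doppler, "intensity": intensity})

print(target_list[:3])
```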

Tools for sensor-realistic simulation

  • Models supporting virtual test drive scenarios, for example, driving maneuvers, roads, vehicles, traffic objects, sensors, etc. (i.e. dSPACE ASM)
  • Software to support the 3-D animation and visualization of simulated test drive scenarios (i.e. dSPACE MotionDesk)
  • An FPGA powerful enough to synchronously insert raw sensor data into sensor ECUs (i.e. dSPACE ESI Unit)
  • A processing platform with a high-performance GPU to calculate environment and sensor models and generate raw data and target lists (i.e. dSPACE Sensor Simulation PC)
  • A scalable PC-based platform that can support SIL test setups and high-volume cluster tests (i.e. dSPACE VEOS)
  • A real-time system for HIL test setups that offers integration options for various sensors (i.e. dSPACE SCALEXIO)
  • A radar test system or test bench that can perform over-the-air simulation of radar echoes (i.e. dSPACE DARTS)
