
Microcontroller Tips

Microcontroller engineering resources, new microcontroller products and electronics engineering news


Sensor-realistic simulation in real time

July 30, 2021 By Caius Seiger, Dr. Gregor Sievers

Advanced sensor simulation techniques can be used to validate functions for autonomous driving throughout the development process.

Caius Seiger, Dr. Gregor Sievers • dSPACE GmbH

Sensor-realistic simulation is a powerful way of validating the sensor systems that are an integral part of autonomous vehicles. Increasingly powerful computer systems make it possible to generate realistic sensor data in real time, which makes simulation an efficient validation method with several benefits for sensor control units.

Illustration in sRGB format. (sRGB is a standard RGB color space created for use on monitors, printers, and the web. It is often the default color space for images that carry no color space information, especially when pixels are stored as 8-bit integers per color channel.) The color of each pixel is mixed from the base colors red, green, and blue. To create such an image, the raw data must pass through several processing steps. The camera's detectors, which convert incoming light into digital signals, measure only light intensity, not color. To gather information on the color spectrum, the detectors are covered with color filters that let only a specific wavelength, and thus color, pass.
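The encoding step the caption describes can be sketched in a few lines. The constants below are the standard sRGB transfer function; the 8-bit quantization is the storage format mentioned above:

```python
def linear_to_srgb(c: float) -> float:
    """Apply the standard sRGB transfer function to one linear channel in [0, 1]."""
    if c <= 0.0031308:
        return 12.92 * c  # linear segment near black
    return 1.055 * (c ** (1.0 / 2.4)) - 0.055  # gamma segment

def quantize_8bit(c: float) -> int:
    """Store the encoded channel as an 8-bit integer, as the caption describes."""
    return round(255 * min(max(c, 0.0), 1.0))

# A mid-gray linear value maps to a much brighter 8-bit code,
# which is why raw sensor data and sRGB images are not interchangeable:
print(quantize_8bit(linear_to_srgb(0.5)))  # 188
```

This is one reason an ECU expecting raw detector data cannot simply be fed an sRGB image: the nonlinear encoding has already been applied.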

An important component of autonomous driving at SAE Level 5 is capturing the vehicle environment via environment sensors. Manufacturers use different sensor types, such as cameras, radar, lidar, and ultrasonic sensors, for this purpose. Complex algorithms then merge the sensor data in high-performance processing units and use the results to make driving decisions.

Thus it’s crucial to validate the algorithms for fusion and perception, and those for the overall system. Various validation methods are available. Test drives make it possible to validate the entire autonomous vehicle, but they cover only a few critical situations and are relatively expensive.

An industry-proven method for the validation of driver assistance system algorithms is to play back recorded sensor data. For this method, a fleet of specially prepared vehicles is equipped with sensors. The vast volumes of data generated must be stored in powerful in-vehicle data logging systems and transferred to the cloud. The data is then evaluated, anonymized, tagged with terms for better retrieval, and labeled. The labeling is time-consuming and only partly automatable.

Illustration of a highway scene from a blue-green-green-red (BGGR Bayer) image sensor, shown in the corresponding raw data format. The data can then be processed into a human-readable format. Importing an sRGB image into an ECU causes an error if the interface is designed for raw data.
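The relationship between the BGGR mosaic in the caption and an RGB image can be illustrated with a minimal 2×2-binning conversion. A real image signal processor interpolates (demosaics) instead of binning, but the mapping from mosaic cells to color planes is the same:

```python
def bggr_to_rgb(raw):
    """Convert a BGGR Bayer mosaic (list of rows) to RGB by 2x2 binning.

    Each 2x2 cell [[B, G], [G, R]] yields one RGB pixel; the two green
    samples are averaged. This is a simplified sketch, not a production
    demosaicing algorithm.
    """
    rgb = []
    for i in range(0, len(raw), 2):
        row = []
        for j in range(0, len(raw[0]), 2):
            b = raw[i][j]                              # top-left: blue
            g = (raw[i][j + 1] + raw[i + 1][j]) / 2.0  # two greens, averaged
            r = raw[i + 1][j + 1]                      # bottom-right: red
            row.append((r, g, b))
        rgb.append(row)
    return rgb

# One 2x2 cell with B=10, G=20 and 30, R=40 becomes the pixel (40, 25, 10):
print(bggr_to_rgb([[10, 20], [30, 40]]))  # [[(40, 25.0, 10)]]
```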

The recorded data is then stored in a way that makes it usable for testing during the development process and for release tests. One problem: This storage involves a great deal of time and effort and does not allow for changes to the sensor setup. If the next vehicle generation is equipped with new sensors, new test drives are required. Another drawback is that unforeseeable, rare events are difficult, if not impossible, to recreate.

Software-in-the-loop (SIL) and hardware-in-the-loop (HIL) simulation make it possible to test critical traffic scenarios. They can run through a nearly infinite combination of parameters including weather conditions, lens effects, and fault simulation for sensors.

Sensor-realistic simulation

Simulation has obvious advantages: It lets users configure all relevant components from the vehicle and sensor parameters to the driving maneuvers. And tricky traffic scenarios can be safely reproduced.

One of the main challenges of simulation is calculating the realistic sensor data in real time. For driver assistance systems, it is often sufficient to use sensor-independent object lists based on ground truth data (that is, data about the environment not from the vehicle sensors). The object lists are easy to extract from the traffic simulation. In contrast, autonomous vehicles process the raw data captured by a sensor front end in a central control unit. Calculation of the raw sensor data is much more time-consuming because it is based on the physical properties of each sensor. The raw data format is different for each sensor.
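A sensor-independent object list of the kind described above can be sketched as a simple transform from ground-truth world coordinates into the ego vehicle's frame. The class and function names here are illustrative assumptions, not part of any dSPACE API:

```python
import math
from dataclasses import dataclass

@dataclass
class TrackedObject:
    """One entry of a sensor-independent object list: range and bearing
    relative to the ego vehicle, derived from ground-truth positions."""
    obj_id: int
    range_m: float
    bearing_rad: float

def object_list(ego_xy, ego_yaw, ground_truth):
    """Build an object list from ground-truth data (id -> world x, y)."""
    objects = []
    for obj_id, (x, y) in ground_truth.items():
        dx, dy = x - ego_xy[0], y - ego_xy[1]
        rng = math.hypot(dx, dy)
        bearing = math.atan2(dy, dx) - ego_yaw  # angle in the ego frame
        objects.append(TrackedObject(obj_id, rng, bearing))
    return objects

# Ego at the origin heading along +x; one target 30 m ahead and 40 m left:
objs = object_list((0.0, 0.0), 0.0, {7: (30.0, 40.0)})
print(objs[0].range_m)  # 50.0
```

Extracting such lists is cheap precisely because no sensor physics is involved; raw data simulation, in contrast, must model each sensor's front end.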

When injecting synthetic camera data, it is important both to create subjectively realistic images for the human observer and to inject the data at the right moment. The data must also be generated realistically for the sensor.

A graphics card is used to compute the raw data in real time because it can process more data in parallel than the main processor. This becomes clear from a closer look at radar and lidar sensors, whose physically based computation requires complex ray tracing. For example, suppose radar waves reflect back to the sensor via the guard rail as well as directly from another object. Here, the radar detects a ghost target and adds it to the detection list, even though no such object exists. These operations are complex but easily parallelized, so the calculated sensor data can be injected into the ECU in real time.
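One common guard-rail ghost geometry can be computed with the mirror-image method: a single specular bounce off a rail at y = rail_y makes the reflected path exactly as long as the straight path to the target's mirror image. This is a simplified 2D sketch of the effect, not the ray tracer the article refers to:

```python
import math

def ghost_target(sensor_xy, target_xy, rail_y):
    """Mirror-image ghost from a single bounce off a guard rail at y = rail_y.

    The path sensor -> rail -> target has the same length as the straight
    line from the sensor to the target's mirror image, so the radar reports
    a detection at that mirrored position.
    """
    tx, ty = target_xy
    mirror = (tx, 2.0 * rail_y - ty)  # target reflected across the rail line
    rng = math.hypot(mirror[0] - sensor_xy[0], mirror[1] - sensor_xy[1])
    return mirror, rng

# Sensor at the origin, real target 30 m ahead and 1 m off-axis,
# guard rail running parallel to the road 4 m to the side:
ghost_pos, ghost_range = ghost_target((0.0, 0.0), (30.0, 1.0), 4.0)
print(ghost_pos)              # (30.0, 7.0)
print(round(ghost_range, 2))  # 30.81
```

Because each ray bounce is independent of the others, thousands of such reflections can be evaluated in parallel on the GPU, which is the point made above.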

HIL test for sensor fusion controllers

HIL test benches make it possible to test real ECUs in the lab by stimulating them with recorded or synthetic data. Consider an example setup for open- and closed-loop HIL simulation as well as raw data input for a front camera. Here, the camera image sensor, including the lens, is replaced and simulated by the HIL environment.

Example setup for open- and closed-loop HIL simulation and raw data injection.

The dSPACE traffic and vehicle dynamics simulation is executed on a real-time PC with an update interval of 1 msec. In addition, the real-time PC for the restbus simulation (a technique that validates ECU functions by simulating parts of an in-vehicle bus, such as the controller area network) is connected to the vehicle network (CAN, Ethernet, FlexRay, etc.).
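The 1 msec update interval amounts to a fixed-step execution loop. The toy sketch below illustrates the scheduling idea only; a real HIL system like the one described runs on deterministic real-time hardware, not on time.sleep():

```python
import time

def run_fixed_step(step_fn, dt_s=0.001, steps=5):
    """Call step_fn at a fixed interval, sleeping off any slack time.

    A toy stand-in for a real-time scheduler: deadlines accumulate from
    the start time, so one slow step does not shift all later deadlines.
    """
    next_deadline = time.perf_counter()
    for k in range(steps):
        step_fn(k)  # one simulation step (traffic, vehicle dynamics, restbus)
        next_deadline += dt_s
        slack = next_deadline - time.perf_counter()
        if slack > 0:
            time.sleep(slack)

ticks = []
run_fixed_step(ticks.append)
print(ticks)  # [0, 1, 2, 3, 4]
```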

The results of the vehicle simulation are transferred to a powerful computer. The computer then generates a three-dimensional representation of the environment. The relevant, parameterized sensor models are calculated on the basis of this representation. Simulation and testing providers, such as dSPACE, can furnish these sensor models.

As an alternative, users can integrate sensor models from Tier 1 suppliers via the open simulation interface (OSI) which also protects supplier intellectual property. Furthermore, dSPACE supports standards such as OpenDrive for defining streets and OpenScenario as a format for defining scenarios.

The raw sensor data is transferred to the dSPACE Environment Sensor Interface Unit (ESI Unit) via the DisplayPort interface of the GPU. This FPGA-based platform executes all remaining parts of the sensor models. For example, it executes light control or simulates the I2C interface of the image sensor.

So far, no supplier-independent interface standard has been established for transferring raw sensor data. So a test system for raw sensor data injection must support a wide range of sensor interfaces. The ESI Unit is highly modular and supports all relevant automotive sensor interfaces. Automotive cameras typically use TI FPD-Link III and IV, Maxim GMSL1 and GMSL2, and MIPI CSI-2 with more than 8 Gbit/sec. Most radar and lidar sensors have an automotive Ethernet interface with up to 10 Gbit/sec. However, the interfaces used for cameras are also increasingly used for radar and lidar.
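The interface data rates above can be sanity-checked with a back-of-the-envelope payload calculation. The resolutions and bit depths below are illustrative assumptions, not figures from the article:

```python
def stream_gbit_per_s(width, height, bits_per_pixel, fps):
    """Raw video payload bandwidth in Gbit/s (no protocol overhead)."""
    return width * height * bits_per_pixel * fps / 1e9

# A 1920x1080 RAW12 camera at 60 fps fits easily within an 8 Gbit/s link:
print(round(stream_gbit_per_s(1920, 1080, 12, 60), 2))  # 1.49

# A hypothetical 3840x2160 RAW16 camera at 60 fps nearly saturates it,
# which is why multi-gigabit serializer links are needed:
print(round(stream_gbit_per_s(3840, 2160, 16, 60), 2))  # 7.96
```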

The HIL simulation of autonomous vehicles with dozens of sensors presents a particular challenge. It takes a great deal of processing power (CPU, GPU, FPGA) to realistically simulate all the sensors. In addition, the sensor and bus data must be synchronized depending on the vehicle and sensor architecture.

In the case of (rest)bus data, a real-time system such as dSPACE SCALEXIO performs these tasks across multiple computation nodes. The sensor simulation at the raw data level requires both GPUs and FPGAs and thus calls for new synchronization concepts. Furthermore, all components of the simulator setup must be optimized for low end-to-end latencies so the control algorithms of the ECU can be tested.

All in all, sensor simulation is a powerful, integrated, and uniform way of validating autonomous vehicles. The dSPACE solution described here ensures a high level of productivity in the validation of sensor-based ECUs at all stages of the development and test process.


Copyright © 2025 · WTWH Media LLC and its licensors. All rights reserved.
The material on this site may not be reproduced, distributed, transmitted, cached or otherwise used, except with the prior written permission of WTWH Media.
