Microcontroller Tips
A better way to measure lidar

August 16, 2019 By Lee Teschler

The special demands of sensors needed for autonomous driving have brought suggestions for different measurement metrics.

Leland Teschler, Executive Editor
A few months ago, sensor maker AEye made a media splash when it claimed it had developed a super-speedy lidar sensor that could hit a scan rate of 100 Hz or more. That speed outpaces by about a factor of 10 what ordinary lidar sensors can produce. So you might think AEye would do a lot of crowing about its new system.

But actually, AEye says scan rates aren’t a particularly good measure of lidar performance, or at least not as good as other metrics, when the application area is autonomous vehicles. The crux of AEye’s argument is that traditional measures generally size up factors strictly determined by the optical hardware, control electronics, and any motion-control mechanics involved. Typical examples include frame rate, resolution, and range.

The problem, says AEye, is that these factors don’t tell much about how a lidar system in an autonomous vehicle behaves. The reason is that lidar systems in vehicles actually take on tasks involved in not just sensing but also object recognition. So measurements that just look at the hardware and the electronics miss a big part of what has come to be expected from vehicular lidar systems.

(Figure caption) As explained by AEye, simple lidar sensors follow a preset scan pattern. More advanced units can now concentrate on areas of the scene containing objects of interest, and do so within the same amount of time that simple systems need for a fixed scan.

Instead, AEye suggests borrowing a few ideas from advanced robotic vision research. Specifically, it points to robotics work on object identification that connects three stages: search, acquisition, and action. Here, search is the ability to detect objects without missing any of them. Acquisition takes the search results and classifies the objects found. Action refers to things the sensor does (not things the car does) in response to what it perceives. Typical sensor actions might be to scan for new objects, gather more information about a specific object, track an object that seems non-threatening, or tell the control system to take evasive action.
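One way to picture that search/acquisition/action loop is as a small state-selection function. The sketch below is purely illustrative; the class names, fields, and threat threshold are hypothetical and are not AEye's API or algorithm:

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional


class SensorAction(Enum):
    SCAN_FOR_NEW = auto()    # search: keep sweeping the scene
    INTERROGATE = auto()     # acquisition: get more info on one object
    TRACK = auto()           # action: follow a non-threatening object
    ALERT_CONTROL = auto()   # action: tell the control system to react


@dataclass
class TrackedObject:
    classified: bool  # has acquisition finished for this object?
    threat: float     # 0.0 (benign) .. 1.0 (evasive action needed)


def choose_action(obj: Optional[TrackedObject]) -> SensorAction:
    """Map one object's perception state to the next sensor action."""
    if obj is None:
        return SensorAction.SCAN_FOR_NEW    # nothing acquired yet: search
    if not obj.classified:
        return SensorAction.INTERROGATE     # refine until classified
    if obj.threat < 0.5:                    # hypothetical threshold
        return SensorAction.TRACK
    return SensorAction.ALERT_CONTROL
```

The point of the structure is that the sensor itself, not just the downstream car software, picks its next measurement based on what it has perceived so far.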

However, those familiar with the use of lidar in autonomous vehicle schemes might argue that AEye’s new metrics are a bit self-serving: AEye makes a lidar system called iDAR which looks pretty good when evaluated with the metrics that AEye proposes. Nevertheless, the reasoning behind new lidar metrics is that the most important qualities of these systems can’t be judged by the metrics of the lidar sensor alone, a point upon which few would disagree.

AEye says a key factor missing from the conventional frame rate metric is a fine definition of time, because the time it takes to detect an object depends on multiple variables besides the scan rate, including object shape, distance, and reflectivity, among others.
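A toy calculation shows why frame rate alone pins down so little. Assume (purely for illustration, this is not a real lidar link budget) that per-shot detection probability scales with target reflectivity and falls off with the square of range, and that shots are independent, so the expected number of shots to first detection is geometric:

```python
def expected_detect_time_ms(reflectivity: float, range_m: float,
                            shot_rate_hz: float) -> float:
    """Toy model of time-to-first-detection. Per-shot detection
    probability p ~ reflectivity / range^2, normalized so a perfectly
    reflective target at 10 m is detected on the first shot. Expected
    shots to detect = 1/p (mean of a geometric distribution)."""
    p = min(1.0, reflectivity * (10.0 / range_m) ** 2)
    expected_shots = 1.0 / p
    return 1000.0 * expected_shots / shot_rate_hz
```

Under this model a bright target at 10 m is detected in one 10 ms shot at 100 Hz, while a 10%-reflective target at 100 m takes on the order of seconds at the same rate: two sensors with identical frame rates can differ enormously in time-to-detect.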

Instead, the company proposes an object revisit rate (ORR) metric. This is basically a measure of how fast the system can take a second look at an object it has decided is “interesting.” The rationale for ORR is that advanced recognition algorithms that see multiple moving objects can get confused if a lot of time elapses between scene samples. Increasingly the practice is to schedule repeated shots at interesting objects in quick succession, and fewer at objects that matter less.

In simple systems, the lidar frame rate is the same as the ORR. But AEye says in more advanced setups that can change what they look at dynamically, some parts of the scene can be examined 1,000x faster than what’s possible with passive sensors.
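A revisit scheduler of this kind can be sketched with a priority queue. The interest scores and the interval model below are hypothetical (chosen only to make the ORR idea concrete), not AEye's actual scheduling algorithm:

```python
import heapq


def schedule_revisits(objects: dict, horizon_ms: float = 100.0,
                      base_interval_ms: float = 50.0) -> list:
    """Greedy revisit schedule over a short horizon.

    `objects` maps an object id to an "interest" score; the revisit
    interval is base_interval_ms / interest, so more interesting
    objects get proportionally more shots (a toy model)."""
    queue = []  # entries: (next_visit_ms, object_id, interval_ms)
    for obj_id, interest in objects.items():
        interval = base_interval_ms / interest
        heapq.heappush(queue, (interval, obj_id, interval))

    shots = []
    while queue:
        t, obj_id, interval = heapq.heappop(queue)
        if t > horizon_ms:
            break  # heap is time-ordered, so everything left is later
        shots.append((t, obj_id))
        heapq.heappush(queue, (t + interval, obj_id, interval))
    return shots
```

With `{"pedestrian": 5.0, "parked_car": 1.0}` and a 100 ms horizon, the pedestrian is revisited every 10 ms (an ORR of 100 Hz) while the parked car is revisited every 50 ms (20 Hz): the per-object revisit rate decouples from any single frame rate.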

In the same vein, AEye says object detection range is an obsolete metric for lidar systems. What matters most is not how quickly an object can be detected, but how quickly a system can ID it, make a threat-level decision, and decide on an appropriate response. Currently, the perception software stack usually classifies and IDs incoming object data, which then gets used to predict trajectories.

AEye argues that a metric called object classification range (OCR) is a better way to measure things because it reduces unknowns such as latency from noise suppression early in the perception stack, pinpointing salient information.

All in all, AEye points out that conventional lidar sensors passively search with scan patterns that are fixed in both time and space. But intelligent sensors are increasingly moving to active search and identification of classification attributes of objects in real time. As perception technology and sensor systems evolve, so too should the metrics used to measure their capabilities.

