
Microcontroller Tips

Microcontroller engineering resources, new microcontroller products and electronics engineering news


Simulations help explain why autonomous vehicles do stupid things

August 7, 2017 By Lee Teschler 2 Comments

Special test programs hope to help robotic systems make better decisions in short order.


Here’s a riddle: When is an SUV a bicycle? Answer: When it is a picture of a bicycle that is painted on the back of an SUV, and the thing looking at it is an autonomous vehicle.

[Image: An edge case from a Cognata Ltd. simulation. If you were an autonomous vehicle, this would look like two cyclists instead of the back of an SUV.]
Cyclists painted on the back of an SUV are what's known in the autonomous-car industry as an "edge case": a situation where autonomous system software interprets an odd-ball scene differently from how humans would. Edge cases generally produce unpredictable behavior on the part of the robotically guided vehicle.

Edge cases like this one are the reason the Rand Corp. reported in 2016 that autonomous cars would need to be tested over 11 billion miles to prove that they’re better at driving than humans. With a fleet of 100 cars running 24 hours a day, that would take 500 years, Rand researchers say.
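The Rand figures are easy to sanity-check. The fleet size and 24-hour duty cycle come from the article; the average speed below is an assumption added for illustration.

```python
# Rough check of the Rand Corp. figure cited above: how long would a
# 100-car fleet running around the clock take to log 11 billion miles?
FLEET_SIZE = 100           # cars (from the Rand scenario)
AVG_SPEED_MPH = 25         # assumed average speed, city/highway mix
HOURS_PER_YEAR = 24 * 365  # running 24 hours a day

miles_per_year = FLEET_SIZE * AVG_SPEED_MPH * HOURS_PER_YEAR
years_needed = 11e9 / miles_per_year
print(round(years_needed))  # roughly 500 years at this assumed speed
```

At 25 mph average, the fleet covers about 21.9 million miles a year, which puts the 11-billion-mile target right around the 500-year mark the Rand researchers describe.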

It’s not just scenes painted on the back of vehicles that throw autonomous vehicles for a loop. “There are a lot of edge cases,” says Danny Atsmon, the CEO of autonomous vehicle simulation firm Cognata Ltd. “The classic example is that of driving at night after a rain. The pavement can be like a mirror, so you see a car and its reflection. Autonomous systems can interpret the scene as two different cars.”

Cognata, based in Israel, has a lot of experience with edge cases because it builds software simulators in which automakers can test autonomous-driving algorithms. The simulators allow developers to inject edge cases into driving simulations until the software can work out how to deal with them. This all happens in the lab without risking an accident.

“It can take months to hit an edge-case scenario in real road tests. In a simulation that’s not a problem,” says Atsmon.

Simulations like those that Cognata devises are also helpful because of the way autonomous systems recognize situations unfolding around them. Traditional object recognition techniques such as edge detection may be used to classify features such as lane dividers or road signs. But machine learning is the approach used to make decisions about what the vehicle sees.
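To make the distinction concrete, here is a minimal sketch of a traditional handcrafted feature detector of the kind mentioned above: a Sobel edge filter run over a synthetic frame. It is illustrative only, not any vendor's actual pipeline.

```python
import numpy as np

def sobel_edges(img):
    """Classic edge detection: convolve with fixed Sobel kernels and
    take the gradient magnitude. Handcrafted filters like this pick out
    high-contrast structure such as lane dividers."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = img.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = img[i:i + 3, j:j + 3]
            gx[i, j] = (patch * kx).sum()
            gy[i, j] = (patch * ky).sum()
    return np.hypot(gx, gy)

# A synthetic "lane marking": one bright vertical stripe on dark pavement.
frame = np.zeros((8, 8))
frame[:, 4] = 1.0
edges = sobel_edges(frame)
```

The filter fires strongly on either side of the stripe and stays quiet elsewhere. The catch, as the article notes, is that the kernel weights are fixed in advance: the filter finds contrast, but it cannot decide whether that contrast is a lane line or a picture of a cyclist.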

[Image: To an autonomous vehicle system running a Cognata simulation, this doesn’t look like the back of an RV with a scene painted on. According to the color map (bottom), the autonomous software interprets the back of the RV as a weirdly shaped building dead ahead. Edge cases like this one help developers debug the machine learning algorithms that interpret driving situations.]
Here, learning algorithms handle image recognition. The feature detectors are so-called convolutional layers in software that adapt to training data. To handle specific problem scenes, developers collect numerous training examples and choose parameters such as the number of layers in the learning network, the learning rate, the activation functions, and so forth. Eventually, the recognition system adapts its features to the problem at hand. This approach works better than handcrafting features that may handle foreseen problems quite well but break for others.
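The key difference from the handcrafted approach can be shown in a toy example: instead of hardcoding a kernel, a convolutional layer learns one from examples by gradient descent. Everything here (the single-layer network, the learning rate, the teacher filter used to generate targets) is an assumed toy setup, not Cognata's or any automaker's actual training pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv_valid(img, k):
    """Valid-mode 2D convolution (really cross-correlation) with a 3x3 kernel."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = (img[i:i + 3, j:j + 3] * k).sum()
    return out

kernel = rng.normal(scale=0.1, size=(3, 3))  # learnable parameters
lr = 0.05                                    # learning rate (a hyperparameter)

# Training pair: a random input, with targets produced by a fixed
# "teacher" edge filter standing in for labeled data.
x = rng.normal(size=(6, 6))
teacher = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
y = conv_valid(x, teacher)

for _ in range(1000):
    err = conv_valid(x, kernel) - y          # prediction error
    grad = np.zeros_like(kernel)             # d(loss)/d(kernel weights)
    for i in range(err.shape[0]):
        for j in range(err.shape[1]):
            grad += err[i, j] * x[i:i + 3, j:j + 3]
    kernel -= lr * grad / err.size           # gradient-descent step
```

After training, the kernel's response closely matches the teacher filter's even though no weight was ever specified by hand. That adaptability is what lets developers feed in edge-case examples and have the features reshape themselves, rather than enumerating special cases in advance.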

To help developers of automated vehicle systems, Cognata recreates real cities such as San Francisco in 3D. Then it layers in data such as traffic models from different cities to help gauge how vehicles drive and react. The simulations are detailed enough to factor in differences in driving habits of people in different cities, Atsmon says. The third layer of Cognata’s simulation is the emulation of the 40 or so sensors typically found on autonomous vehicles, including cameras, lidar and GPS. Cognata simulations run on computers that the auto manufacturer or Tier One supplier provides.

Sensor emulation is particularly important because autonomous cars overcome issues such as baffling images by fusing together information gathered from different types of sensing. Just as cameras can be fooled by images, lidar can’t sense glass and radar senses mainly metal, explains Atsmon. Autonomous systems learn to deal with complex situations by gradually figuring out which data can be used to correctly deal with particular edge cases.
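The fusion idea can be sketched as weighting each sensor's vote by how much its physics can be trusted for the hypothesis in question. The labels, probabilities, and weights below are illustrative assumptions, not a real perception stack.

```python
def fuse(camera_probs, lidar_flat_surface):
    """Toy sensor fusion: camera_probs maps label -> probability from the
    vision model; lidar_flat_surface is True when lidar sees one continuous
    flat return where the camera reports separate objects (e.g. a scene
    painted on the back of an SUV)."""
    fused = dict(camera_probs)
    if lidar_flat_surface:
        # A painted scene lies on a single surface; real cyclists would
        # produce distinct depth returns. Shift mass toward "vehicle".
        fused["cyclist"] = fused.get("cyclist", 0.0) * 0.1
        fused["vehicle"] = fused.get("vehicle", 0.0) + 0.5
    total = sum(fused.values())
    return {k: v / total for k, v in fused.items()}

camera = {"cyclist": 0.7, "vehicle": 0.3}  # camera fooled by the painted SUV
decision = fuse(camera, lidar_flat_surface=True)
best = max(decision, key=decision.get)
```

With lidar contradicting the camera, the fused estimate flips from "cyclist" to "vehicle". In a real system the weights themselves would be learned, which is the gradual figuring-out the article describes.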

Filed Under: Applications, Automotive Tagged With: cognata


Comments

  1. Jim F. says

    August 8, 2017 at 8:42 am

    The “obvious” answer is to have the image recognition software consider the “speed” of the object it “thinks” it is seeing. For example, the RV with the “building” is actually traveling at the speed of the rest of the traffic. A building moving at traffic speed is NOT a building. The same can be said for the two bikes pictured on the back of the other van. Another check could be to evaluate the relative position of the “building” to the edges of the truck. Again, a moving building is not a “real” building.

    • Lee Teschler says

      August 8, 2017 at 9:13 am

      OK, so now let’s talk about the reflection on wet pavement of the car in front of you. The reflection moves at the same speed as the real object. So much for considering the speed of the object the software thinks it is seeing. Situations like this show why everybody in the autonomous vehicle community is preaching sensor fusion and machine learning. Otherwise there are about a million special cases you have to think about in advance in order to avoid confusion.



Copyright © 2025 · WTWH Media LLC and its licensors. All rights reserved.
The material on this site may not be reproduced, distributed, transmitted, cached or otherwise used, except with the prior written permission of WTWH Media.

Privacy Policy