
Measuring the performance of AI and its impact on society – Virtual Conversation (part 2 of 2)

October 13, 2020 By Jeff Shepard

EE World has organized this “virtual conversation,” hosted by Jeff Shepard, with Gary Bronner (GB), Senior Vice President with Rambus Labs. Mr. Bronner has generously agreed to share his experience and insights into AI applications and emerging computing architectures.

JS: When benchmarking AI performance, how would you rank the importance of throughput, latency, and accuracy?

GB: There really isn’t a one-size-fits-all answer to this. These metrics are valued very differently depending on the specific application. It also depends on whether you are looking at training or inference.

Gary Bronner (GB), Senior Vice President, Rambus Labs

For training, throughput is key. For inference, one normally wants very low latency, which makes it difficult to optimize for throughput. This is a trade-off that normally must be made. Training is all about collecting a batch of samples and running them through together, which is more efficient from a throughput perspective but gives longer latency. Inference is all about looking at a sample of one and quickly making a decision, which implies low latency.
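To make that batching tradeoff concrete, here is a minimal sketch (my own illustration, not part of the interview) that uses a single matrix multiply as a stand-in for a model’s forward pass; the sizes and timings are illustrative assumptions, but the shape of the tradeoff is the one Mr. Bronner describes.

```python
# Minimal sketch of the throughput-vs-latency tradeoff from batching.
# The "model" is just one dense layer (a matrix multiply); sizes are arbitrary.
import time
import numpy as np

rng = np.random.default_rng(0)
weights = rng.standard_normal((1024, 1024))   # hypothetical model parameters
samples = rng.standard_normal((1024, 1024))   # 1024 incoming requests

def run_model(batch):
    # Stand-in for a forward pass.
    return batch @ weights

# Inference-style: one sample at a time -> each request gets a fast answer.
start = time.perf_counter()
for sample in samples:
    run_model(sample[np.newaxis, :])
one_at_a_time = time.perf_counter() - start

# Training-style: the whole batch at once -> more samples processed per second,
# but no individual result is ready until the entire batch finishes.
start = time.perf_counter()
run_model(samples)
batched = time.perf_counter() - start

print(f"batch size 1:    {one_at_a_time:.4f} s for 1024 samples "
      f"(~{one_at_a_time / 1024 * 1e3:.3f} ms latency per request)")
print(f"batch size 1024: {batched:.4f} s for 1024 samples "
      f"(higher throughput, but every request waits ~{batched * 1e3:.1f} ms)")
```

On typical hardware the batched run finishes far sooner overall (better throughput), while the one-at-a-time loop gives each individual request its answer much faster (better latency), which is why training pipelines batch aggressively and latency-sensitive inference often cannot.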

The third important measurement to keep in mind is accuracy. You can improve throughput and latency and yet dramatically hurt accuracy. For example, I have seen situations where someone has built a really bad model. It doesn’t involve much computation, it doesn’t require much data, and the calculations run through very quickly with very low latency, but it is not very accurate, and the results are not acceptable. One needs to find an acceptable accuracy level for one’s application. It is a tough tradeoff since accuracy also affects energy. There is more power available in a data center than on a phone, so accuracy can be much higher there.
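As a rough illustration of that point, the sketch below (mine, not something from Rambus or the interview) compares a deliberately undersized linear model against a slightly larger network on a toy dataset: the tiny model answers faster but with noticeably worse accuracy, which is the tradeoff described above.

```python
# Minimal sketch: a model can be fast (low latency, high throughput) yet too
# inaccurate to be acceptable. The dataset and model choices are illustrative.
import time
from sklearn.datasets import make_moons
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_moons(n_samples=20_000, noise=0.3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "tiny linear model": LogisticRegression(),   # fast but underpowered
    "small neural net": MLPClassifier(hidden_layer_sizes=(64, 64),
                                      max_iter=500, random_state=0),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    start = time.perf_counter()
    predictions = model.predict(X_test)
    inference_ms = (time.perf_counter() - start) * 1e3
    accuracy = accuracy_score(y_test, predictions)
    print(f"{name:>18}: accuracy {accuracy:.3f}, inference time {inference_ms:.1f} ms")
```

Whether the faster model’s accuracy is “acceptable” depends entirely on the application, which is the point: a useful benchmark has to report accuracy alongside throughput and latency.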

JS: How is the perception of AI’s environmental impact, so-called “red AI,” changing, and how might that perception change future developments in AI?

GB: The environmental impact of the world’s collective “computing” is hugely important. According to results published in 2018, the world’s data centers consume ~1% of the world’s energy. Keeping that number from growing will require special attention. In the cloud, AI applications are immensely power-hungry, which makes efficiency even more important. And it’s expected that more and more AI will be done at the edge, where many devices are battery-powered. As a result, I would expect to see a move away from “red AI” except for the most critical applications, where getting results outweighs power consumption, such as the race to find a vaccine for COVID-19.

HBM2E is a high-performance memory that delivers higher overall throughput with greater bandwidth-per-watt efficiency, making it ideal for AI/ML and high-performance computing (HPC) applications. (Image: Rambus Labs)

JS: What are the societal implications, if any, of bias that may exist in AI inference engines?

GB: This isn’t a hardware issue, so it falls far outside Rambus’ purview as a hardware company. It is a challenge that needs to be addressed by the people coding and deploying the AI algorithms.

JS: Where do you expect AI and cognitive computing to have the largest impact in the near term? In the longer term?

GB: AI has made the most progress in deep learning applications, which are based on pattern matching. These are applications where a computer can learn from large amounts of data and recognize patterns in it. Such applications are already starting to have a large impact with significant positive results. For example, with Microsoft’s translation app, you can talk to your phone, and it will speak back in a different language.

In the longer term, if we could get to something closer to what’s called artificial general intelligence (AGI), meaning intelligence more like a human’s, I think that would have a very large impact. But it is very unclear when we will actually get there. Discussion in the community puts it anywhere from five to a few hundred years out.

JS: Thank you to Mr. Bronner for sharing his insights and experience; it was another great conversation! You might also be interested in reading “AI applications and emerging computing architectures” – Virtual Conversation (part 1 of 2).
