Artificial intelligence (AI) and machine learning (ML) continue to push the limits of conventional semiconductor architectures. To increase speed, lower latency, and optimize power consumption for high-performance workloads, semiconductor companies and research institutions are developing advanced photonic chips that process data with light rather than electrical currents. This article discusses the limitations of […]
FAQ
What’s new with Matter: how Matter 1.4 is reshaping interoperability and energy management
Since its initial launch in 2022, Matter has led the charge in making smart homes smarter – allowing devices from different manufacturers and ecosystems, like Apple Home and Amazon Alexa, to communicate seamlessly. The open-source, IP-based protocol has become a key player in the push for device interoperability, aiming to create a reliable user experience […]
How to minimize design cycles for AI accelerators with advanced DFT and silicon bring-up
Design for testability (DFT) embeds testable features into an integrated circuit (IC) during design, while silicon bring-up initiates chip evaluation and debugging. Streamlining these sequential processes minimizes design cycles and shortens time-to-market (TTM) for advanced artificial intelligence (AI) accelerators. This article explores the complexities of AI chip design and outlines strategies for optimizing DFT and […]
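To make the idea of embedding testable features concrete, here is a minimal behavioral sketch of a scan-chain test, the most common DFT structure: a stimulus is shifted serially into the chained flip-flops, the combinational logic response is captured, and the result is compared against a golden value. The 4-bit register, parity logic, and function names are illustrative assumptions, not tied to any particular EDA flow or the article's own methodology.

```python
# Toy behavioral model of a scan-chain DFT pattern (illustrative only).

def combinational_logic(bits):
    """Toy logic under test: parity (XOR reduction) of the register bits."""
    out = 0
    for b in bits:
        out ^= b
    return out

def scan_shift_in(pattern):
    """Serially shift a test pattern into a 4-bit scan chain (scan-enable high)."""
    chain = [0, 0, 0, 0]
    for bit in pattern:
        chain = [bit] + chain[:-1]   # shift one position per clock
    return chain

def scan_test(pattern, expected_parity):
    chain = scan_shift_in(pattern)          # 1. load stimulus through the scan chain
    captured = combinational_logic(chain)   # 2. capture the functional response
    return captured == expected_parity      # 3. compare against the golden value

if __name__ == "__main__":
    print(scan_test([1, 0, 1, 1], expected_parity=1))  # True -> pattern passes
```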
How to test IoT device wireless capabilities
Testing the wireless capabilities of IoT devices before production helps ensure reliable connectivity, optimal performance, and seamless integration across a wide range of applications. This article reviews key IoT wireless parameters and highlights Wi-Fi, 5G, and Bluetooth LE testing priorities. It also discusses how wireless testing supports optimal LPWAN and ZigBee IoT device functionality. Analyzing […]
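As a rough illustration of what a wireless test plan boils down to, the sketch below compares measured metrics against pass/fail limits. The limit values and the hard-coded sample measurements are assumptions for demonstration; in practice the numbers would come from test instruments or the device's radio driver, whose interfaces are implementation-specific and not shown here.

```python
# Hedged sketch of a pass/fail check for common IoT wireless metrics.

LIMITS = {
    "rssi_dbm":        {"min": -70},   # received signal strength
    "throughput_mbps": {"min": 10.0},  # sustained application throughput
    "latency_ms":      {"max": 50.0},  # round-trip latency
    "packet_loss_pct": {"max": 1.0},   # loss over a fixed test window
}

def evaluate(measurements: dict) -> dict:
    """Compare each measurement against its limits and report pass/fail."""
    results = {}
    for name, limit in LIMITS.items():
        value = measurements[name]
        ok = limit.get("min", float("-inf")) <= value <= limit.get("max", float("inf"))
        results[name] = "PASS" if ok else "FAIL"
    return results

if __name__ == "__main__":
    sample = {"rssi_dbm": -62, "throughput_mbps": 18.4,
              "latency_ms": 31.0, "packet_loss_pct": 0.4}
    print(evaluate(sample))
```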
How to integrate theft-prevention tracking capabilities in IoT devices
Microcontrollers and wireless modules combine to add connectivity and theft-prevention features to IoT devices for automotive, industrial, medical, and smart-home use. Many original equipment manufacturers (OEMs) integrate theft-prevention tracking capabilities into their IoT devices. This article reviews key trade-offs and considerations for selecting wireless tracking technologies and highlights the crucial role microcontroller units (MCUs) play […]
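One common theft-prevention building block is a geofence check that firmware runs on each position fix before raising an alert over the wireless module. The sketch below shows the logic in Python for readability; the coordinates, radius, and alert handling are hypothetical, and real MCU firmware would typically implement this in C/C++.

```python
# Illustrative geofence check for a tracked IoT asset (assumed values throughout).
import math

HOME = (40.7128, -74.0060)       # assumed "authorized" location (lat, lon)
GEOFENCE_RADIUS_M = 200.0        # assumed allowed radius in meters

def distance_m(p1, p2):
    """Approximate great-circle distance between two lat/lon points (haversine)."""
    r = 6371000.0
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def check_position(fix):
    """Return an alert message if the device has left its geofence."""
    d = distance_m(HOME, fix)
    if d > GEOFENCE_RADIUS_M:
        return f"ALERT: device {d:.0f} m outside geofence"
    return "OK"

print(check_position((40.7200, -74.0060)))  # roughly 800 m away -> alert
```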
How does a zero trust security architecture work?
Zero trust architectures (ZTAs) are a response to the rise of cloud computing, remote work, and bring-your-own-device (BYOD) policies in enterprise networks. These trends mean networks are no longer fully contained within an enterprise-owned boundary, which significantly complicates network security. This article briefly reviews the purpose and structure of ZTAs, looks at some relevant […]
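A minimal sketch of the core zero trust idea, assuming illustrative attribute names and rules: every request is evaluated on identity and device posture, denied by default, and never trusted simply because it originates inside the corporate network.

```python
# Minimal sketch of a zero trust policy decision point (attributes are assumptions).
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_authenticated: bool      # e.g., MFA completed
    device_compliant: bool        # e.g., patched, disk encrypted
    resource_sensitivity: str     # "low" or "high"
    on_corporate_network: bool    # deliberately NOT used as a trust signal

def policy_decision(req: AccessRequest) -> str:
    """Deny by default; grant per-request access on explicit signals only."""
    if not req.user_authenticated:
        return "DENY"
    if req.resource_sensitivity == "high" and not req.device_compliant:
        return "DENY"
    return "ALLOW"

# Being on the corporate LAN alone does not earn access:
print(policy_decision(AccessRequest(False, True, "low", True)))   # DENY
print(policy_decision(AccessRequest(True, True, "high", False)))  # ALLOW
```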
What can be done to prepare for post-quantum cryptography?
Post-quantum cryptography (PQC), also called quantum-safe cryptography, uses mathematical algorithms to create security environments resistant to quantum computer attacks. PQC is a rapidly evolving technology, and preparing for its adoption requires keeping up with developments in several areas. As part of the formalization of PQC algorithms, NIST has finalized the algorithms and changed their names […]
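One practical preparation step is inventorying where quantum-vulnerable public-key algorithms are used and tracking NIST's finalized names (CRYSTALS-Kyber became ML-KEM in FIPS 203, CRYSTALS-Dilithium became ML-DSA in FIPS 204, and SPHINCS+ became SLH-DSA in FIPS 205). The sketch below shows such an inventory pass over made-up configuration lines; the config format is an assumption for illustration.

```python
# Hedged sketch of a PQC-readiness inventory (config lines are hypothetical).

VULNERABLE = {"RSA", "ECDSA", "ECDH", "DH"}          # breakable by Shor's algorithm
PQC_RENAMES = {
    "CRYSTALS-Kyber": "ML-KEM (FIPS 203)",
    "CRYSTALS-Dilithium": "ML-DSA (FIPS 204)",
    "SPHINCS+": "SLH-DSA (FIPS 205)",
}

def inventory(config_lines):
    """Classify each referenced algorithm for a migration worksheet."""
    report = []
    for line in config_lines:
        algo = line.split("=", 1)[1].strip()
        if algo in VULNERABLE:
            report.append((algo, "quantum-vulnerable: plan migration"))
        elif algo in PQC_RENAMES:
            report.append((algo, f"PQC draft name; finalized as {PQC_RENAMES[algo]}"))
        else:
            report.append((algo, "review manually"))
    return report

sample_config = ["kex = ECDH", "sig = CRYSTALS-Dilithium", "kem = ML-KEM"]
for algo, note in inventory(sample_config):
    print(f"{algo:22s} {note}")
```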
What’s new in the NIST CSF 2.0 framework?
The NIST Cybersecurity Framework (CSF) 2.0, finalized in early 2024, is the first major update since the framework's creation in 2014. This article reviews some of the important changes in CSF 2.0 compared with v1.1, briefly examines how v2.0 improves alignment with other cybersecurity standards and frameworks, and compares CSF 2.0 with the ISO […]
What are the elements of secure boot processes?
Secure boot protects devices from unauthorized modification by verifying the authenticity of the boot code. Its importance is growing as society digitalizes and the number of devices on the Internet of Things (IoT) soars. Secure boot prevents an adversary from compromising device operation. This article reviews the basic elements of secure boot processes, […]
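The verification step at the heart of secure boot can be sketched as follows: the current stage checks the next stage's signature against a public key anchored in immutable storage before transferring control. The demo below uses the third-party `cryptography` package and Ed25519 purely for illustration and generates its own key pair so it is runnable; real boot ROMs implement this in hardware and firmware, not Python.

```python
# Conceptual sketch of secure boot signature verification (illustrative only).
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# --- done once at signing time (shown here only to make the demo self-contained)
signing_key = Ed25519PrivateKey.generate()
ROOT_OF_TRUST_PUBKEY = signing_key.public_key()   # would live in ROM or OTP fuses
boot_image = b"\x7fELF...next-stage bootloader..."
signature = signing_key.sign(boot_image)

# --- done on every boot, before executing the image
def verify_and_boot(image: bytes, sig: bytes) -> None:
    try:
        ROOT_OF_TRUST_PUBKEY.verify(sig, image)   # raises on any mismatch
        print("Signature valid: handing off to next stage")
    except InvalidSignature:
        print("Signature invalid: halting boot / entering recovery")

verify_and_boot(boot_image, signature)                 # accepted
verify_and_boot(boot_image + b"tamper", signature)     # rejected
```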
How are AI and ML used for advanced threat detection?
The increasing number of threat vectors and the growing size of the attack surface in today’s communication and computer networks demand more powerful and faster threat detection. Legacy tools are no longer adequate. To ensure cybersecurity, high-speed threat detection based on artificial intelligence (AI) and machine learning (ML) is increasingly being deployed. This article reviews […]
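In the spirit of ML-based threat detection, the sketch below trains an Isolation Forest on synthetic "normal" network-flow features and flags outliers such as a sudden burst of large outbound transfers. The feature choices, synthetic data, and contamination setting are assumptions for illustration, not a production detection pipeline.

```python
# Illustrative anomaly-detection sketch using scikit-learn's IsolationForest.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Synthetic training data: [bytes_out_kb, packets_per_s, distinct_dest_ports]
normal_flows = np.column_stack([
    rng.normal(120, 30, 1000),   # typical outbound volume
    rng.normal(40, 10, 1000),    # typical packet rate
    rng.normal(3, 1, 1000),      # few destination ports
])

model = IsolationForest(contamination=0.01, random_state=0).fit(normal_flows)

# Two new flows: one ordinary, one resembling exfiltration plus a port scan
new_flows = np.array([
    [130, 42, 3],        # looks normal
    [5000, 400, 60],     # huge volume, high rate, many ports
])
labels = model.predict(new_flows)        # +1 = normal, -1 = anomaly
for flow, label in zip(new_flows, labels):
    print(flow, "ANOMALY" if label == -1 else "normal")
```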