The benefit of non-volatile memory (NVM) for edge AI
Eran Briman, Weebit Nano
embedded.com (September 15, 2023)
In low-power IoT and edge AI applications, AI models can be small enough to fit into the internal NVM of an SoC. The on-chip NVM can then serve double duty, storing both the CPU firmware and the AI model weights.
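In practice, keeping model parameters in on-chip NVM can be as simple as declaring them `const` so the toolchain places them in flash next to the firmware image instead of consuming SRAM. A minimal sketch (the layer name, sizes, and values here are hypothetical, not from the article):

```c
#include <stdint.h>

/* Hypothetical 8-bit quantized weights for a tiny dense layer.
 * Declaring them const lets the linker place them in on-chip
 * NVM (flash) alongside the firmware, so they cost no SRAM. */
static const int8_t layer0_weights[4][3] = {
    { 12,  -7,  33 },
    { -5,  21,  -9 },
    {  8,  -2,  14 },
    { -11,  6,   4 },
};

static const int32_t layer0_bias[4] = { 100, -50, 25, 0 };
```

On most embedded toolchains, `const`-qualified globals land in a read-only section that the linker maps to flash, which is exactly the "NVM holds both code and weights" arrangement described above.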
Ongoing innovation in semiconductor technologies, algorithms and data science is making it possible to incorporate some degree of AI inferencing capability in a growing number of edge devices. Today we see it in computer vision applications such as object recognition, facial recognition and image classification, on products ranging from phones and laptops to security cameras. In industrial systems, inferencing enables predictive equipment maintenance and lets robots perform tasks independently. For IoT and smart home products, AI inference makes it possible to monitor and respond in real time to a variety of sensor inputs.
The lowest-cost processing solutions that support AI inferencing today are the off-the-shelf single-chip microcontrollers used in IoT systems. Such chips combine a general-purpose CPU, SRAM and I/O functions with non-volatile memory (NVM). However, these chips implement the AI algorithms in software running on the CPU, which delivers only modest performance and is practical only for basic inference. Scaling a single-chip solution to provide higher-performance inference therefore presents a challenge for designers.
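To see why software-only inference on a general-purpose CPU is slow, consider a fully connected layer: every output requires a multiply-accumulate over every input, executed one operation at a time on the core. An illustrative fixed-point sketch (function name and quantization scheme are assumptions for illustration, not from the article):

```c
#include <stdint.h>
#include <stddef.h>

/* Illustrative fixed-point dense (fully connected) layer.
 * weights is an n_out x n_in matrix stored row-major; input
 * holds n_in 8-bit activations; output receives n_out 32-bit
 * accumulators. On a general-purpose MCU core this inner loop
 * retires roughly one multiply-accumulate per iteration, which
 * is why CPU-only inference delivers only modest performance. */
void dense_q7(const int8_t *weights, const int32_t *bias,
              const int8_t *input, int32_t *output,
              size_t n_in, size_t n_out)
{
    for (size_t o = 0; o < n_out; o++) {
        int32_t acc = bias[o];
        for (size_t i = 0; i < n_in; i++) {
            acc += (int32_t)weights[o * n_in + i] * input[i];
        }
        output[o] = acc;
    }
}
```

Dedicated accelerators gain their speed by performing many such multiply-accumulates in parallel rather than sequentially in this loop.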