How embedded FPGAs fit AI applications
June 18, 2018 // By Alok Sanghavi, Achronix Semiconductor Corp.
Artificial intelligence, and machine learning in particular, is reshaping the way the world works, opening up countless opportunities in industry and commerce, but the optimum hardware architecture to support the evolution, diversity, training and inferencing of neural networks has yet to be determined. Alok Sanghavi surveys the landscape and makes the case for embedded FPGAs.
Applications span diverse markets such as autonomous driving, medical diagnostics, home appliances, industrial automation, adaptive websites, financial analytics and network infrastructure.
These applications, especially when implemented at the edge, demand high performance and low latency to respond to real-time changes in conditions. They also require low power consumption, which rules out energy-hungry cloud-based solutions. A further requirement is that these embedded systems be always on and ready to respond even without a network connection to the cloud. This combination of factors calls for a change in the way hardware is designed.
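As a rough illustration of the kind of edge-inference workload at stake, the sketch below shows the int8 multiply-accumulate loop at the core of a fully connected neural-network layer, the sort of kernel an embedded FPGA fabric would typically offload to keep latency and power down. It is not taken from the article; the dimensions, dummy data and requantization shift are hypothetical choices made only so the example runs.

```c
/* Illustrative sketch: int8 dense layer y = ReLU(W * x), accumulated in
 * 32 bits and rescaled back to 8 bits. Sizes and scaling are hypothetical. */
#include <stdint.h>
#include <stdio.h>

#define IN_DIM  64
#define OUT_DIM 16

static void dense_int8(const int8_t w[OUT_DIM][IN_DIM],
                       const int8_t x[IN_DIM],
                       int8_t y[OUT_DIM],
                       int shift)                 /* crude requantization */
{
    for (int o = 0; o < OUT_DIM; o++) {
        int32_t acc = 0;                          /* wide accumulator avoids overflow */
        for (int i = 0; i < IN_DIM; i++)
            acc += (int32_t)w[o][i] * (int32_t)x[i];  /* multiply-accumulate (MAC) */
        acc >>= shift;                            /* rescale toward int8 range */
        if (acc < 0)   acc = 0;                   /* ReLU */
        if (acc > 127) acc = 127;                 /* saturate */
        y[o] = (int8_t)acc;
    }
}

int main(void)
{
    static int8_t w[OUT_DIM][IN_DIM];
    int8_t x[IN_DIM], y[OUT_DIM];

    /* Deterministic dummy weights and activations, just so the example runs. */
    for (int o = 0; o < OUT_DIM; o++)
        for (int i = 0; i < IN_DIM; i++)
            w[o][i] = (int8_t)((o + i) % 7 - 3);
    for (int i = 0; i < IN_DIM; i++)
        x[i] = (int8_t)(i % 5 - 2);

    dense_int8(w, x, y, 4);

    for (int o = 0; o < OUT_DIM; o++)
        printf("y[%d] = %d\n", o, y[o]);
    return 0;
}
```

On an eFPGA, the inner MAC loop maps onto the fabric's DSP blocks and the weight arrays onto block RAM, which is what lets this class of kernel run with deterministic latency at low power instead of being shipped to the cloud.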