Five AI Inference Trends for 2022
By Dana McCarty, Flex Logix (EETimes, January 12, 2022)
It’s an exciting time to be part of the rapidly growing AI industry, particularly in the field of inference. Once confined to high-end, outrageously expensive computing systems, AI inference has been marching toward the edge at a rapid pace. Today, customers across a wide range of industries, including medical, industrial, robotics, security, retail, and imaging, are either evaluating AI inference capabilities or actively designing them into their products and applications.
Fortunately, with the advent of new semiconductor devices developed specifically to accelerate AI workloads, the technology has reached price points and form factors that make it viable for mainstream markets, allowing AI to be incorporated into a wide range of systems.
As we look to 2022, here are our predicted AI inference trends.