GPUs Dominate AI Compute, FPGAs Move Into the AI Data Path

As AI moves beyond data centers, system constraints are redefining where reconfigurable hardware belongs.

By Yashasvini Razdan, EE Times | January 20, 2026

FPGAs offer programmable, flexible hardware but require longer design cycles than CPU- or GPU-based systems. As AI workloads scale, demanding higher compute density, faster time to deployment, and lower energy per operation in latency- and control-bound workloads, do FPGAs still make sense in the AI stack?

In an interview with EE Times, Esam Elashmawi, chief strategy and marketing officer at Lattice Semiconductor, said FPGAs do not compete with GPUs for AI compute but instead operate as companion devices in the data path, particularly at the edge. “If you need very high performance and you are willing to live with high power, then you can use a GPU or a CPU,” he said. “FPGAs are a good companion to it.”

