Artificial Intelligence calls for Smart Interconnect
Artificial Intelligence-based systems are driving a metamorphosis in computing and, consequently, a large shift in SoC design. AI training is typically done in the cloud and must move huge amounts of data through both forward and backward passes. Inference usually occurs at the edge and must be power-efficient and fast. Each imposes new requirements on computing systems: training puts a premium on throughput, while inference relies on low latency, especially for real-time applications like ADAS.
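As a rough illustration (not from the article), the sketch below contrasts the two data-flow patterns in JAX: a training step runs a forward pass and a backward pass over a large batch, so sustained throughput dominates, while an inference step is a single forward pass whose latency is what matters. The model, shapes, and batch sizes are arbitrary placeholders.

```python
import jax
import jax.numpy as jnp

# Toy one-layer model, only to contrast the two data-flow patterns.
def predict(params, x):
    w, b = params
    return jnp.tanh(x @ w + b)

def loss(params, x, y):
    return jnp.mean((predict(params, x) - y) ** 2)

key = jax.random.PRNGKey(0)
params = (jax.random.normal(key, (128, 10)), jnp.zeros(10))

# Training step: forward pass *and* backward pass over a large batch.
# Weights, activations, and gradients all move through the system,
# so sustained throughput is the limiting factor.
x_batch = jax.random.normal(key, (4096, 128))
y_batch = jax.random.normal(key, (4096, 10))
grads = jax.grad(loss)(params, x_batch, y_batch)

# Inference step: forward pass only, on a single sample at the edge.
# The result is needed immediately (e.g., for ADAS), so latency dominates.
x_single = jax.random.normal(key, (1, 128))
y_pred = predict(params, x_single)
```

The same model thus stresses a system in two very different ways depending on whether it is being trained or deployed.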
To accommodate these new requirements, sweeping changes are occurring in computational architectures. In much the same way that minicomputers and then microcomputers changed the landscape of computing, the changes needed to support AI will permanently alter how things are done.
To read the full article, click here